• Software Engineer II MDM

    Job Locations: US-IA-Iowa City
    Posted Date: 4/10/2018 9:27 AM
    Job ID: 2018-1210
    # of Openings: 1
    Category: Information Technology (Development)
    Travel: Up to 25% Travel
  • Overview

    ACT is a nonprofit organization helping people achieve educational and workplace success. Our programs are designed to boost lifelong learning in schools and workplaces around the world. Whether it's guiding students along their learning paths, enabling companies to develop their workforce, fostering parent, teacher, and counselor understanding of student progress, guiding job seekers toward career success, or informing policymakers about education and workforce issues, ACT is passionate about making a difference in all we do.

     

    Learn more about working at ACT at act.org!

    Responsibilities

    The Software Engineer II on the MDM team is responsible for the design, development, and support of master data management solutions that integrate with transactional systems using tools such as Informatica MDM and RESTful APIs. The job includes development in Java, with Bitbucket for source control and Jenkins for builds. This role will participate in the Agile process and ensure product quality, robustness, scalability, and consistency.
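    As a rough illustration of that integration pattern, here is a minimal Java sketch of a transactional system requesting a mastered record from an MDM hub over a RESTful API; the endpoint URL, path, and record shape are hypothetical placeholders rather than any specific MDM product's interface.

        import java.net.URI;
        import java.net.http.HttpClient;
        import java.net.http.HttpRequest;
        import java.net.http.HttpResponse;

        public class MasteredRecordClient {
            // Hypothetical MDM hub endpoint; a real hub would define its own paths and auth.
            private static final String MDM_URL = "https://mdm.example.org/api/v1/students/";

            public static void main(String[] args) throws Exception {
                HttpClient client = HttpClient.newHttpClient();

                // Request the mastered ("golden") record for one student ID.
                HttpRequest request = HttpRequest.newBuilder(URI.create(MDM_URL + "12345"))
                        .header("Accept", "application/json")
                        .GET()
                        .build();

                HttpResponse<String> response =
                        client.send(request, HttpResponse.BodyHandlers.ofString());

                // A transactional system would parse this JSON and reconcile it locally.
                System.out.println("Status: " + response.statusCode());
                System.out.println(response.body());
            }
        }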

    Work-related activities include:

    • Develop, customize, support, and configure master data management solutions to ensure accurate student, organization, address, and reference data
    • Independently design, maintain, and enhance applications, capitalizing on existing frameworks and reusable components
    • Collaborate with the Scrum team to support all development activities
    • Proactively own development activities related to Scrum team execution
    • Actively participate in design/architectural discussions, grooming user stories, sprint demos, and daily Scrums to help establish a Behavior-Driven Development (BDD) approach
    • Develop web services and other interfaces to exchange mastered records between transactional systems and MDM platforms (see the sketch after this list)
    • Monitor product reliability, consistency, and performance
    • Recommend product, infrastructure, or design changes to ensure quality and performance standards
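
    As a minimal sketch of the web-service side of that work, the following Java example serves a mastered record over HTTP using the JDK's built-in HttpServer; the path, field names, and hard-coded record are illustrative assumptions, and a real service would query the MDM hub and handle authentication.

        import com.sun.net.httpserver.HttpServer;
        import java.io.OutputStream;
        import java.net.InetSocketAddress;
        import java.nio.charset.StandardCharsets;

        public class MasteredRecordService {
            public static void main(String[] args) throws Exception {
                HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

                // Serve a placeholder "golden record"; a real service would look the
                // record up in the MDM hub rather than hard-coding it.
                server.createContext("/students", exchange -> {
                    String json = "{\"studentId\":\"12345\",\"name\":\"Jane Doe\","
                            + "\"sourceSystems\":[\"SIS\",\"CRM\"]}";
                    byte[] body = json.getBytes(StandardCharsets.UTF_8);
                    exchange.getResponseHeaders().add("Content-Type", "application/json");
                    exchange.sendResponseHeaders(200, body.length);
                    try (OutputStream os = exchange.getResponseBody()) {
                        os.write(body);
                    }
                });

                server.start();
                System.out.println("Listening on http://localhost:8080/students");
            }
        }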

     

    Qualifications

    Minimum Qualifications:

    Education:

    • Bachelor’s degree in a related area required, preferably in computer science, mathematics, statistics, or information systems
    • Or an equivalent combination of education and experience from which comparable knowledge and abilities can be acquired.

     

    Experience:

    • Minimum of three years of progressive IT experience that includes coding in Java or Python required
    • Minimum of two years of SQL query experience for data analysis/profiling
    • Experience developing applications using REST API web services, Apache NiFi, or Kafka preferred
    • Experience working in an Agile development environment preferred

     

    Knowledge, Skills and Abilities:

    Required:

    • Strong understanding of Java or Python
    • Strong understanding of RESTful APIs
    • Strong knowledge of version control tools such as Git/Bitbucket, and of Jenkins for builds
    • Demonstrated ability to learn new technologies quickly and effectively
    • Demonstrated relational and NoSQL database skills (MongoDB, MySQL, and/or Oracle)
    • Ability to work as part of a Scrum team
    • Strong verbal and written communication skills
    • Results-oriented and able to work across the organization
    • Comfortable working in a rapidly transforming organization

    Preferred:

    • Knowledge of project tracking software such as Jira
    • Knowledge of test automation frameworks such as Robot
    • Understanding of performance testing frameworks such as JMeter and BlazeMeter
    • Understanding of static code analysis and code review tools such as SonarQube, HP Fortify, and Crucible
    • Understanding of production monitoring solutions such as New Relic, Splunk, Tivoli, AppDynamics, and SCOM, and of synthetic transaction scripting
    • Knowledge of big data technologies such as Spark, Kafka, Hive, Apache NiFi, and HDFS

     

