1662 - CADE Engineer Systems Architect

Lanham, Maryland
US Citizenship

OVERVIEW:

We drive missions of consequence spanning the globe and extending to the farthest reaches of the galaxy. As the world’s leading mission capability integrator and transformative enterprise IT provider, we deliver trusted and highly differentiated national security solutions and technologies that keep people safe and secure. We serve as a valued partner to essential government agencies across the intelligence, space, cyber, defense, civilian, health, and state and local markets. Every day, our 22,000 employees do the can’t be done, solving the most daunting challenges facing our customers.

We are searching for a Consulting Engineer Systems Architect to join our Global Health and Financial Solutions Sector to support our IRS contract. This position will be located in Lanham, Maryland.  

GENERAL DUTIES:

  • Designs and develops system architectures and defines key capabilities and performance requirements in alignment with IRS Strategy objectives and technology roadmaps.
  • Evaluates systems and technologies through analyses of alternatives (AoAs), operational analyses (OAs) of business systems, and proofs of concept (POCs), and recommends products and technologies to implement the enterprise data platform.
  • Defines design and technology maturity constraints of the system in accordance with customer specifications.
  • Develops a thorough definition of system external interfaces.
  • Defines the system implementation approach and operational concept.
  • Ensures requirements are met and evaluates performance with the customer.
  • This position will use, and requires experience with, the following tools:
  • Tier 1 IBM mainframe
  • DB2
  • Assembler Language Code (ALC)
  • COBOL 74
  • ALC Macros
  • SQL & NoSQL
  • Java, XML, JSON
  • Attachmate INFOConnect
  • Customer Information Control System (CICS)
  • DocIT
  • Endevor
  • IBM Rational ClearCase, ClearQuest, and DOORS Next Generation (DNG)
  • Informatica
  • JBoss
  • Job Control Language (JCL)

REQUIRED QUALIFICATIONS:

  • 14 years of experience with a Bachelor of Science degree
  • Knowledge of data security best practices and information risk management practices.
  • Experience with Agile methodologies (Scrum, SAFe) and supporting tools (e.g., Jira, Advanced Roadmaps, etc.).
  • Familiarity with data storage and retrieval using Virtual Storage Access Method (VSAM) in IBM mainframe environments.
  • Extensive experience with Extraction, Transformation, and Loading (ETL) workflows using various data formats (JSON, XML, CSV, etc.); see the sketch after this list.
  • Ability to obtain a Moderate Risk Background Investigation (MBI) security clearance; U.S. citizenship or a minimum of 3 years of U.S. residency as a Lawful Permanent Resident (LPR) is required.
  • Experience with the Individual Master File (IMF) to CADE 2 interface
  • Experience with the National Account Profile (NAP) to CADE 2 interface
  • Experience with the IMF 701EXEC (Executive Control Program for IMF Extract) to CADE 2 interface
  • Experience with the CADE 2 interfaces to the Enforcement Revenue Information System (ERIS), ACA Compliance Validation (ACV 1.0), Individual Master File On Line (IMFOL), Integrated Production Model (IPM), and National Account Profile (NAP)
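
As background for the ETL qualification above, the following is a minimal sketch in Python of a format-agnostic extract/transform/load flow over JSON, XML, and CSV inputs. The file names and the account_id field are hypothetical and used only for illustration; production pipelines on this program would use the tooling named in this posting (e.g., Informatica and mainframe extracts) rather than the standard library.

    import csv
    import json
    import xml.etree.ElementTree as ET
    from pathlib import Path

    def extract(path: Path) -> list[dict]:
        """Read records from a JSON, XML, or CSV file into a list of dicts."""
        if path.suffix == ".json":
            return json.loads(path.read_text())  # assumes a JSON array of objects
        if path.suffix == ".xml":
            root = ET.parse(path).getroot()      # assumes <records><record>...</record></records>
            return [{field.tag: field.text for field in record} for record in root]
        if path.suffix == ".csv":
            with path.open(newline="") as f:
                return list(csv.DictReader(f))
        raise ValueError(f"unsupported format: {path.suffix}")

    def transform(rows: list[dict]) -> list[dict]:
        """Normalize a hypothetical account_id field to a stripped string."""
        return [{**row, "account_id": str(row.get("account_id", "")).strip()}
                for row in rows]

    def load(rows: list[dict], target: Path) -> None:
        """Write the normalized rows to a JSON staging file."""
        target.write_text(json.dumps(rows, indent=2))

    if __name__ == "__main__":
        load(transform(extract(Path("accounts.csv"))), Path("staging.json"))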

DESIRED QUALIFICATIONS:

  • Active IRS Moderate Risk Background Investigation (MBI) clearance.
  • Experience with the IRS's Individual Master File (IMF) and Customer Account Data Engine 2 (CADE 2)
  • AWS Certified Solutions Architect or AWS Certified Database – Specialty certification; alternatively, hands-on experience working with several of the following AWS services: EC2, S3, RDS, Aurora, Redshift, DocumentDB, ElastiCache, DynamoDB, QLDB, Athena, Glue, and EMR.
  • Experience building microservice architecture (MSA) solutions using Docker and Kubernetes.
  • Experience developing adaptable data ingestion and enrichment pipelines to support new and evolving data sources, formats, and schemas.
  • Experience developing data ingestion workflows with Spark Streaming, Kafka Streams, or Flink (see the sketch after this list).
  • 3+ years of experience using big data tools such as Hadoop, Spark, Kafka, and NiFi.
  • 3+ years of experience developing data solutions with Python.
  • Experience designing and implementing REST APIs and related integration services.
  • Familiarity with data lineage and provenance, data governance, and master data management (MDM).
  • Experience working with Databricks or Snowflake.
  • Experience supporting proposals and consulting engagements
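
To make the streaming-ingestion qualification above concrete, below is a minimal PySpark sketch (Python, Spark Structured Streaming) that reads JSON events from a Kafka topic and parses them against a schema. The broker address, topic name, and event fields are hypothetical, and the job assumes the spark-sql-kafka connector is on the Spark classpath; this is an illustrative sketch, not a statement of how the program's pipelines are built.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StringType, StructField, StructType

    # Minimal Structured Streaming job: ingest JSON events from Kafka,
    # parse them against a schema, and write the rows to the console.
    spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

    # Hypothetical event schema; a real pipeline would evolve this per source.
    schema = StructType([
        StructField("account_id", StringType()),
        StructField("event_type", StringType()),
    ])

    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
        .option("subscribe", "account-events")                # hypothetical topic
        .load()
        .select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    query = (
        events.writeStream
        .format("console")
        .outputMode("append")
        .start()
    )
    query.awaitTermination()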

CLEARANCE:

  • US Citizenship