Data Engineer (Bid Role starting in 10 weeks) - TS/SCI w/ Poly
The Sponsor requires Data Engineering support to evaluate, optimize, and implement robust data infrastructure that enables reliable, accessible, and scalable data delivery across the organization. The Contractor will work collaboratively with data consumers, technical teams, leadership, and stakeholders to assess current data pipelines, identify gaps in data accessibility and reliability, and architect solutions that establish trusted data foundations. Work involves applying engineering best practices to implement proper data modeling and integration patterns, ensuring data quality and observability throughout pipelines, and creating maintainable infrastructure that supports analytics, reporting, and operational use cases.

The Sponsor's data landscape includes enterprise operational systems such as ServiceNow, network management platforms (NetIM), and network modeling tools (Forward Networks). The Contractor must be adept at extracting data from these systems via APIs (Application Programming Interfaces), exports, and vendor-specific interfaces, often with limited documentation or non-standard data structures, and at transforming this operational data into accessible, integrated datasets.

WORK REQUIREMENTS:
• The Contractor shall conduct comprehensive assessments of existing data pipelines, infrastructure, and data flows, including integrations with operational systems such as ServiceNow, network management platforms, and business applications, to identify technical debt, bottlenecks, and reliability issues.
• The Contractor shall evaluate the current data architecture against industry best practices and organizational needs, and develop technical recommendations and roadmaps for data infrastructure improvements.
• The Contractor shall design, build, and maintain production-grade data pipelines using orchestration tools such as Airflow or Prefect.
• The Contractor shall develop robust ETL (Extract-Transform-Load) / ELT (Extract-Load-Transform) processes from diverse sources: SaaS platforms, network management systems, databases, APIs, files, and streams.
• The Contractor shall build API integrations handling authentication (OAuth, API keys, and Single Sign-On (SSO)), rate limiting, pagination, retry logic, and error handling.
• The Contractor shall extract data from systems not designed for export, reverse-engineering undocumented data structures and relationships.
• The Contractor shall handle semi-structured data (JSON and XML) and transform it into structured datasets with consistent schemas.
• The Contractor shall design dimensional models, data warehouses, and data marts following industry methodologies.
• The Contractor shall create conceptual, logical, and physical data models optimized for query performance and storage efficiency.
• The Contractor shall implement slowly changing dimensions and other data warehousing patterns.
• The Contractor shall establish naming conventions, data standards, and modeling best practices.
• The Contractor shall implement comprehensive data quality checks, validation rules, and automated monitoring with alerting.
• The Contractor shall build error handling, failure recovery, logging, and observability into all processes.
• The Contractor shall optimize pipelines for performance, cost, and resource utilization.
• The Contractor shall develop reusable components and frameworks, and refactor legacy pipelines for reliability.
• The Contractor shall build and maintain data infrastructure on cloud platforms (Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP)) using infrastructure-as-code tools such as Terraform and CloudFormation.
• The Contractor shall implement CI/CD pipelines, version control (Git), and automated testing frameworks.
• The Contractor shall manage database performance tuning, indexing, partitioning, and capacity planning.
• The Contractor shall establish backup, recovery, security controls, access controls, and compliance measures.
• The Contractor shall partner with analysts, software developers, and business stakeholders to translate requirements into technical solutions.
• The Contractor shall create comprehensive documentation for systems, processes, and integrations.
• The Contractor shall provide technical guidance on data availability and proper usage, and enable self-service access.
• The Contractor shall troubleshoot pipeline failures, performance issues, and data discrepancies, and perform root cause analysis.

TRAVEL: Yes. Travel is anticipated for this contractor upon Sponsor approval for the following category: local travel/POV on an as-needed basis, within the local place of performance.
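The API-integration requirements above (pagination, retry logic, error handling) follow a common extraction pattern. The sketch below illustrates it in Python with a simulated endpoint; `fetch_page` and its page structure are hypothetical stand-ins, not any Sponsor system.

```python
import time

def fetch_page(page, _failures={"count": 0}):
    """Stand-in for an HTTP GET against a paginated API.
    Fails once on page 2 to exercise the retry path; an empty
    page signals the end of the result set."""
    if page == 2 and _failures["count"] == 0:
        _failures["count"] += 1
        raise ConnectionError("transient 503")
    data = {1: ["a", "b"], 2: ["c"], 3: []}
    return data.get(page, [])

def fetch_with_retry(page, retries=3, backoff=0.01):
    """Retry transient failures with exponential backoff."""
    for attempt in range(retries):
        try:
            return fetch_page(page)
        except ConnectionError:
            if attempt == retries - 1:
                raise
            time.sleep(backoff * 2 ** attempt)

def extract_all():
    """Walk pages until the API returns an empty page."""
    records, page = [], 1
    while True:
        batch = fetch_with_retry(page)
        if not batch:
            return records
        records.extend(batch)
        page += 1

print(extract_all())  # retries page 2 once, then returns ['a', 'b', 'c']
```

A production version would layer authentication (OAuth tokens, API keys) and rate-limit handling on top of the same loop.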
Required Skills:
1. Demonstrated experience building and managing data pipelines, and demonstrated experience with Python.
2. Demonstrated experience with cloud computing using AWS services, and demonstrated experience processing data using Apache Spark.
3. Demonstrated experience with an RDBMS (Relational Database Management System) such as Postgres, Oracle, or MySQL, and writing SQL queries.
4. Demonstrated experience with Linux and shell scripting.
5. Demonstrated experience analyzing data in different file formats such as CSV, XML, JSON, Avro, and Parquet.
6. Demonstrated experience writing and validating unit tests.
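Skills 5 and 6 often come together in practice: flattening a semi-structured record into a consistent schema, with a unit test validating the behavior. A minimal sketch (the field names and `flatten_ticket` helper are invented for the example, not from any actual system):

```python
import json

def flatten_ticket(raw: str) -> dict:
    """Flatten a nested JSON record (e.g. a ServiceNow-style
    export) into a flat row with a consistent schema."""
    rec = json.loads(raw)
    return {
        "id": rec["id"],
        "state": rec.get("state", "unknown"),
        "assignee": rec.get("assigned_to", {}).get("name"),
    }

def test_flatten_ticket():
    raw = '{"id": "INC001", "state": "open", "assigned_to": {"name": "jdoe"}}'
    assert flatten_ticket(raw) == {"id": "INC001", "state": "open", "assignee": "jdoe"}
    # Missing optional fields fall back to defaults rather than raising.
    assert flatten_ticket('{"id": "INC002"}')["state"] == "unknown"

test_flatten_ticket()
```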
Desired Skills:
1. Demonstrated experience with NiFi, Apache Airflow, or an equivalent tool for orchestrating data pipelines.
2. Demonstrated experience with Java or Scala.
3. Demonstrated experience administering an EMR/Spark cluster.
4. Demonstrated experience conducting performance tuning of a Spark job.
5. Demonstrated experience supporting Hive, Iceberg, or another technology providing SQL access to data.
6. Demonstrated experience developing cloud-based security solutions.
7. Demonstrated experience following a configuration management process to review and deploy code as part of releases.
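The orchestration tools named in skill 1 all model a pipeline as a directed acyclic graph of tasks. The core idea can be sketched in plain Python with the standard library's `graphlib` (Python 3.9+); the task names here are hypothetical, and a real orchestrator such as Airflow adds scheduling, retries, and parallelism on top of this ordering.

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on -- the same
# DAG model Airflow and NiFi use, with invented task names.
pipeline = {
    "extract_servicenow": set(),
    "extract_netim": set(),
    "transform": {"extract_servicenow", "extract_netim"},
    "quality_checks": {"transform"},
    "load_warehouse": {"quality_checks"},
}

def run(dag):
    """Execute tasks in dependency order: every task runs only
    after all of its upstream dependencies have completed."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")
    return order

order = run(pipeline)  # both extracts run before transform; load runs last
```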
Requirements
Active TS/SCI with Full Scope polygraph required prior to application.
Benefits
Leading Path is an award-winning Information Technology and Management Consulting firm focused on providing solutions in process, technology, and operations to our government and Fortune 500 clients. We offer a professional and family-friendly work environment with a strong work-life balance. Leading Path provides a comprehensive and competitive benefits package including fully paid medical/dental/vision premiums, generous PTO, 11 paid holidays, a 6% 401K contribution, annual training and tuition reimbursement, SPOT Award bonuses, regular team events, opportunities for professional growth and advancement, and much more!