Principal Dataflow Systems Engineer
Company: Fuse Engineering
Location: Annapolis, MD 21401
Description:
A SECURITY CLEARANCE AND POLYGRAPH ARE REQUIRED
The engineer shall be responsible for creating, sustaining, and troubleshooting complex operational data flows including data storage, data transport, data management, data security, data compliance, and knowledge store management.
Additional tasking includes working with the mission customer to perform exploratory data analysis on raw data in order to clean, enrich, transform, and convert it into the required formats. The contractor shall also be responsible for devising methods to improve the processing, distribution, and reliability of existing operational data flows.
Task Required Skills
Experience using the Unix CLI
Experience creating, managing, and troubleshooting complex operational data flows
Experience using Apache NiFi canvas to process and distribute data
Experience with Corporate data flow processes and tools
Experience with Corporate data security and compliance procedures and policies
Experience with the Atlassian Tool Suite (JIRA, Confluence, Bitbucket)
Task Desired Skills
General HPC technical knowledge regarding compute, network, memory, and storage components
Experience with the Elastic Stack (Elasticsearch/Kibana)
Experience with time-series visualization tools such as Grafana
Experience writing scripts using Bash/Python
Experience with IaC principles and automation tools such as Ansible (Puppet and SaltStack acceptable)
Requirements
Twenty (20) years' experience as a Systems Engineer (SE) in programs and contracts of similar scope, type, and complexity is required. Demonstrated experience in planning and leading Systems Engineering efforts is required. A Bachelor's degree in Systems Engineering, Computer Science, Information Systems, Engineering Science, Engineering Management, or a related discipline from an accredited college or university is required. Five (5) years of additional SE experience may be substituted for the Bachelor's degree.