TS/SCI with Polygraph is required.
- Shall have at least 8 years of general experience in software development/engineering, including requirements analysis, software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution. (Note: A bachelor’s degree in computer science, engineering, mathematics or a related discipline may be substituted for 4 years of general experience.)
- Shall have at least 6 years of experience developing software with high-level languages such as Java.
- Shall have at least 4 years of experience with distributed, scalable big data stores (NoSQL) such as HBase, CloudBase/Accumulo, Bigtable, etc., as well as 2 years of experience with the MapReduce programming model, the Hadoop Distributed File System (HDFS), and technologies such as Hadoop, Hive, Pig, etc.
- Shall have demonstrated work experience with serialization formats such as JSON and/or BSON.
- Shall have demonstrated work experience developing RESTful services and with the Ruby on Rails framework.
- Shall have at least 3 years of experience developing software in UNIX/Linux (Red Hat versions 3-5+) operating systems.
- Shall have demonstrated work experience in the design and development of at least one object-oriented system.
- Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
- Shall have at least 3 years of experience in software integration and software testing, to include developing and implementing test plans and test scripts.
- Shall have demonstrated technical writing skills and shall have generated technical documents in support of software development projects.
- In addition, the candidate shall have demonstrated experience (through work or college-level courses) in at least 2 of the following desired characteristics:
- Shall have demonstrated work experience with source code management tools (e.g., Git, Stash, or Subversion).
- Experience deploying applications in a cloud environment.
- Understanding of cloud scalability.
- Hadoop/Cloud Certification.
- Experience designing and developing automated analytic software, techniques, and algorithms.
- Experience developing and deploying analytics that: include foreign language processing; incorporate/integrate multi-media technologies, including speech, text, image, and video exploitation; function on massive data sets (for example, more than a billion rows or larger than 10 petabytes); employ semantic relationships (i.e., inference engines) between structured and unstructured data sets; identify latent patterns between elements of massive data sets; or employ techniques commonly associated with artificial intelligence.
- Experience with data formats/techniques such as ASDF; XML (Schema, XSLT, XQuery); parsers (StAX, SAX, DOM); protobuf; or Avro.
- Experience with taxonomy construction for analytic disciplines, knowledge areas, and skills.
- Experience developing and deploying data-driven analytics, event-driven analytics, and sets of analytics orchestrated through rules engines.
- Experience with linguistics (grammar, morphology, concepts).
- Experience developing and deploying analytics that discover and exploit social networks.
- Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application programming interfaces (APIs), and other technical specifications.
- Experience developing and deploying analytics within a heterogeneous schema environment.
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed above are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
We are an Equal Opportunity Employer.