TS/SCI with Polygraph is Required.
- Shall have at least ten (10) years’ experience in software development/engineering, including requirements analysis, software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution.
- Shall have at least six (6) years of experience developing software using the Java programming language. At least four (4) years of this experience must have been obtained within the last seven (7) years.
- Shall have demonstrated work experience with distributed, scalable Big Data stores (NoSQL) such as HBase, CloudBase/Accumulo, BigTable, etc.
- Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.
- Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS).
- Shall have demonstrated work experience with serialization formats such as JSON and/or BSON.
- Shall have at least three (3) years’ experience developing software for Windows (2000, 2003, XP, Vista) or UNIX/Linux (Red Hat versions 3–5) operating systems.
- Shall have demonstrated work experience in the design and development of at least one object-oriented system.
- Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
- Shall have at least three (3) years’ experience in software integration and software testing, to include developing and implementing test plans and test scripts.
- Shall have demonstrated technical writing skills and shall have generated technical documents in support of a software development project.
- In addition, the candidate shall have demonstrated work experience in at least four (4) of the following desired characteristics:
- At least four (4) years of experience in either C or C++ Windows/Linux programming.
- Experience deploying applications in a cloud environment.
- Understanding of cloud scalability.
- Hadoop/Cloud certification.
- Experience designing and developing automated analytic software, techniques, and algorithms.
- Experience developing and deploying:
  - analytics that include foreign language processing;
  - analytic processes that incorporate/integrate multimedia technologies, including speech, text, image, and video exploitation;
  - analytics that function on massive data sets (e.g., more than a billion rows or larger than 10 petabytes);
  - analytics that employ semantic relationships (i.e., inference engines) between structured and unstructured data sets;
  - analytics that identify latent patterns between elements of massive data sets (e.g., more than a billion rows or larger than 10 petabytes);
  - analytics that employ techniques commonly associated with Artificial Intelligence (e.g., genetic algorithms).
- Experience with taxonomy construction for analytic disciplines, knowledge areas and skills.
- Experience developing and deploying: data-driven analytics; event-driven analytics; sets of analytics orchestrated through rules engines.
- Experience with linguistics (grammar, morphology, concepts).
- Experience developing and deploying analytics that discover and exploit social networks.
- Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application programming interfaces (APIs), and other technical specifications.
- Experience developing and deploying analytics within a heterogeneous schema environment.
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed above are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
We are an Equal Opportunity Employer.