ALQIMI | Hiring | Data Engineer | BigDataKB.com | 1/11/2022


ALQIMI

Gurgaon

ALQIMI is a global data, software and IT services company based in Washington DC, USA with offices around the world. For more than 20 years, ALQIMI has operated in demanding government agency and commercial environments, delivering a wide range of cutting-edge IT solutions that enable these organizations to achieve their missions and goals. ALQIMI's domain expertise includes large-scale enterprise computing, healthcare IT, big data engineering, software development, and artificial intelligence. ALQIMI was founded in 1997 and has maintained operations in Gurugram, India since 2004.

ALQIMI is currently seeking a Data Engineer to join its AOSEN development, engineering and operations team in India. AOSEN is the ALQIMI Open Standards Environment, a big data platform developed by ALQIMI as a complete end-to-end system-of-systems capable of open-source data collection, all-source data fusion, analytics, artificial intelligence, visualizations and more. Successful implementations of AOSEN and AOSEN-powered applications have taken place with the United States Air Force, Shady Grove Fertility Clinic, IBM, Janes, and Bajaj Capital. AOSEN is based on open-source technologies and the belief that the primary building blocks of a powerful big data platform should be developed internally. The selected candidate will be part of a 30+ person team of engineers, developers, analysts and scientists in Gurugram, India and Washington, DC, USA.

Roles and responsibilities of the Data Engineer:

  • Assemble large, complex data sets that meet functional/non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • In collaboration with managers, define and clarify project scope and schedule.
  • Develop plans and procedures to support the achievement of project objectives.
  • Lead discovery processes with project stakeholders to identify the business requirements and the expected outcome.
  • Create and maintain optimal data pipeline architecture.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, NoSQL, and other big data technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Work with key stakeholders to assist with data-related technical issues and support their data infrastructure needs.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing the product into an innovative industry leader.
  • Seek out new open data sets, create unique acquisition pipelines, and manage the flow of data into an advanced big data platform.
  • Identify what data is available and relevant, including internal and external data sources, leveraging new and open-source data collection such as geo-location and other open-source data, including social and news media.
  • List all activities in Monday.com and allot time frames for their completion. Ensure all development comments and remarks are recorded in Monday.com.
  • Prepare and present periodic written project progress reports to senior management.
  • Communicate with senior management and the project governance authorities (project board, steering committee, etc.) with the frequency and formality that is deemed necessary.
  • Provide a personal weekly activity report to senior management.
  • Manage risks and issues as they arise, escalating to senior business or technical roles as required.
  • Ensure all testing and review activity is properly scheduled and carried out.
  • Perform as an individual contributor, write software code as needed, and lead by example.
  • Support these activities in a business development environment with both presentations and white papers as needed.
  • Document all work, code, algorithms, and processes; may be asked from time to time to author white papers.

Technical Skills:
MUST Have –

  • NoSQL databases – MongoDB, Cassandra, etc.
  • SQL – traditional RDBMSs

GOOD To Have –

  • Core Java, Design patterns, etc.
  • Hadoop ecosystem – MapReduce, Pig, Hive, Impala, HBase, and Sqoop
  • Real-time processing in Spark using Kafka

Apply Here

