Noodle.ai
Bangalore
Enterprise Software & Network Solutions
Noodle’s Data Engineers have a strong understanding of database structures, data modeling, and data warehousing techniques; they know how to create SQL queries, stored procedures, and views, and they define best practices for engineering scalable, secure data pipelines. Members of our Data Engineering team embrace the challenge of dealing with petabytes or even exabytes of data on a daily basis.
Roles and Responsibilities:
- Contribute to data engineering development work
- Collaborate with multiple stakeholders, including but not limited to the Infrastructure, DevOps, and Data Science teams
- Interface with customer-facing teams
- Support and monitor multiple data pipelines across different customers
- Work closely with the development teams to understand the changes in every release
Qualifications:
Must haves:
- 2+ years of experience in data pipeline development
- BE/B.Tech or an advanced degree in a relevant field (Computer Science, Engineering, or related disciplines)
- Good knowledge of the Python programming language
- Good knowledge of SQL (not limited to PostgreSQL)
- Experience with data pipeline orchestration tools such as Airflow
- Basic understanding of containers and familiarity with Docker commands
- Hands-on experience working with distributed data systems such as Spark, Hive, and HDFS
- Strong debugging skills
- Flexibility to learn new technologies and adapt to a dynamic environment
Nice to haves:
- Exposure to cloud platforms, preferably AWS (S3, EMR)
- Experience processing large-scale time series data
- Exposure to Kubernetes
- Understanding of ML model lifecycle and pipelines
- Experience with (and excitement for) interdisciplinary collaboration
WANT TO HELP SHAPE THE FUTURE OF ENTERPRISE ARTIFICIAL INTELLIGENCE®? LET’S NOODLE.


