Deloitte is hiring for the Data Engineer role | Apply Now

JOB ROLE: Data Engineer

Location - Chennai, Tamil Nadu; Gurugram, Haryana; Hyderabad, Telangana; Kolkata, West Bengal; Mumbai, Maharashtra; Pune, Maharashtra (India)

JOB DESCRIPTION:

  • Develop solutions with an Agile Development team.
  • Define, produce, test, review, and debug solutions.
  • Create component-based features and micro-frontends.
  • Perform database development with Postgres.
  • Create comprehensive unit test coverage in all layers.
  • Deploy solutions to Docker containers and Kubernetes.
  • Help build a team culture of autonomy and ownership.
  • Work with a Product Owner to refine stories into functional use cases and identify the work effort as tasks.
  • Participate in test case creation responsibilities and peer reviews prior to coding.
  • Review implementation plans of peers prior to their coding.
  • Demonstrate feature work at the end of each iteration.
  • Work from home when desired with infrequent visits to the office and limited travel for planning sessions.
  • Develop our ETL process into a robust, automated, production-quality solution, and lead its implementation and delivery (a minimal pipeline sketch follows this list).
  • Partner with the application engineering team to ensure the data model fits the needs of the solution, promoting best practices in its design from both a maintenance and a performance perspective.
  • Partner with the data science team to understand their needs when preparing large datasets for machine learning.
  • Help the team understand the execution plans of poorly written queries, and remediate performance problems by tuning queries and/or refining the data model to meet the needs of the business.
  • Build data systems and pipelines.
  • Evaluate business needs and objectives.
  • Explore ways to enhance the product and its pipelines.
  • Collaborate with the team.
  • Showcase skills and innovative ideas to the team biweekly.
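
For illustration only, here is a minimal sketch of the kind of automated ETL pipeline this role describes, assuming Airflow (listed under the required skills below); the DAG name, schedule, and task bodies are hypothetical placeholders, not part of the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder: pull raw rows from the source system."""
    ...


def transform():
    """Placeholder: clean and reshape the extracted data."""
    ...


def load():
    """Placeholder: write the transformed data into Postgres."""
    ...


with DAG(
    dag_id="example_etl",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",             # assumes Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run extract, then transform, then load, once per day.
    extract_task >> transform_task >> load_task
```

In practice, each task would be filled in with the extraction, transformation, and loading logic for the specific source systems and warehouse tables involved.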

Education Required:

  • B.Tech/M.Tech/MS/MBA

Skills Required:

  • Strong knowledge of Python & SQL.
  • Hands-on experience with SQL database design
  • Hands-on experience with, or knowledge of, Airflow.
  • Knowledge of Docker and Kubernetes
  • Experience with running containerized microservices.
  • Experience with Apache Spark or AWS EMR.
  • Experience with cloud platforms (AWS, Azure) with strong preference towards AWS.
  • Experience in Database design practices
  • Technical expertise with data warehouses or data lakes
  • Expertise in configuring and maintaining PostgreSQL.
  • Experience tuning queries and data models to produce the best execution plan (a query-plan inspection sketch follows this list).
  • Strong experience building data pipelines & ETL.
  • Experience working on an Agile Development team and delivering features incrementally.
  • Experience with Git repositories
  • Working knowledge of setting up builds and deployments
  • Experience with both Windows and Linux.
  • Experience demonstrating work to peers and stakeholders for acceptance
  • Ability to multi-task, adapt, and stay nimble within a team environment.
  • Strong communication, interpersonal, analytical and problem-solving skills.
  • Ability to communicate effectively with nontechnical stakeholders to define requirements.
  • Ability to quickly understand new client data environments and document the business logic that composes them.
  • Ability to integrate oneself into geographically dispersed teams and clients.
  • A passion for high-quality software.
  • Previous experience as a data engineer or in a similar role.
  • Eagerness to learn and seek new frameworks, technologies, and languages
  • Commitment to working with others and sharing knowledge on a regular basis
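
As a small illustration of the query-tuning skill above, the following sketch inspects a Postgres execution plan from Python, assuming the psycopg2 client; the table, columns, and connection string are hypothetical placeholders:

```python
import psycopg2

# Hypothetical aggregate query whose plan we want to inspect.
QUERY = """
    SELECT customer_id, SUM(total) AS revenue
    FROM orders
    WHERE order_date >= %s
    GROUP BY customer_id
"""

# Placeholder connection string; real credentials would come from config/secrets.
with psycopg2.connect("dbname=analytics user=etl_user") as conn:
    with conn.cursor() as cur:
        # EXPLAIN ANALYZE runs the query and returns the actual execution plan,
        # the starting point for spotting sequential scans or missing indexes.
        cur.execute("EXPLAIN ANALYZE " + QUERY, ("2024-01-01",))
        for (line,) in cur.fetchall():
            print(line)
```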
 
Preferred Skills:
  • Experience working with Azure DevOps, JIRA or similar project tracking software.
  • Experience working in a startup environment
  • Experience with data streaming tools such as Apache Kafka, AWS Kinesis, Spark Streaming, or similar (a minimal consumer sketch follows this list).
  • Experience with many other big data technologies at scale.
  • Experience with BigQuery or similar (Redshift, Snowflake, or other MPP databases).
  • Knowledge of when to use NoSQL versus a traditional RDBMS.
  • Experience with RDS in AWS is a big plus.
  • Kafka, RabbitMQ, or similar queueing technologies are a plus.
  • Experience with BI tools such as Tableau and Jaspersoft.
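
Purely as an illustration of the streaming experience mentioned above, here is a minimal consumer sketch assuming the kafka-python client; the topic name and broker address are placeholders:

```python
import json

from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "events",                             # hypothetical topic name
    bootstrap_servers="localhost:9092",   # placeholder broker address
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

# Each record would typically feed a downstream transformation or load step.
for message in consumer:
    print(message.value)
```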

SALARY: ₹10 LPA (expected)
