Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
3+ years of hands-on data engineering experience, including DWH development.
Must have experience with end-to-end data warehouse implementation on Azure or GCP.
Must have strong SQL and PL/SQL skills, including writing complex queries and stored procedures.
Solid understanding of DWH concepts such as OLAP, ETL/ELT, RBAC, Data Modeling, Data-Driven Pipelines, Virtual Warehousing, and MPP.
Expertise in Databricks - Structured Streaming, Lakehouse Architecture, DLT, Data Modeling, Vacuum, Time Travel, Security, Monitoring, Dashboards, DBSQL, and Unit Testing.
Expertise in Snowflake - Monitoring, RBAC, Virtual Warehousing, Query Performance Tuning, and Time Travel.
Understanding of Apache Spark, Airflow, Hudi, Iceberg, Nessie, NiFi, Luigi, and Arrow (Good to have).
Strong foundations in computer science, data structures, algorithms, and programming logic.
Excellent logical reasoning and data interpretation skills.
Ability to interpret business requirements accurately.
Experience working with multicultural, international customers.