POSITION: Software Developer II: Oracle Data Integrator (ODI)
LOCATION: Bangalore, Hyderabad, Chennai, Mumbai, Pune, Gurgaon, Kolkata
EXPERIENCE: 4+ years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic, fun culture of inclusion, collaboration, and high performance, HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level. So, what impact will you make?
Visit us @ https://hashedin.com

JOB TITLE: Software Developer II: Oracle Data Integrator (ODI)

OVERVIEW OF THE ROLE:
We are looking for an experienced Oracle Data Integrator (ODI) and Oracle Analytics Cloud (OAC) consultant to join our dynamic team. You will be responsible for designing, implementing, and optimizing cutting-edge data integration and analytics solutions. Your contributions will be pivotal in enhancing data-driven decision-making and delivering actionable insights across the organization.

Key Responsibilities:
• Develop robust data integration solutions using Oracle Data Integrator (ODI).
• Create, optimize, and maintain ETL/ELT workflows and processes.
• Configure and manage Oracle Analytics Cloud (OAC) to provide interactive dashboards and advanced analytics.
• Integrate and transform data from various sources to generate meaningful insights using OAC.
• Monitor and troubleshoot data pipelines and analytics solutions to ensure optimal performance.
• Ensure data quality, accuracy, and integrity across integration and reporting systems.
• Provide training and support to end users for OAC and ODI solutions.
• Analyze, design, develop, fix, and debug software programs for commercial or end-user applications; write code, complete programming, and perform testing and debugging of applications.

Technical Skills:
• Expertise in ODI components such as Topology, Designer, Operator, and Agent.
• Experience in Java and WebLogic development.
• Proficiency in developing OAC dashboards, reports, and KPIs.
• Strong knowledge of SQL and PL/SQL for advanced data manipulation.
• Familiarity with Oracle databases and Oracle Cloud Infrastructure (OCI).
• Experience in data modeling and designing data warehouses.
• Strong analytical and problem-solving abilities.
• Excellent communication and client-facing skills.
• Hands-on, end-to-end data warehouse (DWH) implementation experience using ODI.
• Experience in developing ETL processes (ETL control tables, error logging, auditing, data quality, etc.) and in implementing reusability, parameterization, workflow design, etc.; see the sketch at the end of this posting.
• Expertise in the Oracle ODI tool set and Oracle PL/SQL; knowledge of the ODI master and work repositories.
• Knowledge of data modeling and ETL design.
• Setting up Topology, building objects in Designer, monitoring through Operator, and working with different types of Knowledge Modules (KMs), Agents, etc.
• Packaging components and performing database operations (aggregate, pivot, union, etc.) using ODI mappings; error handling, automation using ODI, load plans, and migration of objects.
• Design and develop complex mappings, process flows, and ETL scripts.
• Experience in performance tuning of mappings.
• Ability to design ETL unit test cases and debug ETL mappings.
• Expertise in developing load plans and scheduling jobs.
• Ability to design data quality and reconciliation frameworks using ODI.
• Integrate ODI with multiple sources and targets.
• Experience in error recycling and error management using ODI and PL/SQL.
• Expertise in database development (SQL/PL/SQL) for PL/SQL-based applications.
• Experience creating PL/SQL packages, procedures, functions, triggers, views, materialized views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle.
• Experience in data migration using SQL*Loader and import/export.
• Experience in SQL tuning and optimization using explain plans and SQL trace files.
• Strong knowledge of ELT/ETL concepts, design, and coding.
• Partitioning and indexing strategies for optimal performance.
• Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents.
• Ability to work with minimal guidance or supervision in a time-critical environment.

Experience:
• 4-6 years of overall industry experience.
• 3+ years of experience with Oracle Data Integrator (ODI) in data integration projects.
• 2+ years of hands-on experience with Oracle Analytics Cloud (OAC).

Preferred Skills:
• Knowledge of Oracle Autonomous Data Warehouse (ADW) and Oracle Integration Cloud (OIC).
• Familiarity with other analytics tools such as Tableau or Power BI.
• Experience with scripting languages such as Python or shell scripting.
• Understanding of data governance and security best practices.

Educational Qualifications:
• Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
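To make the control-table and error-logging bullet above concrete, here is a minimal Python sketch using the python-oracledb driver. In practice an ODI load plan drives this auditing pattern through Knowledge Modules rather than hand-written code, so treat this only as an illustration of the pattern; the connection details and the etl_control, stg_orders, and dw_orders names are hypothetical placeholders.

```python
# Minimal sketch, assuming python-oracledb and a hypothetical control schema:
# every load writes one audit row to an ETL control table, capturing outcome,
# row volume, and any error text.
import oracledb

conn = oracledb.connect(user="etl", password="secret", dsn="localhost/XEPDB1")
cur = conn.cursor()

def audit_run(job_name, status, rows, error_msg):
    # One row per run: job, timestamp, outcome, volume, error details.
    cur.execute(
        """INSERT INTO etl_control (job_name, run_ts, status, row_count, error_msg)
           VALUES (:1, SYSTIMESTAMP, :2, :3, :4)""",
        [job_name, status, rows, error_msg],
    )
    conn.commit()

try:
    # The actual load: an ELT-style set-based insert pushed down to Oracle.
    cur.execute("INSERT INTO dw_orders SELECT * FROM stg_orders")
    audit_run("load_orders", "SUCCESS", cur.rowcount, None)
except oracledb.DatabaseError as exc:
    conn.rollback()
    audit_run("load_orders", "FAILED", 0, str(exc))
    raise
```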
POSITION: Senior Data Engineer / Data Engineer
LOCATION: Bangalore, Mumbai, Kolkata, Gurugram, Hyderabad, Pune, Chennai
EXPERIENCE: 2+ years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic, fun culture of inclusion, collaboration, and high performance, HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level. So, what impact will you make?
Visit us @ https://hashedin.com

JOB TITLE: Senior Data Engineer / Data Engineer

OVERVIEW OF THE ROLE:
As a Data Engineer or Senior Data Engineer, you will be hands-on in architecting, building, and optimizing robust, efficient, and secure data pipelines and platforms that power business-critical analytics and applications. You will play a central role in implementing and automating scalable batch and streaming data workflows using modern big data and cloud technologies. Working within cross-functional teams, you will deliver well-engineered, high-quality code and data models, and drive best practices for data reliability, lineage, quality, and security.

Mandatory Skills:
• Hands-on software coding or scripting for a minimum of 3 years.
• Experience in product management for at least 2 years.
• Stakeholder management experience for at least 3 years.
• Experience in at least one of the GCP, AWS, or Azure cloud platforms.

Key Responsibilities:
• Design, build, and optimize scalable data pipelines and ETL/ELT workflows using Spark (Scala/Python), SQL, and orchestration tools (e.g., Apache Airflow, Prefect, Luigi); see the sketch after this list.
• Implement efficient solutions for high-volume, batch, real-time streaming, and event-driven data processing, leveraging best-in-class patterns and frameworks.
• Build and maintain data warehouse and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift) to support analytics, data science, and BI workloads.
• Develop, automate, and monitor Airflow DAGs/jobs on cloud or Kubernetes, following robust deployment and operational practices (CI/CD, containerization, infrastructure-as-code).
• Write performant, production-grade SQL for complex data aggregation, transformation, and analytics tasks.
• Ensure data quality, consistency, and governance across the stack, implementing processes for validation, cleansing, anomaly detection, and reconciliation.
• Collaborate with Data Scientists, Analysts, and DevOps engineers to ingest, structure, and expose structured, semi-structured, and unstructured data for diverse use cases.
• Contribute to data modeling, schema design, and data partitioning strategies, and ensure adherence to best practices for performance and cost optimization.
• Implement, document, and extend data lineage, cataloging, and observability through tools such as AWS Glue, Azure Purview, Amundsen, or open-source technologies.
• Apply and enforce data security, privacy, and compliance requirements (e.g., access control, data masking, retention policies, GDPR/CCPA).
• Take ownership of the end-to-end data pipeline lifecycle: design, development, code reviews, testing, deployment, operational monitoring, and maintenance/troubleshooting.
• Contribute to frameworks, reusable modules, and automation to improve development efficiency and maintainability of the codebase.
• Stay abreast of industry trends and emerging technologies, participating in code reviews, technical discussions, and peer mentoring as needed.
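As a concrete illustration of the orchestration pattern named in the first responsibility, here is a minimal sketch assuming Airflow 2.4+ is installed; the DAG id, task bodies, and table name are hypothetical placeholders, not part of any actual pipeline.

```python
# Minimal sketch: a daily DAG wiring an extract task to a transform task,
# the extract-then-transform dependency a real ETL/ELT workflow would use.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # A real pipeline would land source data in staging storage here.
    print("extracting raw events to staging")

def transform():
    # A real pipeline would run Spark/SQL here to build curated tables.
    print("building curated.events_daily")

with DAG(
    dag_id="events_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # `schedule` superseded `schedule_interval` in Airflow 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # transform runs only after extract succeeds
```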
Skills & Experience:
• Proficiency with Spark (Python or Scala), SQL, and data pipeline orchestration (Airflow, Prefect, Luigi, or similar).
• Experience with cloud data ecosystems (AWS, GCP, Azure) and cloud-native services for data processing (Glue, Dataflow, Dataproc, EMR, HDInsight, Synapse, etc.).
• Hands-on development skills in at least one programming language (Python, Scala, or Java preferred); solid knowledge of software engineering best practices (version control, testing, modularity).
• Deep understanding of batch and streaming architectures (Kafka, Kinesis, Pub/Sub, Flink, Structured Streaming, Spark Streaming).
• Expertise in data warehouse/lakehouse solutions (Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse) and storage formats (Parquet, ORC, Delta, Iceberg, Avro).
• Strong SQL development skills for ETL, analytics, and performance optimization.
• Familiarity with Kubernetes (K8s), containerization (Docker), and deploying data pipelines in distributed/cloud-native environments.
• Experience with data quality frameworks (Great Expectations, Deequ, or custom validation), monitoring/observability tools, and automated testing; see the sketch at the end of this posting.
• Working knowledge of data modeling (star/snowflake, normalized, denormalized) and metadata/catalog management.
• Understanding of data security, privacy, and regulatory compliance (access management, PII masking, auditing, GDPR/CCPA/HIPAA).
• Familiarity with BI or visualization tools (Power BI, Tableau, Looker, etc.) is an advantage but not core.
• Previous experience with data migrations, modernization, or refactoring legacy ETL processes to modern cloud architectures is a strong plus.
• Bonus: exposure to open-source data tools (dbt, Delta Lake, Apache Iceberg, Amundsen, Great Expectations, etc.) and knowledge of DevOps/MLOps processes.

Professional Attributes:
• Strong analytical and problem-solving skills; attention to detail and commitment to code quality and documentation.
• Ability to communicate technical designs and issues effectively with team members and stakeholders.
• Proven self-starter, fast learner, and collaborative team player who thrives in dynamic, fast-paced environments.
• Passion for mentoring, sharing knowledge, and raising the technical bar for data engineering practices.

Desirable Experience:
• Contributions to open-source data engineering/tools communities.
• Implementing data cataloging, stewardship, and data democratization initiatives.
• Hands-on work with DataOps/DevOps pipelines for code and data.
• Knowledge of ML pipeline integration (feature stores, model serving, lineage/monitoring integration) is beneficial.

EDUCATIONAL QUALIFICATIONS:
• Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
• Certifications in cloud platforms (AWS, GCP, Azure) and/or data engineering (AWS Data Analytics, GCP Data Engineer, Databricks).
• Experience working in an Agile environment with exposure to CI/CD, Git, Jira, Confluence, and code review processes.
• Prior work in highly regulated or large-scale enterprise data environments (finance, healthcare, or similar) is a plus.
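To illustrate the custom-validation option named in the data quality bullet above, here is a minimal PySpark sketch; a framework such as Great Expectations or Deequ would formalize the same checks. The input path and column names are hypothetical placeholders.

```python
# Minimal sketch: assert basic data quality invariants (key presence,
# key uniqueness, a non-negativity business rule) over a curated table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-sketch").getOrCreate()
orders = spark.read.parquet("/data/curated/orders")  # hypothetical path

checks = {
    # Primary key must be present and unique.
    "null_order_ids": orders.filter(F.col("order_id").isNull()).count(),
    "duplicate_order_ids": orders.count()
        - orders.dropDuplicates(["order_id"]).count(),
    # Business rule: order amounts are never negative.
    "negative_amounts": orders.filter(F.col("amount") < 0).count(),
}

failed = {name: n for name, n in checks.items() if n > 0}
if failed:
    # A real pipeline would alert and quarantine the offending records.
    raise ValueError(f"data quality checks failed: {failed}")
print("all data quality checks passed")
```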
POSITION: Engineering Manager (Python/Generative AI)
LOCATION: Bengaluru, Gurugram, Kolkata, Hyderabad, Chennai, Pune
EXPERIENCE: 5-10 years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic, fun culture of inclusion, collaboration, and high performance, HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level. So, what impact will you make?
Visit us @ https://hashedin.com

EXPERIENCE/SKILLS FOR AN IDEAL CANDIDATE:
• Extensive experience (5-10 years) in backend architecture and design for scalable, distributed systems using Python and modern frameworks (FastAPI, Django REST Framework, Flask).
• Proficiency in building asynchronous code. Must have good knowledge of a web framework like FastAPI, DRF, or Flask, with specific, hands-on experience using asyncio to build scalable, I/O-bound services; see the sketch at the end of this posting.
• Proven expertise in architectural decision-making, evaluating trade-offs, and designing robust, maintainable solutions for complex business problems.
• Strong background in code review, enforcing best practices, mentoring team members, and driving continuous improvement in code quality and maintainability.
• Deep experience in multi-agent system workflow and design, including orchestration, communication protocols, and agent lifecycle management using frameworks such as LangChain, AutoGen, CrewAI, or similar.
• Advanced skills in database schema design and optimization, including relational (PostgreSQL, MySQL) and NoSQL databases, with a focus on scalability, normalization, and performance.
• Expertise in API design, including RESTful, asynchronous, and event-driven APIs, ensuring security, scalability, and maintainability.
• Experience in breaking down complex requirements into actionable tasks, creating detailed work breakdown structures, and aligning deliverables with business objectives.
• Strong skills in effort estimation, resource planning, and risk assessment, ensuring timely and predictable delivery of high-quality solutions.
• Strong proficiency with Python testing frameworks like pytest, with a focus on writing comprehensive unit, functional, and integration tests.
• Solid understanding of Python packaging, dependency management, and virtual environments, with hands-on experience using tools like Poetry, uv, pip, and virtualenv/venv.
• Strong understanding of SQL basics: reading and writing SQL queries, database interaction tools, schema design, and database optimization.
• Hands-on experience with Python data libraries (Pandas, NumPy).
• Good knowledge of API development and testing, including but not limited to HTTP, RESTful services, Postman, and allied cloud-based services like API Gateway.
• A keen eye for architecture: understands the trade-offs between architectural choices, both on a theoretical level and an applied level.
• Good exposure to LLM SDKs (e.g., OpenAI, Anthropic, Azure OpenAI, Google Gemini).
• Understanding of LLM orchestration and lifecycle management, including prompt engineering, agent state management, and debugging agentic loops.
• Familiarity with Retrieval-Augmented Generation (RAG) patterns and practical experience with vector databases (e.g., Pinecone, Weaviate, ChromaDB, or pgvector) for managing long-term memory and knowledge bases for agents.
• Strong grasp of agentic AI concepts, including the ability to design, build, and orchestrate autonomous agents that can reason, plan, and execute tasks using a predefined set of tools.
• Experience with multi-agent systems and frameworks (LangChain, AutoGen, Google ADK, CrewAI) or building complex chains and agentic workflows.
• Familiarity with emerging open standards for AI interoperability, including the Model Context Protocol (MCP) for secure agent-tool communication and the Agent2Agent (A2A) protocol for multi-agent collaboration.
• Strong understanding of at least one cloud platform (AWS, GCP, Azure) to deploy, manage, and scale applications.
• Strong proficiency with Git for version control, including hands-on experience with collaborative workflows on platforms like GitHub or Bitbucket (e.g., branching, pull/merge requests, and code reviews).
• Experience presenting Proof of Concept (POC) findings, including performance benchmarks, potential risks, and strategic recommendations, to both technical and non-technical stakeholders. Proven ability to translate successful POCs into well-architected, scalable, and production-ready solutions.
• Good to have: hands-on experience with AI coding assistants like GitHub Copilot and familiarity with agent development platforms such as Google's Agentspace or similar tools.

OVERVIEW OF THE ROLE:
This role serves as a role model for the application of the team's software development processes and deployment procedures, and actively contributes to the establishment of best practices and methodologies within the team.
• Own the end-to-end architecture and technical direction for multi-agent systems and AI-driven workflows.
• Lead code reviews, enforce best practices, and mentor team members to ensure high standards of code quality and maintainability.
• Collaborate with stakeholders to gather requirements, define technical specifications, and break down work into actionable tasks.
• Drive the design and optimization of database schemas and APIs, ensuring scalability, security, and performance.
• Provide accurate effort estimations, resource planning, and risk management for project deliverables.
• Present architectural decisions, technical findings, and recommendations to both technical and business audiences.
• Contribute to the establishment of best practices, methodologies, and a culture of innovation within the team.
• Ensure quality delivery through industry best practices and standards in building performant, scalable, and secure APIs and agentic workflows.
• Stay abreast of emerging technologies and frameworks, proactively applying them to solve evolving business challenges.

EDUCATIONAL QUALIFICATIONS: B.E./B.Tech, MCA, M.E./M.Tech
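To ground the asyncio/FastAPI and RAG bullets above, here is a minimal sketch of an async FastAPI retrieval endpoint of the kind an agentic RAG workflow might call. Everything here is hypothetical: a production system would use a real vector database (e.g., pgvector) and real embeddings from an LLM SDK; cosine similarity over random in-memory vectors stands in for both.

```python
# Minimal sketch: an async FastAPI service exposing naive top-k retrieval.
import asyncio

import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Toy "knowledge base": (doc name, embedding) pairs with random vectors.
rng = np.random.default_rng(seed=7)
DOCS = [("doc-%d" % i, rng.normal(size=8)) for i in range(32)]

class Query(BaseModel):
    embedding: list[float]  # pre-computed query embedding
    top_k: int = 3

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

@app.post("/retrieve")
async def retrieve(query: Query):
    # Simulate the async I/O hop a real vector-store client would make.
    await asyncio.sleep(0)
    q = np.asarray(query.embedding)
    scored = sorted(DOCS, key=lambda d: cosine(q, d[1]), reverse=True)
    return {"matches": [name for name, _ in scored[: query.top_k]]}
```

Run with, e.g., `uvicorn app:app` (assuming the file is named app.py) and POST an 8-dimensional embedding to /retrieve.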
POSITION: Tech Architect
LOCATION: Bengaluru, Chennai, Gurugram, Hyderabad, Pune, Mumbai, Kolkata
EXPERIENCE: 9+ years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

WHY SHOULD YOU JOIN US?
With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic, fun culture of inclusion, collaboration, and high performance, HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level.
Visit us @ https://hashedin.com

EXPERIENCE/SKILLS FOR AN IDEAL CANDIDATE:
• B.E/B.Tech, MCA, or M.E/M.Tech graduate with 9+ years of experience, including 4 years as an application architect or data architect.
• Java/Python/DE; GCP/AWS/Azure.
• Gen AI-enabled application design pattern knowledge is a value addition.
• Excellent technical background with a breadth of knowledge across analytics, cloud architecture, distributed applications, integration, API design, etc.
• Experience in technology stack selection and the definition of solution, technology, and integration architectures for small to mid-sized applications and cloud-hosted platforms.
• Strong understanding of various design and architecture patterns.
• Strong experience in developing scalable architecture.
• Experience implementing and governing software engineering processes, practices, tools, and standards for development teams.
• Proficient in effort estimation techniques; will actively support project managers and scrum masters in planning the implementation, and will work with test leads on the definition of an appropriate test strategy for the realization of a quality solution.
• Extensive experience as a technology/engineering subject matter expert, i.e., high-level solution definition, sizing, and RFI/RFP responses.
• Aware of the latest technology trends, engineering processes, practices, and metrics.
• Architecture experience with PaaS and SaaS platforms hosted on Azure, AWS, or GCP.
• Infrastructure sizing and design experience for on-premise and cloud-hosted platforms.
• Ability to understand the business domain and requirements and map them to technical solutions.
• Outstanding interpersonal skills; ability to connect and present to CXOs from client organizations.
• Strong leadership, business communication, consulting, and presentation skills.
• Positive, service-oriented personality.

OVERVIEW OF THE ROLE:
This role serves as a role model for the application of team software development processes and deployment procedures, and actively contributes to the establishment of best practices and methodologies within the team.
• Craft and deploy resilient APIs, bridging cloud infrastructure and software development with seamless API design, development, and deployment.
• Work at the intersection of infrastructure and software engineering by designing and deploying data and pipeline management frameworks built on top of open-source components, including Hadoop, Hive, Spark, HBase, Kafka streaming, Tableau, and Airflow, and other cloud-based data engineering services like S3, Redshift, Athena, Kinesis, etc.
• Collaborate with various teams to build and maintain the most innovative, reliable, secure, and cost-effective distributed solutions.
• Design and develop big data, real-time analytics, and streaming solutions using industry-standard technologies; see the sketch at the end of this posting.
• Deliver the most complex and valuable components of an application on time and as per the specifications.
• Play the role of a team lead; manage or influence a large portion of an account or a small project in its entirety, demonstrating an understanding of, and consistently incorporating, practical value with theoretical knowledge to make balanced technical decisions.
• Recognize requirements inconsistencies, accurately schedule and track progress to provide visibility, and proactively alert the team and reporting authority on the same. Good work breakdown and estimation.
• Write clear and concise specifications for outsourced work, create a work breakdown structure that uses existing services to deliver a functional implementation, and support the development team with significant product decisions; seen as a major contributor to the architecture, feature set, etc., of product releases.
• Actively participate in customer communication, presentations, and handling critical issues.
• Lead assigned client and company resources in performing their roles on time and within budget.
• An individual contributor who is a role model for the application of the team software development process and deployment process; someone who contributes to best practices and methodologies for the team.

EDUCATIONAL QUALIFICATIONS: B.E./B.Tech, MCA, M.E./M.Tech
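As an illustration of the real-time streaming work named above, here is a minimal PySpark Structured Streaming sketch. It assumes a local Kafka broker and a Spark installation with the spark-sql-kafka package on the classpath; the topic and server names are hypothetical placeholders.

```python
# Minimal sketch: read a Kafka topic as a stream and maintain a running
# windowed count per key, the shape of a real-time analytics pipeline.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("kafka-streaming-sketch")
         .getOrCreate())

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "events")  # hypothetical topic
          .load())

# Count events per key over 1-minute windows.
counts = (events
          .selectExpr("CAST(key AS STRING) AS key", "timestamp")
          .groupBy(F.window("timestamp", "1 minute"), "key")
          .count())

# Console sink for demonstration; production would target a lakehouse table.
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```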