5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Data Engineer – Databricks, Delta Live Tables, Data Pipelines
Location: Bhopal / Hyderabad / Pune (On-site)
Experience Required: 5+ Years
Employment Type: Full-Time

Job Summary:
We are seeking a skilled and experienced Data Engineer with a strong background in designing and building data pipelines using Databricks and Delta Live Tables. The ideal candidate should have hands-on experience managing large-scale data engineering workloads and building scalable, reliable data solutions in cloud environments.

Key Responsibilities:
- Design, develop, and manage scalable and efficient data pipelines using Databricks and Delta Live Tables.
- Work with structured and unstructured data to enable analytics and reporting use cases.
- Implement data ingestion, transformation, and cleansing processes.
- Collaborate with Data Architects, Analysts, and Data Scientists to ensure data quality and integrity.
- Monitor data pipelines and troubleshoot issues to ensure high availability and performance.
- Optimize queries and data flows to reduce costs and increase efficiency.
- Ensure best practices in data security, governance, and compliance.
- Document architecture, processes, and standards.

Required Skills:
- Minimum 5 years of hands-on experience in data engineering.
- Proficient in Apache Spark, Databricks, Delta Lake, and Delta Live Tables.
- Strong programming skills in Python or Scala.
- Experience with cloud platforms such as Azure, AWS, or GCP.
- Proficient in SQL for data manipulation and analysis.
- Experience with ETL/ELT pipelines, data wrangling, and workflow orchestration tools (e.g., Airflow, ADF).
- Understanding of data warehousing, big data ecosystems, and data modeling concepts.
- Familiarity with CI/CD processes in a data engineering context.

Nice to Have:
- Experience with real-time data processing using tools like Kafka or Kinesis.
- Familiarity with machine learning model deployment in data pipelines.
- Experience working in an Agile environment.
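For candidates unfamiliar with Delta Live Tables, a minimal Python sketch of the kind of pipeline this role involves is shown below. It is illustrative only: the storage path, table names, and data-quality rule are invented, and the code runs only inside a Databricks DLT pipeline, where the `dlt` module and the `spark` session are provided.

```python
import dlt  # available only inside a Databricks Delta Live Tables pipeline
from pyspark.sql.functions import col

@dlt.table(comment="Raw orders ingested incrementally from cloud storage.")
def orders_raw():
    # Auto Loader picks up new files as they land (path is hypothetical).
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/orders")
    )

@dlt.table(comment="Cleansed orders for analytics and reporting.")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # rows failing the rule are dropped
def orders_clean():
    return dlt.read_stream("orders_raw").select(
        col("order_id"),
        col("customer_id"),
        col("amount").cast("double"),
    )
```

DLT infers the dependency between the two tables from the `dlt.read_stream` call and handles orchestration, retries, and data-quality metrics itself.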
Posted 6 days ago
4.0 - 6.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Title: Sr. Data Engineer
Location: Office-Based (Ahmedabad, India)

About Hitech
Hitech is a leading provider of Data, Engineering Services, and Business Process Solutions. With robust delivery centers in India and global sales offices in the USA, UK, and the Netherlands, we enable digital transformation for clients across industries including Manufacturing, Real Estate, and e-Commerce. Our Data Solutions practice integrates automation, digitalization, and outsourcing to deliver measurable business outcomes. We are expanding our engineering team and looking for an experienced Sr. Data Engineer to design scalable data pipelines, support ML model deployment, and enable insight-driven decisions.

Position Summary
We are seeking a Data Engineer / Lead Data Engineer with deep experience in data architecture, ETL pipelines, and advanced analytics support. This role is crucial for designing robust pipelines to process structured and unstructured data, integrate ML models, and ensure data reliability. The ideal candidate will be proficient in Python, R, SQL, and cloud-based tools, and possess hands-on experience in creating end-to-end data engineering solutions that support data science and analytics teams.

Key Responsibilities
- Design and optimize data pipelines to ingest, transform, and load data from diverse sources.
- Build programmatic ETL pipelines using SQL and related platforms.
- Understand complex data structures and perform data transformation effectively.
- Develop and support ML models such as Random Forest, SVM, Clustering, Regression, etc.
- Create and manage scalable, secure data warehouses and data lakes.
- Collaborate with data scientists to structure data for analysis and modeling.
- Define solution architecture for layered data stacks, ensuring high data quality.
- Develop design artifacts including data flow diagrams, models, and functional documents.
- Work with technologies such as Python, R, SQL, MS Office, and SageMaker.
- Conduct data profiling, sampling, and testing to ensure reliability.
- Collaborate with business stakeholders to identify and address data use cases.

Qualifications & Experience
- 4 to 6 years of experience in data engineering, ETL development, or database administration.
- Bachelor's degree in Mathematics, Computer Science, or Engineering (B.Tech/B.E.).
- Postgraduate qualification in Data Science or a related discipline preferred.
- Strong proficiency in Python, SQL, advanced MS Office tools, and R.
- Familiarity with ML concepts and integrating models into pipelines.
- Experience with NoSQL systems like MongoDB, Cassandra, or HBase.
- Knowledge of Snowflake, Databricks, and other cloud-based data tools.
- ETL tool experience and an understanding of data integration best practices.
- Data modeling skills for relational and NoSQL databases.
- Knowledge of Hadoop, Spark, and scalable data processing frameworks.
- Experience with SciKit, TensorFlow, PyTorch, GPT, PySpark, etc.
- Ability to build web scrapers and collect data from APIs.
- Experience with Airflow or similar tools for pipeline automation.
- Strong SQL performance-tuning skills in large-scale environments.

What We Offer
- Competitive compensation package based on skills and experience.
- Opportunity to work with international clients and contribute to high-impact data projects.
- Continuous learning and professional growth within a tech-forward organization.
- Collaborative and inclusive work environment.

If you're passionate about building data-driven infrastructure to fuel analytics and AI applications, we look forward to connecting with you.
Anand Soni
Hitech Digital Solutions
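As a rough illustration of the "programmatic ETL pipelines using SQL" this posting describes, here is a minimal, self-contained extract-transform-load sketch in Python; the file name, key column, and table name are assumptions, and SQLite stands in for a real warehouse.

```python
import sqlite3
from datetime import datetime, timezone

import pandas as pd

def run_etl(csv_path: str, db_path: str) -> int:
    # Extract: read raw records from a source file.
    df = pd.read_csv(csv_path)

    # Transform: deduplicate and drop rows missing the business key ("id" is assumed).
    df = df.drop_duplicates().dropna(subset=["id"])
    df["loaded_at"] = datetime.now(timezone.utc).isoformat()

    # Load: append into a staging table (SQLite used here as a stand-in warehouse).
    with sqlite3.connect(db_path) as conn:
        df.to_sql("staging_records", conn, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    print(run_etl("records.csv", "warehouse.db"), "rows loaded")
```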
Posted 6 days ago
7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Us
At Particleblack, we drive innovation through intelligent experimentation with Artificial Intelligence. Our multidisciplinary team of solution architects, data scientists, engineers, product managers, and designers collaborates with domain experts to deliver cutting-edge R&D solutions tailored to your business. Our ecosystem empowers rapid execution with plug-and-play tools, enabling scalable, AI-powered strategies that fast-track your digital transformation. With a focus on automation and seamless integration, we help you stay ahead, letting you focus on your core while we accelerate your growth.

Responsibilities & Qualifications

Data Architecture Design:
- Develop and implement scalable and efficient data architectures for batch and real-time data processing.
- Design and optimize data lakes, warehouses, and marts to support analytical and operational use cases.

ETL/ELT Pipelines:
- Build and maintain robust ETL/ELT pipelines to extract, transform, and load data from diverse sources.
- Ensure pipelines are highly performant, secure, and resilient enough to handle large volumes of structured and semi-structured data.

Data Quality and Governance:
- Establish data quality checks, monitoring systems, and governance practices to ensure the integrity, consistency, and security of data assets.
- Implement data cataloging and lineage tracking for enterprise-wide data transparency.

Collaboration with Teams:
- Work closely with data scientists and analysts to provide accessible, well-structured datasets for model development and reporting.
- Partner with software engineering teams to integrate data pipelines into applications and services.

Cloud Data Solutions:
- Architect and deploy cloud-based data solutions using platforms like AWS, Azure, or Google Cloud, leveraging services such as S3, BigQuery, Redshift, or Snowflake.
- Optimize cloud infrastructure costs while maintaining high performance.

Data Automation and Workflow Orchestration:
- Utilize tools like Apache Airflow, n8n, or similar platforms to automate workflows and schedule recurring data jobs.
- Develop monitoring systems to proactively detect and resolve pipeline failures.

Innovation and Leadership:
- Research and implement emerging data technologies and methodologies to improve team productivity and system efficiency.
- Mentor junior engineers, fostering a culture of excellence and innovation.

Required Skills:
- Experience: 7+ years of overall experience in data engineering roles, with at least 2+ years in a leadership capacity, and proven expertise in designing and deploying large-scale data systems and pipelines.
- Technical skills: proficiency in Python, Java, or Scala for data engineering tasks; strong SQL skills for querying and optimizing large datasets; experience with data processing frameworks like Apache Spark, Beam, or Flink; hands-on experience with ETL tools like Apache NiFi, dbt, or Talend; experience with pub/sub and stream processing using Kafka, Kinesis, or similar.
- Cloud platforms: expertise in one or more cloud platforms (AWS, Azure, GCP) with a focus on data-related services.
- Data modeling: strong understanding of data modeling techniques (dimensional modeling, star/snowflake schemas).
- Collaboration: proven ability to work with cross-functional teams and translate business requirements into technical solutions.

Preferred Skills:
- Familiarity with data visualization tools like Tableau or Power BI to support reporting teams.
- Knowledge of MLOps pipelines and collaboration with data scientists.
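Since this role calls for Airflow-based workflow orchestration, a minimal sketch of a daily DAG follows; it assumes Airflow 2.x, and the task bodies are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw records from the source system")  # placeholder

def transform():
    print("clean and reshape the records")  # placeholder

def load():
    print("write results to the warehouse")  # placeholder

# A linear extract -> transform -> load DAG, scheduled daily.
with DAG(
    dag_id="daily_elt_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```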
Posted 6 days ago
0.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Senior Data Engineer (Contract)
Location: Bengaluru, Karnataka, India

About the Role:
We're looking for an experienced Senior Data Engineer (6-8 years) to join our data team. You'll be key in building and maintaining our data systems on AWS. You'll use your strong skills in big data tools and cloud technology to help our analytics team get valuable insights from our data. You'll be in charge of the whole lifecycle of our data pipelines, making sure the data is good, reliable, and fast.

What You'll Do:
- Design and build efficient data pipelines using Spark / PySpark / Scala.
- Manage complex data processes with Airflow, creating workflows (DAGs) and fixing any issues with them.
- Clean, transform, and prepare data for analysis.
- Use Python for data tasks, automation, and building tools.
- Work with AWS services like S3, Redshift, EMR, Glue, and Athena to manage our data infrastructure.
- Collaborate closely with the Analytics team to understand what data they need and provide solutions.
- Help develop and maintain our Node.js backend, using TypeScript, for data services.
- Use YAML to manage the settings for our data tools.
- Set up and manage automated deployment processes (CI/CD) using GitHub Actions.
- Monitor and fix problems in our data pipelines to keep them running smoothly.
- Implement checks to ensure our data is accurate and consistent.
- Help design and build data warehouses and data lakes.
- Use SQL extensively to query and work with data in different systems.
- Work with streaming data using technologies like Kafka for real-time data processing.
- Stay updated on the latest data engineering technologies.
- Guide and mentor junior data engineers.
- Help create data management rules and procedures.

What You'll Need:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 6-8 years of experience as a Data Engineer.
- Strong skills in Spark and Scala for handling large amounts of data.
- Good experience with Airflow for managing data workflows and an understanding of DAGs.
- Solid understanding of how to transform and prepare data.
- Strong programming skills in Python for data tasks and automation.
- Proven experience working with AWS cloud services (S3, Redshift, EMR, Glue, IAM, EC2, and Athena).
- Experience building data solutions for Analytics teams.
- Familiarity with Node.js for backend development; experience with TypeScript for backend development is a plus.
- Experience using YAML for configuration management.
- Hands-on experience with GitHub Actions for automated deployment (CI/CD).
- Good understanding of data warehousing concepts.
- Strong database skills (OLAP/OLTP).
- Excellent command of SQL for data querying and manipulation.
- Experience with stream processing using Kafka or similar technologies.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work well independently and as part of a team.

Bonus Points:
- Familiarity with data lake technologies (e.g., Delta Lake, Apache Iceberg).
- Experience with other stream processing technologies (e.g., Flink, Kinesis).
- Knowledge of data management, data quality, statistics, and data governance frameworks.
- Experience with tools for managing infrastructure as code (e.g., Terraform).
- Familiarity with container technologies (e.g., Docker, Kubernetes).
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana).
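To make the Spark/PySpark-on-AWS work above concrete, here is a hedged sketch of a small batch job; the bucket names and columns are invented, and running against S3 assumes the cluster has the usual Hadoop S3A connector configured.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_batch").getOrCreate()

# Read a day's worth of raw JSON events from S3 (paths are hypothetical).
raw = spark.read.json("s3a://example-raw-bucket/orders/2024-06-01/")

# Deduplicate, filter bad rows, and derive a partition column.
clean = (
    raw.dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("created_at"))
)

# Partitioned Parquet output keeps downstream Athena/Redshift Spectrum scans cheap.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-curated-bucket/orders/"
)
```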
Posted 6 days ago
7.5 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must have skills: proficiency in the Databricks Unified Data Analytics Platform.
- Experience with data pipeline orchestration tools such as Apache Airflow or similar.
- Strong understanding of ETL processes and data warehousing concepts.
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.
Posted 6 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond.

At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Your work profile
As a Consultant/Senior Consultant/Manager in our Technology & Transformation practice, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. To do this, the following qualifications and skills are desired:
- Good hands-on experience in GCP services including BigQuery, Cloud Storage, Dataflow, Cloud Dataproc, Cloud Composer/Airflow, and IAM.
- Must have proficient experience with GCP databases: Bigtable, Spanner, Cloud SQL, and AlloyDB.
- Proficiency in SQL and in Python, Java, or Scala for data processing and scripting.
- Experience in development and test automation processes through the CI/CD pipeline (Git, Jenkins, SonarQube, Artifactory, Docker containers).
- Experience in orchestrating data processing tasks using tools like Cloud Composer or Apache Airflow.
- Strong understanding of data modeling, data warehousing, and big data processing concepts.
- Solid understanding and experience of relational database concepts and technologies such as SQL, MySQL, PostgreSQL, or Oracle.
- Design and implement data migration strategies for various database types (PostgreSQL, Oracle, AlloyDB, etc.).
- Deep understanding of at least one database type, with the ability to write complex SQL.
- Experience with NoSQL databases such as MongoDB, Scylla, Cassandra, or DynamoDB is a plus.
- Optimize data pipelines for performance and cost-efficiency, adhering to GCP best practices.
- Implement data quality checks, data validation, and monitoring mechanisms to ensure data accuracy and integrity.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Ability to work independently and manage multiple priorities effectively.
- Expertise in end-to-end DW implementation is preferred.
- UG: B.Tech/B.E. in any specialization.

Location and way of working:
- Base location: Bengaluru/Hyderabad/Mumbai/Bhubaneshwar/Coimbatore/Delhi
- This profile involves occasional travelling to client locations.
- Hybrid is our default way of working. Each domain has customized the hybrid approach to their unique needs.

Your role as a Consultant/Senior Consultant/Manager
We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society.
In addition to living our purpose, Consultants/Senior Consultants/Managers across our organization must strive to be:
- Inspiring - Leading with integrity to build inclusion and motivation.
- Committed to creating purpose - Creating a sense of vision and purpose.
- Agile - Achieving high-quality results through collaboration and team unity.
- Skilled at building diverse capability - Developing diverse capabilities for the future.
- Persuasive / Influencing - Persuading and influencing stakeholders.
- Collaborating - Partnering to build new solutions.
- Delivering value - Showing commercial acumen.
- Committed to expanding business - Leveraging new business opportunities.
- Analytical Acumen - Leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization.
- Effective communication - Able to hold well-structured and well-articulated conversations to achieve win-win possibilities.
- Engagement Management / Delivery Excellence - Effectively managing engagements to ensure timely and proactive execution, as well as course correction for the success of the engagement.
- Managing change - Responding to a changing environment with resilience.
- Managing Quality & Risk - Delivering high-quality results and mitigating risks with utmost integrity and precision.
- Strategic Thinking & Problem Solving - Applying a strategic mindset to solve business issues and complex problems.
- Tech Savvy - Leveraging ethical technology practices to deliver high impact for clients and for Deloitte.
- Empathetic leadership and inclusivity - Creating a safe and thriving environment where everyone is valued for who they are, using empathy to understand others and adapting our behaviors and attitudes to become more inclusive.

How you'll grow

Connect for impact
Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.

Empower to lead
You can be a leader irrespective of your career level. Our colleagues are characterized by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.

Inclusion for all
At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.

Drive your career
At Deloitte, you are encouraged to take ownership of your career. We recognize there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone's welcome... entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.

Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable.
To help you with your interview, we suggest that you do your research, know some background about the organization and the business area you're applying to. Check out recruiting tips from Deloitte professionals.
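For a flavor of the BigQuery work this profile describes, a minimal query using the official google-cloud-bigquery Python client is sketched below; the project, dataset, and table names are invented, and application-default credentials are assumed.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Aggregate a hypothetical orders table; all names are placeholders.
query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `example-project.sales.orders`
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row.customer_id, row.total_spend)
```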
Posted 6 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond.

At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Deloitte's Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management, and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Your work profile
As a Senior Consultant/Manager in our Technology & Transformation practice, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations.

Roles and Responsibilities
The Data Engineer will work on data engineering projects for various business units, focusing on the delivery of complex data management solutions by leveraging industry best practices. They work with the project team to build the most efficient data pipelines and data management solutions that make data easily available for consuming applications and analytical solutions. A Data Engineer is expected to possess strong technical skills.

Key Characteristics
- Technology champion who constantly pursues skill enhancement and has an inherent curiosity to understand work from multiple dimensions.
- Interest and passion in Big Data technologies, and an appreciation of the value an effective data management solution can bring.
- Has worked on real data challenges and handled high volume, velocity, and variety of data.
- Excellent analytical and problem-solving skills, with a willingness to take ownership and resolve technical challenges.
- Contributes to community-building initiatives like CoE and CoP.

Mandatory skills
- Azure - Master
- ELT - Skill
- Data Modeling - Skill
- Data Integration & Ingestion - Skill
- Data Manipulation and Processing - Skill
- GitHub, GitHub Actions, Azure DevOps - Skill
- Data Factory, Databricks, SQL DB, Synapse, Stream Analytics, Glue, Airflow, Kinesis, Redshift, SonarQube, PyTest - Skill

Optional skills
- Experience in project management and running a scrum team.
- Experience working with BPC and Planning.
- Exposure to working with an external technical ecosystem.
- MkDocs documentation.

Location and way of working
- Base location: Bengaluru
- This profile involves occasional travelling to client locations.
- Hybrid is our default way of working. Each domain has customized the hybrid approach to their unique needs.

Your role as a Senior Consultant/Manager
We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society.
In addition to living our purpose, Senior Consultants/Managers across our organization must strive to be:
- Inspiring - Leading with integrity to build inclusion and motivation.
- Committed to creating purpose - Creating a sense of vision and purpose.
- Agile - Achieving high-quality results through collaboration and team unity.
- Skilled at building diverse capability - Developing diverse capabilities for the future.
- Persuasive / Influencing - Persuading and influencing stakeholders.
- Collaborating - Partnering to build new solutions.
- Delivering value - Showing commercial acumen.
- Committed to expanding business - Leveraging new business opportunities.
- Analytical Acumen - Leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization.
- Effective communication - Able to hold well-structured and well-articulated conversations to achieve win-win possibilities.
- Engagement Management / Delivery Excellence - Effectively managing engagements to ensure timely and proactive execution, as well as course correction for the success of the engagement.
- Managing change - Responding to a changing environment with resilience.
- Managing Quality & Risk - Delivering high-quality results and mitigating risks with utmost integrity and precision.
- Strategic Thinking & Problem Solving - Applying a strategic mindset to solve business issues and complex problems.
- Tech Savvy - Leveraging ethical technology practices to deliver high impact for clients and for Deloitte.
- Empathetic leadership and inclusivity - Creating a safe and thriving environment where everyone is valued for who they are, using empathy to understand others and adapting our behaviors and attitudes to become more inclusive.

How you'll grow

Connect for impact
Our exceptional team of professionals across the globe are solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.

Empower to lead
You can be a leader irrespective of your career level. Our colleagues are characterized by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.

Inclusion for all
At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.

Drive your career
At Deloitte, you are encouraged to take ownership of your career. We recognize there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone's welcome... entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.

Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable.
To help you with your interview, we suggest that you do your research, know some background about the organization and the business area you're applying to. Check out recruiting tips from Deloitte professionals.
Posted 6 days ago
1.0 - 4.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Role: Data Scientist
Experience: 1 to 4 years
Work Mode: WFO / Hybrid / Remote (if applicable)
Immediate joiners preferred.

About the Role
We are building an AI-powered workforce intelligence platform that helps businesses optimize talent strategies, enhance decision-making, and drive operational efficiency. Our software leverages cutting-edge AI, NLP, and data science to extract meaningful insights from vast amounts of structured and unstructured workforce data. As part of our new AI team, you will have the opportunity to work on real-world AI applications, contribute to innovative NLP solutions, and gain hands-on experience in building AI-driven products from the ground up.

Required Skills & Qualifications
- Strong experience in Python programming.
- 1-3 years of experience in Data Science/NLP (freshers with strong NLP projects are welcome).
- Proficiency in Python, PyTorch, Scikit-learn, and NLP libraries (NLTK, SpaCy, Hugging Face).
- Basic knowledge of cloud platforms (AWS, GCP, or Azure).
- Experience with SQL for data manipulation and analysis.
- Assist in designing, training, and optimizing ML/NLP models using PyTorch, NLTK, Scikit-learn, and Transformer models (BERT, GPT, etc.).
- Familiarity with MLOps tools like Airflow, MLflow, or similar.
- Experience with Big Data processing (Spark, Pandas, or Dask).
- Help deploy AI/ML solutions on AWS, GCP, or Azure.
- Collaborate with engineers to integrate AI models into production systems.
- Expertise in using SQL and Python to clean, preprocess, and analyze large datasets.
- Stay updated with the latest advancements in NLP, AI, and ML frameworks.
- Strong analytical and problem-solving skills.
- Willingness to learn, experiment, and take ownership in a fast-paced startup environment.

Nice-to-Have Qualities
- Desire to grow within the company; a team player and quick learner.
- Performance-driven, with strong networking and outreach skills.
- Ability to communicate and collaborate with the team at ease.
- Drive to get results and not let anything get in your way.
- Critical and analytical thinking skills, with keen attention to detail.
- Demonstrated ownership and a striving for excellence in everything you do.
- A high level of curiosity; keeps abreast of the latest technologies and tools.
- Ability to pick up new software easily, represent yourself to peers, and coordinate during meetings with customers.

What We Offer
- A market-leading salary along with a comprehensive benefits package to support your well-being.
- A hybrid or remote work setup that prioritizes work-life balance and personal well-being.
- Investment in your career through continuous learning and internal growth opportunities.
- A dynamic, inclusive, and vibrant workplace where your contributions are recognized and rewarded.
- Straightforward policies, open communication, and a supportive work environment where everyone thrives.

(ref:hirist.tech)
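As a taste of the NLP work described above, here is a tiny text-classification baseline with scikit-learn; the four training examples and their labels are toy data invented for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Toy workforce-text data; real training sets would be far larger.
texts = [
    "great leadership and stakeholder management",
    "expert in python and pytorch",
    "manages budgets and team planning",
    "builds and deploys ml models",
]
labels = ["soft", "tech", "soft", "tech"]

# TF-IDF features feeding a linear classifier: a standard NLP baseline.
model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(texts, labels)

print(model.predict(["writes python services"]))  # -> ['tech'] on this toy data
```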
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a highly skilled and hands-on Data Engineer to join Controls Technology to support the design, development, and implementation of our next-generation Data Mesh and Hybrid Cloud architecture. This role is critical in building scalable, resilient, and future-proof data pipelines and infrastructure that enable the seamless integration of Controls Technology data within a unified platform. The Data Engineer will work closely with the Data Mesh and Cloud Architect Lead to implement data products, ETL/ELT pipelines, hybrid cloud integrations, and governance frameworks that support data-driven decision-making across the enterprise.

Key Responsibilities:

Data Pipeline Development:
- Design, build, and optimize ETL/ELT pipelines for structured and unstructured data.
- Develop real-time and batch data ingestion pipelines using distributed data processing frameworks.
- Ensure pipelines are highly performant, cost-efficient, and secure.

Apache Iceberg & Starburst Integration:
- Work extensively with Apache Iceberg for data lake storage optimization and schema evolution.
- Manage Iceberg Catalogs and ensure seamless integration with query engines.
- Configure and maintain Hive MetaStore (HMS) for Iceberg-backed tables and ensure proper metadata management.
- Utilize Starburst and Stargate to enable distributed SQL-based analytics and seamless data federation.
- Optimize performance tuning for large-scale querying and federated access to structured and semi-structured data.

Data Mesh Implementation:
- Implement Data Mesh principles by developing domain-specific data products that are discoverable, interoperable, and governed.
- Collaborate with data domain owners to enable self-service data access while ensuring consistency and quality.

Hybrid Cloud Data Integration:
- Develop and manage data storage, processing, and retrieval solutions across AWS and on-premise environments.
- Work with cloud-native tools such as AWS S3, RDS, Lambda, Glue, Redshift, and Athena to support scalable data architectures.
- Ensure hybrid cloud data flows are optimized, secure, and compliant with organizational standards.

Data Governance & Security:
- Implement data governance, lineage tracking, and metadata management solutions.
- Enforce security best practices for data encryption, role-based access control (RBAC), and compliance with policies such as GDPR and CCPA.

Performance Optimization & Monitoring:
- Monitor and optimize data workflows, performance tuning of queries, and resource utilization.
- Implement logging, alerting, and monitoring solutions using CloudWatch, Prometheus, or Grafana to ensure system health.

Collaboration & Documentation:
- Work closely with data architects, application teams, and business units to ensure seamless integration of data solutions.
- Maintain clear documentation of data models, transformations, and architecture for internal reference and governance.

Required Technical Skills:

Programming & Scripting:
- Strong proficiency in Python, SQL, and shell scripting; experience with Scala or Java is a plus.

Data Processing & Storage:
- Hands-on experience with Apache Spark, Kafka, Flink, or similar distributed processing frameworks.
- Strong knowledge of relational (PostgreSQL, MySQL, Oracle) and NoSQL databases (DynamoDB, MongoDB).
- Expertise in Apache Iceberg for managing large-scale data lakes, schema evolution, and ACID transactions.
- Experience working with Iceberg Catalogs, Hive MetaStore (HMS), and integrating Iceberg-backed tables with query engines.
- Familiarity with Starburst and Stargate for federated querying and cross-platform data access.

Cloud & Hybrid Architecture:
- Experience working with AWS data services (S3, Redshift, Glue, Athena, EMR, RDS).
- Understanding of hybrid data storage and integration between on-prem and cloud environments.

Infrastructure as Code (IaC) & DevOps:
- Experience with Terraform, AWS CloudFormation, or Kubernetes for provisioning infrastructure.
- CI/CD pipeline experience using GitHub Actions, Jenkins, or GitLab CI/CD.

Data Governance & Security:
- Familiarity with data cataloging, lineage tracking, and metadata management.
- Understanding of RBAC, IAM roles, encryption, and compliance frameworks (GDPR, SOC2, etc.).

Required Soft Skills:
- Problem-Solving & Analytical Thinking - Ability to troubleshoot complex data issues and optimize workflows.
- Collaboration & Communication - Comfortable working with cross-functional teams and articulating technical concepts to non-technical stakeholders.
- Ownership & Proactiveness - Self-driven, detail-oriented, and able to take ownership of tasks with minimal supervision.
- Continuous Learning - Eager to explore new technologies, improve skill sets, and stay ahead of industry trends.

Qualifications:
- 4-6 years of experience in data engineering, cloud infrastructure, or distributed data processing.
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related field.
- Hands-on experience with data pipelines, cloud services, and large-scale data platforms.
- Strong foundation in SQL, Python, Apache Iceberg, Starburst, cloud-based data solutions (AWS preferred), and Apache Airflow orchestration.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Data Architecture
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law.

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi.

View Citi's EEO Policy Statement and the Know Your Rights poster.
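A hedged sketch of the Iceberg-on-Spark setup this posting centers on is below; the catalog, namespace, and table names are invented, and it assumes the Iceberg Spark runtime JAR is on the cluster and a Hive MetaStore is reachable.

```python
from pyspark.sql import SparkSession

# Register an Iceberg catalog backed by a Hive MetaStore (names are placeholders).
spark = (
    SparkSession.builder.appName("iceberg_sketch")
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hive")
    .getOrCreate()
)

# Iceberg tables support schema evolution and hidden partitioning.
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo.finance.trades (
        trade_id BIGINT,
        symbol   STRING,
        qty      INT,
        ts       TIMESTAMP
    ) USING iceberg
    PARTITIONED BY (days(ts))
""")

# ACID row-level operations work directly against the table.
spark.sql("DELETE FROM demo.finance.trades WHERE qty = 0")
```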
Posted 1 week ago
12.0 - 20.0 years
35 - 60 Lacs
Mumbai
Work from Office
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Join the innovative team at Kyndryl as a Client Technical Solutioner and unlock your potential to shape the future of technology solutions. As a key player in our organization, you will embark on an exciting journey where you get to work closely with customers, understand their unique challenges, and provide them with cutting-edge technical solutions and services. Picture yourself as a trusted advisor – collaborating directly with customers to unravel their business needs, pain points, and technical requirements. Your expertise and deep understanding of our solutions will empower you to craft tailored solutions that address their specific challenges and drive their success.

Your role as a Client Technical Solutioner is pivotal in developing domain-specific solutions for our cutting-edge services and offerings. You will be at the forefront of crafting tailored domain solutions and cost cases for both simple and complex, long-term opportunities, demonstrating we meet our customers' requirements while helping them overcome their business challenges. At Kyndryl, we believe in the power of collaboration, and your expertise will be essential in supporting our Technical Solutioning and Solutioning Managers during customer technology and business discussions, even at the highest levels of Business/IT Director/LOB. You will have the chance to demonstrate the value of our solutions and products, effectively communicating their business and technical benefits to decision makers and customers.

In this role, you will thrive as you create innovative technical solutions that align with industry trends and exceed customer expectations. Your ability to collaborate seamlessly with internal stakeholders will enable you to gather the necessary documents and technical insights to deliver compelling bid submissions. Not only will you define winning cost models for deals, but you will also lead these deals to profitability, ensuring the ultimate success of both our customers and Kyndryl. You will play an essential role in contract negotiations, up to the point of signature, and facilitate a smooth engagement hand-over process.

As the primary source of engagement management and solution design within your technical domain, you will compile, refine, and take ownership of final solution documents. Your technical expertise will shine through as you present these documents in a professional and concise manner, showcasing your mastery of the subject matter. You'll have the opportunity to contribute to the growth and success of Kyndryl by standardizing our go-to-market pitches across various industries. By creating differentiated propositions that align with market requirements, you will position Kyndryl as a leader in the industry, opening new avenues of success for our customers and our organization.

Join us as a Client Technical Solutioner at Kyndryl and unleash your potential to shape the future of technical solutions while enjoying a stimulating and rewarding career journey filled with innovation, collaboration, and growth.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career.
We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.

Required Skills and Experience
- 10-15 years of experience (Specialist Seller / Consultant) is a must, with 3-4 years of relevant experience in Gen AI / Agentic AI.
- Proven past experience in Analytics.
- Real-world experience in the design and implementation of scalable, fault-tolerant, and secure architectures for Analytics on at least one of the major hyperscalers (AWS / Azure / GCP).
- Excellent communication skills to engage with clients and influence decisions.
- High level of competence in preparing architectural documentation and presentations.
- Must be organized, self-sufficient, and able to manage multiple initiatives simultaneously.
- Must have the ability to coordinate with other teams and vendors independently.
- Deep knowledge of services offerings and technical solutions in a practice.
- Demonstrated experience translating distinctive technical knowledge into actionable customer insights and solutions.
- Prior consultative selling experience.
- Externally recognized as an expert in the technology and/or solutioning areas, including technical certifications supporting subdomain focus area(s).
- Responsible for prospecting and qualifying leads, and conducting the relevant product/market research independently in response to a customer's requirement or pain point.
- Advising and shaping client requirements to produce high-level designs and technical solutions in response to opportunities and requirements from customers and partners.
- Work with both internal and external stakeholders to identify business requirements and develop solutions that meet those requirements / build the opportunity.
- Understand and analyze the application requirements in client RFPs.
- Design software applications based on the requirements, within specified architectural guidelines and constraints.
- Lead, design, and implement proofs of concept and pilots to demonstrate the solution to clients and prospects.

Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value.
Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
OpenShift / Kubernetes Administrator with 6-10 years of experience in the OpenShift Platform and Kubernetes (K8s) distributions such as EKS, AKS, and GKE. The role primarily focuses on administering and supporting the OpenShift Container Platform ecosystem, including managing container tenant provisioning, isolation, and capacity. The administrator will work directly with infrastructure-as-code-based automation to manage the capacity of the overall platform and to deliver new capacity and capabilities as necessary. A strong background in Linux administration, virtualization, networking, and security is required to fulfill this role successfully and to provide first-class Level 3 support. An understanding of application development lifecycles, as well as practical experience with continuous integration and continuous deployment tools as part of the container lifecycle, will be useful. Good experience with DevOps tools such as CI/CD pipelines, Jenkins, GitLab, and Airflow is needed, along with experience developing system lifecycle processes to manage and operate the infrastructure. In-depth knowledge of automation tools such as Terraform, Ansible, and StackStorm is expected, as is good knowledge of open APIs and integration with an orchestration tool (CMP).
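By way of illustration of day-to-day platform administration, here is a small health-check sketch using the official Kubernetes Python client; the namespace and restart threshold are arbitrary assumptions.

```python
from kubernetes import client, config

# Outside the cluster use kubeconfig; inside a pod use config.load_incluster_config().
config.load_kube_config()
v1 = client.CoreV1Api()

# Flag pods in a (hypothetical) tenant namespace with many container restarts.
for pod in v1.list_namespaced_pod(namespace="data-platform").items:
    statuses = pod.status.container_statuses or []
    restarts = sum(cs.restart_count for cs in statuses)
    if restarts > 5:
        print(f"{pod.metadata.name}: {restarts} restarts - investigate")
```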
Posted 1 week ago
5.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
- Working with multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc.
- Applying hands-on Python and SQL experience; being proactive, collaborative, and able to respond to critical situations.
- Analyzing data for functional business requirements and interfacing directly with customers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer.
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations you have delivered; you should understand the purpose/KPIs for which each data transformation was done.

Preferred technical and professional experience:
- Experience with AEM core technologies: OSGi services, Apache Sling, the Granite framework, the Java Content Repository API, Java 8+, and localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
- Knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript, and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
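A minimal Apache Beam pipeline of the sort run on Dataflow is sketched below; the bucket paths are placeholders, and it runs locally on the DirectRunner unless Dataflow options are supplied.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Pass --runner=DataflowRunner plus --project/--region/--temp_location for GCP.
opts = PipelineOptions()

with beam.Pipeline(options=opts) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "DropEmpty" >> beam.Filter(lambda fields: fields and fields[0])
        | "Format" >> beam.Map(",".join)
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/part")
    )
```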
Posted 1 week ago
3.0 - 7.0 years
11 - 15 Lacs
Mumbai
Work from Office
A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.

Grade Specific
An expert on the principles and practices associated with data platform engineering, particularly within cloud environments, who demonstrates proficiency in specific technical areas related to cloud-based data infrastructure, automation, and scalability. Key responsibilities encompass:
- Team Leadership and Management: Supervising a team of platform engineers, with a focus on team dynamics and the efficient delivery of cloud platform solutions.
- Technical Guidance and Decision-Making: Providing technical leadership and making pivotal decisions concerning platform architecture, tools, and processes, balancing hands-on involvement with strategic oversight.
- Mentorship and Skill Development: Guiding team members through mentorship, enhancing their technical proficiencies, and nurturing a culture of continual learning and innovation in platform engineering practices.
- In-Depth Technical Proficiency: Possessing a comprehensive understanding of platform engineering principles and practices, and demonstrating expertise in crucial technical areas such as cloud services, automation, and system architecture.
- Community Contribution: Making significant contributions to the development of the platform engineering community, staying informed about emerging trends, and applying this knowledge to drive enhancements in capability.

Skills (competencies)
Posted 1 week ago
5.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyze data for functional business requirements and to interface directly with customers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- A proactive individual with an ability to manage change and proven time management skills.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
Posted 1 week ago
5.0 - 7.0 years
5 - 9 Lacs
Kochi
Work from Office
Job Title: + +
Management Level:
Location: Kochi, Coimbatore, Trivandrum
Must have skills: Databricks, including Spark-based ETL and Delta Lake
Good to have skills: PySpark

Job Summary
We are seeking a highly skilled and experienced Senior Data Engineer to join our growing Data and Analytics team. The ideal candidate will have deep expertise in Databricks and cloud data warehousing, with a proven track record of designing and building scalable data pipelines, optimizing data architectures, and enabling robust analytics capabilities. This role involves working collaboratively with cross-functional teams to ensure the organization leverages data as a strategic asset.

Roles and Responsibilities
- Design, build, and maintain scalable data pipelines and ETL processes using Databricks and other modern tools.
- Architect, implement, and manage cloud-based data warehousing solutions on Databricks (Lakehouse Architecture).
- Develop and maintain optimized data lake architectures to support advanced analytics and machine learning use cases.
- Collaborate with stakeholders to gather requirements, design solutions, and ensure high-quality data delivery.
- Optimize data pipelines for performance and cost efficiency.
- Implement and enforce best practices for data governance, access control, security, and compliance in the cloud.
- Monitor and troubleshoot data pipelines to ensure reliability and accuracy.
- Lead and mentor junior engineers, fostering a culture of continuous learning and innovation.
- Excellent communication skills; ability to work independently and alongside clients based out of Western Europe.

Professional and Technical Skills
- 3.5-5 years of experience in Data Engineering roles with a focus on cloud platforms.
- Proficiency in Databricks, including Spark-based ETL, Delta Lake, and SQL.
- Strong experience with one or more cloud platforms (AWS preferred).
- Hands-on experience with Delta Lake, Unity Catalog, and Lakehouse architecture concepts.
- Strong programming skills in Python and SQL; experience with PySpark is a plus.
- Solid understanding of data modeling concepts and practices (e.g., star schema, dimensional modeling).
- Knowledge of CI/CD practices and version control systems (e.g., Git).
- Familiarity with data governance and security practices, including GDPR and CCPA compliance.

Additional Information
- Experience with Airflow or similar workflow orchestration tools.
- Exposure to machine learning workflows and MLOps.
- Certification in Databricks or AWS.
- Familiarity with data visualization tools such as Power BI.

Qualification
- Experience: 3.5-5 years of experience is required.
- Educational Qualification: Graduation.
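To ground the Spark-based ETL and Delta Lake skills above, a small sketch follows; the paths are invented, and outside Databricks it assumes the delta-spark package and its Spark session extensions are configured.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_sketch").getOrCreate()

# Write raw events out as a Delta table (paths are hypothetical).
events = spark.read.json("/data/raw/events/")
events.write.format("delta").mode("overwrite").save("/data/delta/events")

# Delta's transaction log enables "time travel" to earlier table versions.
first_version = (
    spark.read.format("delta").option("versionAsOf", 0).load("/data/delta/events")
)
print(first_version.count())
```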
Posted 1 week ago
15.0 - 20.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Neo4j, Stardog
Good to have skills: Java
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Neo4j.
- Good To Have Skills: Experience with Java.
- Strong understanding of data modeling and graph database concepts.
- Experience with data integration tools and ETL processes.
- Familiarity with data quality frameworks and best practices.
- Proficient in programming languages such as Python or Scala for data manipulation.
Additional Information:
- The candidate should have minimum 5 years of experience in Neo4j.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
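For context, a minimal sketch of the kind of Neo4j loading work this role describes, using the official Python driver (5.x API). The connection URI, credentials, and data model below are hypothetical placeholders.

```python
# Minimal sketch of loading records into Neo4j with the official Python driver.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def upsert_customer(tx, customer_id, name):
    # MERGE makes the load idempotent: re-running creates no duplicates.
    tx.run(
        "MERGE (c:Customer {id: $id}) SET c.name = $name",
        id=customer_id, name=name,
    )

with driver.session() as session:
    for cid, name in [("c-1", "Asha"), ("c-2", "Ravi")]:
        session.execute_write(upsert_customer, cid, name)  # driver 5.x API

driver.close()
```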
Posted 1 week ago
5.0 - 10.0 years
4 - 8 Lacs
Chennai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Your day will involve working on data architecture and engineering tasks to support business operations and decision-making.
Roles & Responsibilities:
- Expected to be an SME, collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Develop and maintain data pipelines for efficient data processing.
- Implement ETL processes to ensure seamless data migration and deployment.
- Collaborate with cross-functional teams to design and optimize data solutions.
- Conduct data quality assessments and implement improvements for data integrity.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data architecture principles.
- Experience in designing and implementing data solutions.
- Proficient in SQL and other data querying languages.
- Knowledge of cloud platforms such as AWS or Azure.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- A 15 years full-time education is required.
Qualification: 15 years full time education
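As a small illustration of the data quality assessments this posting mentions, here is a sketch of simple rule-based checks on a Spark DataFrame. The table name and rules are assumptions for illustration, not the employer's actual checks.

```python
# Minimal sketch of rule-based data quality checks on a Spark DataFrame.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.read.table("curated.orders")  # hypothetical table

total = df.count()
checks = {
    "null_order_id": df.filter(F.col("order_id").isNull()).count(),
    "negative_amount": df.filter(F.col("amount") < 0).count(),
    "duplicate_order_id": total - df.dropDuplicates(["order_id"]).count(),
}

for name, failures in checks.items():
    status = "PASS" if failures == 0 else f"FAIL ({failures}/{total} rows)"
    print(f"{name}: {status}")
```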
Posted 1 week ago
15.0 - 20.0 years
4 - 8 Lacs
Pune
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Neo4j, Stardog
Good to have skills: Java
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Neo4j.
- Good To Have Skills: Experience with Java.
- Strong understanding of data modeling and graph database concepts.
- Experience with data integration tools and ETL processes.
- Familiarity with data quality frameworks and best practices.
- Proficient in programming languages such as Python or Scala for data manipulation.
Additional Information:
- The candidate should have minimum 5 years of experience in Neo4j.
- This position is based at our Pune office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 1 week ago
6.0 - 11.0 years
15 - 30 Lacs
Hyderabad
Hybrid
Job Title: Data Engineer / Integration Engineer
Job Summary: We are seeking a highly skilled Data Engineer / Integration Engineer to join our team. The ideal candidate will have expertise in Python, workflow orchestration, cloud platforms (GCP/Google BigQuery), big data frameworks (Apache Spark or similar), API integration, and Oracle EBS. The role involves designing, developing, and maintaining scalable data pipelines, integrating various systems, and ensuring data quality and consistency across platforms. Knowledge of Ascend.io is a plus.
Key Responsibilities:
Design, build, and maintain scalable data pipelines and workflows.
Develop and optimize ETL/ELT processes using Python and workflow automation tools.
Implement and manage data integration between various systems, including APIs and Oracle EBS.
Work with Google Cloud Platform (GCP) or Google BigQuery (GBQ) for data storage, processing, and analytics.
Utilize Apache Spark or similar big data frameworks for efficient data processing.
Develop robust API integrations for seamless data exchange between applications.
Ensure data accuracy, consistency, and security across all systems.
Monitor and troubleshoot data pipelines, identifying and resolving performance issues.
Collaborate with data analysts, engineers, and business teams to align data solutions with business goals.
Document data workflows, processes, and best practices for future reference.
Required Skills & Qualifications:
Strong proficiency in Python for data engineering and workflow automation.
Experience with workflow orchestration tools (e.g., Apache Airflow, Prefect, or similar).
Hands-on experience with Google Cloud Platform (GCP) or Google BigQuery (GBQ).
Expertise in big data processing frameworks, such as Apache Spark.
Experience with API integrations (REST, SOAP, GraphQL) and handling structured/unstructured data.
Strong problem-solving skills and ability to optimize data pipelines for performance.
Experience working in an agile environment with CI/CD processes.
Strong communication and collaboration skills.
Preferred Skills & Nice-to-Have:
Experience with the Ascend.io platform for data pipeline automation.
Knowledge of SQL and NoSQL databases.
Familiarity with Docker and Kubernetes for containerized workloads.
Exposure to machine learning workflows is a plus.
Why Join Us? Opportunity to work on cutting-edge data engineering projects. Collaborative and dynamic work environment. Competitive compensation and benefits. Professional growth opportunities with exposure to the latest technologies.
How to Apply: Interested candidates can apply by sending their resume to 8892751405 / deekshith.naidu@estuate.com or through Naukri.
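To illustrate the workflow orchestration skills this role asks for, here is a minimal Airflow DAG sketch wiring an extract-transform-load sequence. Task names and logic are placeholders, not the employer's actual pipeline.

```python
# Minimal sketch of an orchestrated ETL workflow as an Airflow DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source APIs / Oracle EBS")  # placeholder logic

def transform():
    print("clean and reshape the extracted data")     # placeholder logic

def load():
    print("load results into BigQuery")               # placeholder logic

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency: extract, then transform, then load
```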
Posted 1 week ago
3.0 years
0 Lacs
Greater Chennai Area
On-site
Who You'll Work With
Driving lasting impact and building long-term capabilities with our clients is not easy work. You are the kind of person who thrives in a high performance/high reward culture - doing hard things, picking yourself up when you stumble, and having the resilience to try another way forward. In return for your drive, determination, and curiosity, we'll provide the resources, mentorship, and opportunities you need to become a stronger leader faster than you ever thought possible. Your colleagues—at all levels—will invest deeply in your development, just as much as they invest in delivering exceptional results for clients. Every day, you'll receive apprenticeship, coaching, and exposure that will accelerate your growth in ways you won't find anywhere else.
When you join us, you will have:
Continuous learning: Our learning and apprenticeship culture, backed by structured programs, is all about helping you grow while creating an environment where feedback is clear, actionable, and focused on your development. The real magic happens when you take the input from others to heart and embrace the fast-paced learning experience, owning your journey.
A voice that matters: From day one, we value your ideas and contributions. You'll make a tangible impact by offering innovative ideas and practical solutions. We not only encourage diverse perspectives, but they are critical in driving us toward the best possible outcomes.
Global community: With colleagues across 65+ countries and over 100 different nationalities, our firm's diversity fuels creativity and helps us come up with the best solutions for our clients. Plus, you'll have the opportunity to learn from exceptional colleagues with diverse backgrounds and experiences.
World-class benefits: On top of a competitive salary (based on your location, experience, and skills), we provide a comprehensive benefits package, which includes medical, dental, mental health, and vision coverage for you, your spouse/partner, and children.
Your Impact
As a Data Engineer I at McKinsey & Company, you will play a key role in designing, building, and deploying scalable data pipelines and infrastructure that enable our analytics and AI solutions. You will work closely with product managers, developers, asset owners, and client stakeholders to turn raw data into trusted, structured, and high-quality datasets used in decision-making and advanced analytics.
Your core responsibilities will include:
Developing robust, scalable data pipelines for ingesting, transforming, and storing data from multiple structured and unstructured sources using Python/SQL.
Creating and optimizing data models and data warehouses to support reporting, analytics, and application integration.
Working with cloud-based data platforms (AWS, Azure, or GCP) to build modern, efficient, and secure data solutions.
Contributing to R&D projects and internal asset development.
Contributing to infrastructure automation and deployment pipelines using containerization and CI/CD tools.
Collaborating across disciplines to integrate data engineering best practices into broader analytical and generative AI (gen AI) workflows.
Supporting and maintaining data assets deployed in client environments with a focus on reliability, scalability, and performance.
Furthermore, you will have the opportunity to explore and contribute to solutions involving generative AI, such as vector embeddings, retrieval-augmented generation (RAG), semantic search, and LLM-based prompting, especially as we integrate gen AI capabilities into our broader data ecosystem.
Your Qualifications and Skills
Bachelor's degree in computer science, engineering, mathematics, or a related technical field (or equivalent practical experience).
3+ years of experience in data engineering, analytics engineering, or a related technical role.
Strong Python programming skills with demonstrated experience building scalable data workflows and ETL/ELT pipelines.
Proficient in SQL with experience designing normalized and denormalized data models.
Hands-on experience with orchestration tools such as Airflow, Kedro, or Azure Data Factory (ADF).
Familiarity with cloud platforms (AWS, Azure, or GCP) for building and managing data infrastructure.
Discernible communication skills, especially around breaking down complex structures into digestible and relevant points for a diverse set of clients and colleagues, at all levels.
High-value personal qualities, including critical thinking and creative problem-solving skills; an ability to influence and work in teams.
Entrepreneurial mindset and ownership mentality are a must; desire to learn and develop within a dynamic, self-led organization.
Hands-on experience with containerization technologies (Docker, Docker Compose).
Hands-on experience with automation frameworks (GitHub Actions, CircleCI, Jenkins, etc.).
Exposure to generative AI tools or concepts (e.g., OpenAI, Cohere, embeddings, vector databases).
Experience working in Agile teams and contributing to design and architecture discussions.
Contributions to open-source projects or active participation in data engineering communities.
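As an aside on the semantic search and vector embedding concepts mentioned above, here is a toy sketch of similarity-based retrieval (the retrieval step in RAG). The embedding values are invented purely for illustration; in practice they would come from an embedding model or a vector database.

```python
# Toy sketch: rank documents by cosine similarity between embedding vectors.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 4-dimensional embeddings for three documents and one query.
doc_embeddings = {
    "pipeline_runbook": np.array([0.9, 0.1, 0.0, 0.2]),
    "hr_policy":        np.array([0.0, 0.8, 0.5, 0.1]),
    "spark_tuning":     np.array([0.8, 0.0, 0.1, 0.4]),
}
query = np.array([0.85, 0.05, 0.05, 0.3])  # e.g., "how do I tune my ETL job?"

# Retrieve documents ranked by similarity (the "R" in RAG).
ranked = sorted(doc_embeddings.items(),
                key=lambda kv: cosine_similarity(query, kv[1]),
                reverse=True)
for name, vec in ranked:
    print(name, round(cosine_similarity(query, vec), 3))
```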
Posted 1 week ago
2.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Job Title: S&C Global Network - AI - CMT DE - Consultant
Management Level: 9 - Consultant
Location: Open
Must-have skills: Data Engineering
Good to have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills.
Job Summary: We are looking for a passionate and results-driven Data Engineer to join our growing data team. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure that support data-driven decision-making across the organization.
Roles & Responsibilities:
Design, build, and maintain robust, scalable, and efficient data pipelines (ETL/ELT).
Work with structured and unstructured data across a wide variety of data sources.
Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
Optimize data systems and architecture for performance, scalability, and reliability.
Monitor data quality and support initiatives to ensure clean, accurate, and consistent data.
Develop and maintain data models and metadata.
Implement and maintain best practices in data governance, security, and compliance.
Professional & Technical Skills:
2+ years in data engineering or related fields.
Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
Strong programming skills in Python, Scala, or Java.
Experience with big data technologies such as Spark, Hadoop, or Hive.
Familiarity with cloud platforms like AWS, Azure, or GCP, especially services like S3, Redshift, BigQuery, or Azure Data Lake.
Experience with orchestration tools like Airflow, Luigi, or similar.
Solid understanding of data warehousing concepts and data modeling techniques.
Good problem-solving skills and attention to detail.
Experience with modern data stack tools like dbt, Snowflake, or Databricks.
Knowledge of CI/CD pipelines and version control (e.g., Git).
Exposure to containerization (Docker, Kubernetes) and infrastructure as code (Terraform, CloudFormation).
Additional Information: The ideal candidate will possess a strong educational background in a quantitative discipline and experience working with Hi-Tech clients. This position is based at our Bengaluru office (preferred) and other Accenture AI locations.
About Our Company | Accenture
Qualification - Experience: 4+ years. Educational Qualification: B.Tech/BE
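For illustration, here is a minimal sketch of a small ETL step of the kind this role describes: extract from CSV, transform with pandas, load into a relational database. SQLite keeps the example self-contained; in the roles above the target would typically be PostgreSQL, Redshift, or BigQuery, and the file, table, and column names are assumptions.

```python
# Minimal sketch: CSV -> pandas transform -> relational table.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite:///warehouse.db")  # self-contained target DB

# Extract (hypothetical source file).
df = pd.read_csv("sales_raw.csv")

# Transform: normalise column names, drop bad rows, derive a column.
df.columns = [c.strip().lower() for c in df.columns]
df = df.dropna(subset=["order_id"])
df["revenue"] = df["quantity"] * df["unit_price"]

# Load: replace the target table with the cleaned data.
df.to_sql("sales_clean", engine, if_exists="replace", index=False)
```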
Posted 1 week ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
You Lead the Way. We’ve Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.
How will you make an impact in this role?
There are hundreds of opportunities to make your mark on technology and life at American Express. Here’s just some of what you’ll be doing:
Lead the design, architecture, and implementation of GCP-based infrastructure projects.
Collaborate with IT, development, and operations teams to integrate cloud strategies and architecture.
Responsible for all technical aspects of software development for assigned applications - performs hands-on coding, technical design and development of systems. Develops code specifications. Involved in unit, integration and user acceptance testing. Assists in resolution of defects/production support issues as required.
Analyze requirements/user stories to appropriately support basic design activities.
Perform core technical aspects of software development for assigned applications, including developing prototypes and writing new code.
Function as an active member of an agile team through consistent development practices (tools, common components, and documentation).
Participate in or lead integration tests as defined in the test specifications, including event logging and reporting of results.
Perform assigned unit and assembly testing of software components.
Participate in code reviews and execute assigned automated build test scripts.
Debug software components, identifying, fixing and verifying the remediation of code defects.
Work on assigned product features for ongoing sprints and manage a subset of technical requirements based on industry trends, new technologies, known defects, and issues.
Identify opportunities to adopt innovative technologies.
Flexibility and willingness to learn and work in any technology platform/product.
Own all technical aspects of software development architecture for the assigned applications.
Collaborate with the rest of the engineering team to design and launch new features.
Minimum Qualifications
Bachelor’s Degree in CS or CSE or equivalent.
5-8 years of expertise in Google Cloud Platform (GCP) with Python.
Expertise in GCP services including Pub/Sub, Cloud Run, Cloud Functions, Dataproc, BigQuery and Airflow/Cloud Composer.
Hands-on experience in building integrations/interfaces based on web services (SOAP and REST using JSON, XML), file-based interfaces (batch processing), and databases (SQL and PL/SQL).
Hands-on experience with security concepts such as API security, encryption, Vault, and masking.
Experience with web services, open API development and related concepts.
Deep knowledge of microservice architecture.
Preferred Qualifications
Functional knowledge in the finance/procure-to-pay domain.
Experience in Google Cloud Platform (GCP).
Knowledge of collaboration tools (GitHub, Confluence, Rally).
Experience in Continuous Integration and Deployment (Jenkins).
Knowledge of eCP and cloud hosting platforms.
Co-Pilot knowledge.
Agile/SAFe practices in building software.
We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. Benefits include:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
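As a small illustration of the Pub/Sub experience this posting lists, here is a sketch of publishing an event with the google-cloud-pubsub client. The project, topic, and payload are hypothetical placeholders.

```python
# Minimal sketch: publish a JSON event to a Google Cloud Pub/Sub topic.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "payments-events")  # hypothetical

event = {"invoice_id": "INV-1001", "status": "approved"}

# Messages are raw bytes; keyword arguments become string attributes
# that subscribers can use for filtering and routing.
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    source="erp",
)
print("published message id:", future.result())  # blocks until acknowledged
```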
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
_VOIS Intro
About _VOIS: _VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, _VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.
_VOIS Centre Intro
About _VOIS India: In 2009, _VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, _VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations and HR Operations and more.
Key Responsibilities
Design and Build Data Pipelines: Develop scalable data pipelines using AWS services like AWS Glue, Amazon Redshift, and S3. Create efficient ETL processes for data extraction, transformation, and loading into data warehouses and lakes.
Build and manage applications using Python, SQL, Databricks, and various AWS technologies.
Utilize QuickSight to create insightful data visualizations and dashboards.
Quickly develop innovative Proof-of-Concept (POC) solutions to address emerging needs.
Provide support and manage the ongoing operation of data services.
Automate repetitive tasks and build reusable frameworks to improve efficiency.
Work with teams to design and develop data products that support marketing and other business functions.
Ensure data services are reliable, maintainable, and seamlessly integrated with existing systems.
Required Skills and Experience
Bachelor’s degree in Computer Science, Engineering, or a related field.
Technical Skills: Proficiency in Python with Pandas and PySpark.
Hands-on experience with AWS services including S3, Glue, Lambda, API Gateway, and SQS.
Knowledge of data processing tools like Spark, Hive, Kafka, and Airflow.
Experience with batch job scheduling and managing data dependencies.
Experience with QuickSight or similar tools.
Familiarity with DevOps automation tools like GitLab, Bitbucket, Jenkins, and Maven.
Understanding of Delta Lake would be an added advantage.
_VOIS Equal Opportunity Employer Commitment India
_VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics.
As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we’ll be in touch!
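To make the S3-centred pipeline work above concrete, here is a minimal boto3-plus-pandas sketch: read a raw CSV object, aggregate it, and write the result back. Bucket names, keys, and columns are placeholders, not details from the posting.

```python
# Minimal sketch: read a CSV from S3, transform with pandas, write it back.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Extract: fetch the raw object from S3 (hypothetical bucket/key).
obj = s3.get_object(Bucket="raw-zone", Key="usage/2024-06-01.csv")
df = pd.read_csv(io.BytesIO(obj["Body"].read()))

# Transform: keep active rows and aggregate usage per customer.
summary = (df[df["active"]]
           .groupby("customer_id", as_index=False)["usage_mb"].sum())

# Load: write the result to the curated zone.
buf = io.StringIO()
summary.to_csv(buf, index=False)
s3.put_object(Bucket="curated-zone", Key="usage/2024-06-01_summary.csv",
              Body=buf.getvalue())
```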
Posted 1 week ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
You Lead the Way. We’ve Got Your Back.
With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.
About Enterprise Architecture:
Enterprise Architecture is an organization within the Chief Technology Office at American Express and it is a key enabler of the company’s technology strategy. The four pillars of Enterprise Architecture include:
Architecture as Code: this pillar owns and operates foundational technologies that are leveraged by engineering teams across the enterprise.
Architecture as Design: this pillar includes the solution and technical design for transformation programs and business critical projects which need architectural guidance and support.
Governance: this pillar is responsible for defining technical standards, and developing innovative tools that automate controls to ensure compliance.
Colleague Enablement: this pillar is focused on colleague development, recognition, training, and enterprise outreach.
Responsibilities:
Designing, developing, and maintaining scalable, secure, and resilient applications and data pipelines.
Support regulatory audits by providing architectural guidance and documentation as needed.
Contribute to enterprise architecture initiatives, domain reviews, and solution architecture.
Foster innovation by exploring new tools, frameworks, and design methodologies.
Qualifications:
Preferably a BS or MS degree in computer science, computer engineering, or other technical discipline.
6+ years of software engineering experience with strong proficiency in Java and Node.js.
Experience with Python and workflow orchestration tools like Apache Airflow is highly desirable.
Proven experience in designing and implementing distributed systems and APIs.
Familiarity with cloud platforms (e.g., GCP, AWS) and modern CI/CD pipelines.
Ability to write clear architectural documentation and present ideas concisely.
Demonstrated success working collaboratively in a cross-functional, matrixed environment.
Passion for innovation, problem-solving, and driving technology modernization.
Experience with microservices architectures and event-driven architecture is preferred.
We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 1 week ago
6.0 - 11.0 years
10 - 12 Lacs
Bengaluru
Work from Office
General Summary:
As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Hardware Engineer, you will plan, design, optimize, verify, and test electronic systems, bring-up yield, circuits, mechanical systems, Digital/Analog/RF/optical systems, equipment and packaging, test systems, FPGA, and/or DSP systems that launch cutting-edge, world class products. Qualcomm Hardware Engineers collaborate with cross-functional teams to develop solutions and meet performance requirements.
Minimum Qualifications:
Bachelor's degree in Computer Science, Electrical/Electronics Engineering, Engineering, or related field and 6+ years of Hardware Engineering or related work experience. OR Master's degree in Computer Science, Electrical/Electronics Engineering, Engineering, or related field and 5+ years of Hardware Engineering or related work experience. OR PhD in Computer Science, Electrical/Electronics Engineering, Engineering, or related field and 4+ years of Hardware Engineering or related work experience.
This is an exciting opportunity to work on Digital Flows/Methodologies architecture and development in an energetic multi-site CAD team at Qualcomm. Our team supports the Simulation, Emulation, Formal Verification and Post-Silicon domains, providing ample opportunities to grow and contribute.
Responsibilities:
As a Design Automation Engineer, you will work with RTL, architecture, design, DV, software, and silicon verification users.
Develop, maintain, debug and test CPU design methodologies using commercial EDA tools.
Define and create flows/scripts to help design teams execute front-end (RTL) flows seamlessly.
Create unit, integration, regression, and/or system-level tests to thoroughly validate new features or changes.
Work closely with Eng IT teams to set up flows which work well with the engineering compute infrastructure at multiple datacenters.
Work closely with design teams to define methodologies, drive flow development, and deploy vendor tools.
Interface with external vendors to define, drive and incorporate the latest design solutions to improve productivity and time to market.
Support design engineers on flow setup and resolve their queries; automate tasks through appropriate tools and scripting.
Review and debug code to identify and fix code problems.
Qualifications:
Proficient with Python development and strong working knowledge of Linux operating systems.
Must have worked on digital flows/methodologies development in the DV domains.
Should have proficient skills with one of the DV-related tools: Xcelium/VCS/vManager/Indago/Verdi or equivalent.
Experience with CI/CD platforms (like Airflow and Jenkins) and version control systems (like Perforce and/or Git).
MS/BS in Electrical/Computer Engineering with 8-14 years of demonstrated experience in CAD or EDA tool-flow architecture, development, and support.
Demonstrated experience with various EDA software, flows, and architectures, and with driving EDA vendors to provide feature enhancements and bugfixes.
Ability to document design methodologies and provide training on tools and workflows to design teams.
Strong skills in debugging and analysis techniques to understand existing scripts/flows; ability to work independently and explore new domains.
Prior experience debugging vendor tool problems.
Strong written and verbal communication skills and a track record of success in a collaborative team environment.
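Purely as a hedged illustration of the regression automation such CAD roles involve, here is a sketch that fans simulation jobs out across worker threads and tallies pass/fail results. The `run_sim` command and test names are hypothetical placeholders, not a real EDA tool interface.

```python
# Minimal sketch: run regression tests in parallel and report pass/fail.
import subprocess
from concurrent.futures import ThreadPoolExecutor

TESTS = ["smoke_basic", "cache_coherency", "interrupt_storm"]  # hypothetical

def run_test(name: str) -> tuple[str, bool]:
    # Each test is an independent simulator invocation (hypothetical CLI).
    result = subprocess.run(["run_sim", "--test", name],
                            capture_output=True, text=True)
    return name, result.returncode == 0

with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_test, TESTS))

for name, passed in results:
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
print(f"{sum(p for _, p in results)}/{len(results)} tests passed")
```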
Posted 1 week ago
The Airflow job market in India is growing rapidly as more companies adopt data pipelines and workflow automation. Airflow, an open-source platform, is widely used for orchestrating complex computational workflows and data processing pipelines. Job seekers with Airflow expertise can find lucrative opportunities across industries such as technology, e-commerce, finance, and more.
The average salary range for Airflow professionals in India varies by experience level:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum
In the field of Airflow, a typical career path may progress as follows:
- Junior Airflow Developer
- Airflow Developer
- Senior Airflow Developer
- Airflow Tech Lead
In addition to Airflow expertise, professionals in this field are often expected to have or develop skills in:
- Python programming
- ETL concepts
- Database management (SQL)
- Cloud platforms (AWS, GCP)
- Data warehousing
As you explore job opportunities in the Airflow domain in India, remember to showcase your expertise, skills, and experience confidently during interviews. Prepare well, stay updated with the latest trends in Airflow, and demonstrate your problem-solving abilities to stand out in the competitive job market. Good luck!