0 years
0 Lacs
Kochi, Kerala, India
On-site
Join a leading IT solutions provider in India, specializing in data integration and analytics. We are looking for an experienced Informatica BDM Developer to enhance our team and help deliver top-notch services to our clients.

Role & Responsibilities
- Design, develop, and implement Informatica BDM solutions for enhanced data integration.
- Collaborate with cross-functional teams to gather and analyze business requirements.
- Optimize ETL processes for performance improvements and efficiency.
- Maintain quality assurance and troubleshooting of existing Informatica jobs.
- Develop technical documentation and provide support for business users.
- Stay updated with the latest trends in data integration technologies.

Skills & Qualifications
Must-Have:
- Proven experience in Informatica BDM development.
- Strong knowledge of SQL and relational databases.
- Experience with Unix/Linux environments.
- Familiarity with data warehousing concepts.
- Excellent analytical and problem-solving skills.
Preferred:
- Experience with cloud-based data integration platforms.
- Knowledge of other ETL tools.
- Ability to work in an agile environment.
- Strong communication and teamwork skills.

Benefits & Culture Highlights
- Dynamic and inclusive work environment fostering innovation.
- Opportunities for professional growth and development.
- Comprehensive benefits package including health insurance.

Skills: bdm, informatica bdm, problem solving, cloud-based data integration, informatica, data integration, teamwork, sql proficiency, data warehousing, problem-solving, idmc, team collaboration, etl, analytical skills, relational databases, etl tools, performance tuning, unix/linux, sql, communication
Posted 1 day ago
5.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
On-site
Job Description
This is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

Summary
Database Engineer/Developer - Core Skills: Proficiency in SQL and relational database management systems like PostgreSQL or MySQL, along with database design principles. Strong familiarity with Python for scripting and data manipulation tasks, with additional knowledge of Python OOP being advantageous. A good understanding of data security measures and compliance is also required. Demonstrated problem-solving skills with a focus on optimizing database performance and automating data import processes, and knowledge of cloud-based databases like AWS RDS and Google BigQuery. Minimum 5 years of experience.

Database Engineer - Data Research Engineering

Position Overview
At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Its responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. The team plays a crucial role in enabling data-driven decision-making and meeting the organization's data needs.

A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role.

Responsibilities
- Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
- Work with databases of varying scales, including small-scale databases and databases involving big data processing.
- Work on data security and compliance by implementing access controls, encryption, and compliance standards.
- Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
- Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery.
- Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency.
- Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
- Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
- Monitor database health and identify and resolve issues.
- Collaborate with the full-stack web developer on the team to support the implementation of efficient data access and retrieval mechanisms.
- Implement data security measures to protect sensitive information and comply with relevant regulations.
- Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.
- Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis.
- Use Python for tasks such as data manipulation, automation, and scripting.
- Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.
- Assume accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Perform tasks with precision and build reliable systems.
- Leverage online resources effectively (e.g., StackOverflow, ChatGPT, Bard), while considering their capabilities and limitations.

Skills and Experience
- Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
- Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
- Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
- Eagerness to develop import workflows and scripts to automate data import processes.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfortable with autonomy and ability to work independently.
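As a rough illustration of the import workflows this listing describes, the sketch below loads a spreadsheet export into PostgreSQL with pandas and SQLAlchemy. The file path, connection string, table name, and the `record_id` column are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of an automated import workflow: load a spreadsheet export
# into PostgreSQL with pandas and SQLAlchemy, with basic cleansing first.
import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical connection string and target schema.
engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/research")

def import_spreadsheet(csv_path: str, table: str) -> int:
    df = pd.read_csv(csv_path)

    # Basic validation/cleansing before load: drop fully empty rows,
    # normalize column names, and require a non-null key column.
    df = df.dropna(how="all")
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df[df["record_id"].notna()]

    # Replace the staging table; constraints and referential integrity
    # would be enforced downstream in the database itself.
    df.to_sql(table, engine, if_exists="replace", index=False, method="multi")

    with engine.connect() as conn:
        count = conn.execute(text(f"SELECT COUNT(*) FROM {table}")).scalar()
    return count

if __name__ == "__main__":
    print(import_spreadsheet("exports/products.csv", "staging_products"))
```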
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Join a leading IT solutions provider in India, specializing in data integration and analytics. We are looking for an experienced Informatica BDM Developer to enhance our team and help deliver top-notch services to our clients.

Role & Responsibilities
- Design, develop, and implement Informatica BDM solutions for enhanced data integration.
- Collaborate with cross-functional teams to gather and analyze business requirements.
- Optimize ETL processes for performance improvements and efficiency.
- Maintain quality assurance and troubleshooting of existing Informatica jobs.
- Develop technical documentation and provide support for business users.
- Stay updated with the latest trends in data integration technologies.

Skills & Qualifications
Must-Have:
- Proven experience in Informatica BDM development.
- Strong knowledge of SQL and relational databases.
- Experience with Unix/Linux environments.
- Familiarity with data warehousing concepts.
- Excellent analytical and problem-solving skills.
Preferred:
- Experience with cloud-based data integration platforms.
- Knowledge of other ETL tools.
- Ability to work in an agile environment.
- Strong communication and teamwork skills.

Benefits & Culture Highlights
- Dynamic and inclusive work environment fostering innovation.
- Opportunities for professional growth and development.
- Comprehensive benefits package including health insurance.

Skills: bdm, informatica bdm, problem solving, cloud-based data integration, informatica, data integration, teamwork, sql proficiency, data warehousing, problem-solving, idmc, team collaboration, etl, analytical skills, relational databases, etl tools, performance tuning, unix/linux, sql, communication
Posted 1 day ago
10.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
Company Description
Wiser Solutions is a suite of in-store and eCommerce intelligence and execution tools. We're on a mission to enable brands, retailers, and retail channel partners to gather intelligence and automate actions to optimize in-store and online pricing, marketing, and operations initiatives. Our Commerce Execution Suite is available globally.

Job Description
When looking to buy a product, whether it is in a brick and mortar store or online, it can be hard enough to find one that not only has the characteristics you are looking for but is also at a price that you are willing to pay. It can also be especially frustrating when you finally find one, but it is out of stock. Likewise, brands and retailers can have a difficult time getting the visibility they need to ensure you have as seamless an experience as possible in selecting their product. We at Wiser believe that shoppers should have this seamless experience, and we want to do that by providing the brands and retailers the visibility they need to make that belief a reality.

Our goal is to solve a messy problem elegantly and cost effectively. Our job is to collect, categorize, and analyze lots of structured and semi-structured data from lots of different places every day (whether it's 20 million+ products from 500+ websites or data collected from over 300,000 brick and mortar stores across the country). We help our customers be more competitive by discovering interesting patterns in this data they can use to their advantage, while being uniquely positioned to be able to do this across both online and in-store.

We are looking for a lead-level software engineer to lead the charge on a team of like-minded individuals responsible for developing the data architecture that powers our data collection process and analytics platform. If you have a passion for optimization, scaling, and integration challenges, this may be the role for you.

What You Will Do
- Think like our customers – you will work with product and engineering leaders to define data solutions that support customers' business practices.
- Design/develop/extend our data pipeline services and architecture to implement your solutions – you will be collaborating on some of the most important and complex parts of our system that form the foundation for the business value our organization provides.
- Foster team growth – provide mentorship to junior team members and evangelize expertise to those on other teams.
- Improve the quality of our solutions – help to build enduring trust within our organization and amongst our customers by ensuring high quality standards for the data we manage.
- Own your work – you will take responsibility to shepherd your projects from idea through delivery into production.
- Bring new ideas to the table – some of our best innovations originate within the team.

Technologies We Use
- Languages: SQL, Python
- Infrastructure: AWS, Docker, Kubernetes, Apache Airflow, Apache Spark, Apache Kafka, Terraform
- Databases: Snowflake, Trino/Starburst, Redshift, MongoDB, Postgres, MySQL
- Others: Tableau (as a business intelligence solution)

Qualifications
- Bachelor's/Master's degree in Computer Science or relevant technical degree
- 10+ years of professional software engineering experience
- Strong proficiency with data languages such as Python and SQL
- Strong proficiency working with data processing technologies such as Spark, Flink, and Airflow
- Strong proficiency working with RDBMS/NoSQL/Big Data solutions (Postgres, MongoDB, Snowflake, etc.)
- Solid understanding of streaming solutions such as Kafka, Pulsar, Kinesis/Firehose, etc.
- Hands-on experience with Docker, Kubernetes, infrastructure as code using Terraform, and Kubernetes package management with Helm charts
- Solid understanding of ETL/ELT and OLTP/OLAP concepts
- Solid understanding of columnar/row-oriented data formats (e.g., Parquet, ORC, Avro)
- Solid understanding of Apache Iceberg or other open table formats
- Proven ability to transform raw unstructured/semi-structured data into structured data in accordance with business requirements
- Solid understanding of AWS, Linux, and infrastructure concepts
- Proven ability to diagnose and address data abnormalities in systems
- Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs
- Experience building data warehouses using conformed dimensional models
- Experience building data lakes and/or leveraging data lake solutions (e.g., Trino, Dremio, Druid)
- Experience working with business intelligence solutions (e.g., Tableau)
- Experience working with ML/agentic AI pipelines (e.g., LangChain, LlamaIndex)
- Understanding of Domain Driven Design concepts and the accompanying Microservice Architecture
- Passion for data, analytics, or machine learning
- Focus on value: shipping software that matters to the company and the customer

Bonus Points
- Experience working with vector databases
- Experience working within a retail or ecommerce environment
- Proficiency in other programming languages such as Scala, Java, Golang, etc.
- Experience working with Apache Arrow and/or other in-memory columnar data technologies

Supervisory Responsibility
- Provide mentorship to team members on adopted patterns and best practices.
- Organize and lead agile ceremonies such as daily stand-ups, planning, etc.

Additional Information
EEO STATEMENT
Wiser Solutions, Inc. is an Equal Opportunity Employer and prohibits Discrimination, Harassment, and Retaliation of any kind. Wiser Solutions, Inc. is committed to the principle of equal employment opportunity for all employees and applicants, providing a work environment free of discrimination, harassment, and retaliation. All employment decisions at Wiser Solutions, Inc. are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, national origin, family or parental status, disability, genetics, age, sexual orientation, veteran status, or any other status protected by state, federal, or local law. Wiser Solutions, Inc. will not tolerate discrimination, harassment, or retaliation based on any of these characteristics.
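For a sense of the pipeline work this role describes, here is a minimal PySpark sketch that flattens semi-structured product JSON into a partitioned Parquet dataset. The bucket paths, column names, and schema are illustrative assumptions, not details from the posting.

```python
# Minimal sketch of a batch pipeline step: flatten semi-structured product
# JSON into a structured, partitioned Parquet dataset.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("product-normalize").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/products/2025-06-18/")

structured = (
    raw
    .select(
        F.col("product_id").cast("string"),
        F.col("retailer"),
        F.col("price.amount").cast("double").alias("price"),
        F.col("price.currency").alias("currency"),
        F.to_date("collected_at").alias("collected_date"),
    )
    .dropDuplicates(["product_id", "retailer", "collected_date"])
    .filter(F.col("price").isNotNull())
)

# Partition by collection date so downstream OLAP queries prune efficiently.
(structured.write
    .mode("overwrite")
    .partitionBy("collected_date")
    .parquet("s3://example-bucket/curated/products/"))
```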
Posted 1 day ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join a leading IT solutions provider in India, specializing in data integration and analytics. We are looking for an experienced Informatica BDM Developer to enhance our team and help deliver top-notch services to our clients.

Role & Responsibilities
- Design, develop, and implement Informatica BDM solutions for enhanced data integration.
- Collaborate with cross-functional teams to gather and analyze business requirements.
- Optimize ETL processes for performance improvements and efficiency.
- Maintain quality assurance and troubleshooting of existing Informatica jobs.
- Develop technical documentation and provide support for business users.
- Stay updated with the latest trends in data integration technologies.

Skills & Qualifications
Must-Have:
- Proven experience in Informatica BDM development.
- Strong knowledge of SQL and relational databases.
- Experience with Unix/Linux environments.
- Familiarity with data warehousing concepts.
- Excellent analytical and problem-solving skills.
Preferred:
- Experience with cloud-based data integration platforms.
- Knowledge of other ETL tools.
- Ability to work in an agile environment.
- Strong communication and teamwork skills.

Benefits & Culture Highlights
- Dynamic and inclusive work environment fostering innovation.
- Opportunities for professional growth and development.
- Comprehensive benefits package including health insurance.

Skills: bdm, informatica bdm, problem solving, cloud-based data integration, informatica, data integration, teamwork, sql proficiency, data warehousing, problem-solving, idmc, team collaboration, etl, analytical skills, relational databases, etl tools, performance tuning, unix/linux, sql, communication
Posted 1 day ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Azure Data Engineer with Databricks
Experience: 5 – 10 years
Job Level: Senior Engineer / Lead / Architect
Notice Period: Immediate Joiner

Role Overview
Join our dynamic team at Team Geek Solutions, where we specialize in innovative data solutions and cutting-edge technology implementations to empower businesses across various sectors. We are looking for a skilled Azure Data Engineer with expertise in Databricks to join our high-performing data and AI team for a critical client engagement. The ideal candidate will have strong hands-on experience in building scalable data pipelines, data transformation, and real-time data processing using Azure Data Services and Databricks.

Key Responsibilities
- Design, develop, and deploy end-to-end data pipelines using Azure Databricks, Azure Data Factory, and Azure Synapse Analytics.
- Perform data ingestion, data wrangling, and ETL/ELT processes from various structured and unstructured data sources (e.g., APIs, on-prem databases, flat files).
- Optimize and tune Spark-based jobs and Databricks notebooks for performance and scalability.
- Implement best practices for CI/CD, code versioning, and testing in a Databricks environment using DevOps pipelines.
- Design data lake and data warehouse solutions using Delta Lake and Synapse Analytics.
- Ensure data security, governance, and compliance using Azure-native tools (e.g., Azure Purview, Key Vault, RBAC).
- Collaborate with data scientists to enable feature engineering and model training within Databricks.
- Write efficient SQL and PySpark code for data transformation and analytics.
- Monitor and maintain existing data pipelines and troubleshoot issues in a production environment.
- Document technical solutions, architecture diagrams, and data lineage as part of delivery.

Mandatory Skills & Technologies
- Azure Cloud Services: Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Data Lake Storage (Gen2), Azure Key Vault, Azure Functions, Azure Monitor
- Databricks Platform: Delta Lake, Databricks Notebooks, Job Clusters, MLflow (optional), Unity Catalog
- Programming Languages: PySpark, SQL, Python
- Data Pipelines: ETL/ELT pipeline design and orchestration
- Version Control & DevOps: Git, Azure DevOps, CI/CD pipelines
- Data Modeling: Star/Snowflake schema, dimensional modeling
- Performance Tuning: Spark job optimization, data partitioning strategies
- Data Governance & Security: Azure Purview, RBAC, data masking

Nice to Have
- Experience with Kafka, Event Hub, or other real-time streaming platforms
- Exposure to Power BI or other visualization tools
- Knowledge of Terraform or ARM templates for infrastructure as code
- Experience in MLOps and integration with MLflow for model lifecycle management

Certifications (Good to Have)
- Microsoft Certified: Azure Data Engineer Associate
- Databricks Certified Data Engineer Associate / Professional
- DP-203: Data Engineering on Microsoft Azure

Soft Skills
- Strong communication and client interaction skills
- Analytical thinking and problem-solving
- Agile mindset with familiarity in Scrum/Kanban
- Team player with mentoring ability for junior engineers

Skills: data partitioning strategies, azure functions, data analytics, unity catalog, rbac, databricks, elt, devops, azure data factory, delta lake, data factory, spark job optimization, job clusters, azure devops, etl/elt pipeline design and orchestration, data masking, azure key vault, azure databricks, azure data engineer, azure synapse, star/snowflake schema, azure data lake storage (gen2), git, sql, etl, snowflake, azure, python, azure cloud services, azure purview, pyspark, mlflow, ci/cd pipelines, dimensional modeling, sql server, big data technologies, azure monitor, azure synapse analytics, databricks notebooks
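As a hedged illustration of the incremental-load pattern such a pipeline typically uses, the sketch below merges newly landed files into a Delta table with Delta Lake's MERGE API. It assumes a Databricks/Delta-enabled Spark session; the storage path, table name, and `order_id` key are hypothetical.

```python
# Minimal sketch of an incremental load: merge new ADF-landed files into
# a Delta table (upsert on order_id). Requires a Delta-enabled Spark session.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders-merge").getOrCreate()

updates = (
    spark.read.format("parquet")
    .load("abfss://landing@examplelake.dfs.core.windows.net/orders/2025-06-18/")
)

target = DeltaTable.forName(spark, "analytics.orders")

# Upsert: update matching order_ids, insert new ones (Delta Lake MERGE).
(target.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())

# Partitioning and OPTIMIZE/Z-ORDER would be handled separately as part of
# the performance-tuning responsibilities mentioned above.
```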
Posted 1 day ago
6.0 years
0 Lacs
India
Remote
Job Title: Senior Data Engineer
Location: Remote
Experience: 6+ years

Job Summary:
We are seeking a highly skilled Senior Data Engineer with deep expertise in C#, Azure Data Factory (ADF), Databricks, SQL Server, and Python. The ideal candidate will have a strong understanding of modern CI/CD practices and experience in designing, developing, and optimizing complex data pipelines.

Key Responsibilities:
- Design, develop, and maintain robust, scalable, and efficient data pipelines using Azure Data Factory, Databricks, and SQL Server.
- Write clean, scalable, and efficient code in C# and Python.
- Build and manage ETL/ELT processes and ensure data integrity and quality.
- Optimize SQL queries and database performance.
- Implement best practices in data engineering, including CI/CD pipelines and version control.
- Work closely with data scientists, analysts, and business stakeholders to understand data needs.
- Troubleshoot and resolve issues related to data processing and performance.
- Document technical solutions and processes clearly and concisely.

Required Skills & Experience:
- 6+ years of experience in Data Engineering.
- Proficiency in C# and Python for data processing and automation.
- Strong hands-on experience with Azure Data Factory and Azure Databricks.
- In-depth experience with SQL Server and writing optimized SQL queries.
- Solid understanding of CI/CD practices and tools (Azure DevOps, Git, etc.).
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.

Nice to Have:
- Experience with Delta Lake, Azure Synapse, or Power BI.
- Knowledge of big data concepts and tools.
- Familiarity with data governance, security, and compliance standards.
Posted 1 day ago
5.0 years
0 Lacs
India
Remote
Job Title: Senior Data Engineer
Experience: 5+ Years
Location: Remote
Contract Duration: Short Term
Work Time: IST Shift

Job Description
We are seeking a skilled and experienced Senior Data Engineer to develop scalable and optimized data pipelines using the Databricks Lakehouse platform. The role requires proficiency in Apache Spark, PySpark, cloud data services (AWS, Azure, GCP), and solid programming knowledge in Python and Java. The engineer will collaborate with cross-functional teams to design and deliver high-performing data solutions.

Responsibilities
Data Pipeline Development
- Build efficient ETL/ELT workflows using Databricks and Spark for batch and streaming data
- Utilize Delta Lake and Unity Catalog for structured data management
- Optimize Spark jobs using tuning techniques such as caching, partitioning, and serialization

Cloud-Based Implementation
- Develop and deploy data workflows on AWS (S3, EMR, Glue), Azure (ADLS, ADF, Synapse), and/or GCP (GCS, Dataflow, BigQuery)
- Manage and optimize data storage, access control, and orchestration using native cloud tools
- Implement data ingestion and querying with Databricks Auto Loader and SQL Warehousing

Programming and Automation
- Write clean, reusable, and production-grade code in Python and Java
- Automate workflows using orchestration tools like Airflow, ADF, or Cloud Composer
- Implement testing, logging, and monitoring mechanisms

Collaboration and Support
- Work closely with data analysts, scientists, and business teams to meet data requirements
- Support and troubleshoot production workflows
- Document solutions, maintain version control, and follow Agile/Scrum methodologies

Required Skills
Technical Skills
- Databricks: Experience with notebooks, cluster management, Delta Lake, Unity Catalog, and job orchestration
- Spark: Proficient in transformations, joins, window functions, and tuning
- Programming: Strong in PySpark and Java, with data validation and error handling expertise
- Cloud: Experience with AWS, Azure, or GCP data services and security frameworks
- Tools: Familiarity with Git, CI/CD, Docker (preferred), and data monitoring tools

Experience
- 5–8 years in data engineering or backend development
- Minimum 1–2 years of hands-on experience with Databricks and Spark
- Experience with large-scale data migration, processing, or analytics projects

Certifications (Optional but Preferred)
- Databricks Certified Data Engineer Associate

Working Conditions
- Full-time remote work with availability during IST hours
- Occasional on-site presence may be required during client visits
- No regular travel required
- On-call support expected during deployment phases
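The Auto Loader ingestion mentioned above might look roughly like the sketch below, which incrementally loads new JSON files from cloud storage into a Delta table. It assumes a Databricks session; the bucket paths and table name are illustrative.

```python
# Minimal sketch of Databricks Auto Loader: incrementally ingest new JSON
# files into a Delta table with checkpointed, restartable streaming.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("autoloader-ingest").getOrCreate()

events = (
    spark.readStream.format("cloudFiles")          # Databricks Auto Loader
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "s3://example-bucket/_schemas/events/")
    .load("s3://example-bucket/landing/events/")
)

# Write as a streaming Delta table; availableNow processes the current
# backlog like a batch job and then stops.
(events.writeStream
    .format("delta")
    .option("checkpointLocation", "s3://example-bucket/_checkpoints/events/")
    .trigger(availableNow=True)
    .toTable("lakehouse.bronze_events"))
```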
Posted 1 day ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Engineer – Databricks, Delta Live Tables, Data Pipelines
Location: Bhopal / Hyderabad / Pune (On-site)
Experience Required: 5+ Years
Employment Type: Full-Time

Job Summary:
We are seeking a skilled and experienced Data Engineer with a strong background in designing and building data pipelines using Databricks and Delta Live Tables. The ideal candidate should have hands-on experience in managing large-scale data engineering workloads and building scalable, reliable data solutions in cloud environments.

Key Responsibilities:
- Design, develop, and manage scalable and efficient data pipelines using Databricks and Delta Live Tables.
- Work with structured and unstructured data to enable analytics and reporting use cases.
- Implement data ingestion, transformation, and cleansing processes.
- Collaborate with Data Architects, Analysts, and Data Scientists to ensure data quality and integrity.
- Monitor data pipelines and troubleshoot issues to ensure high availability and performance.
- Optimize queries and data flows to reduce costs and increase efficiency.
- Ensure best practices in data security, governance, and compliance.
- Document architecture, processes, and standards.

Required Skills:
- Minimum 5 years of hands-on experience in data engineering.
- Proficient in Apache Spark, Databricks, Delta Lake, and Delta Live Tables.
- Strong programming skills in Python or Scala.
- Experience with cloud platforms such as Azure, AWS, or GCP.
- Proficient in SQL for data manipulation and analysis.
- Experience with ETL/ELT pipelines, data wrangling, and workflow orchestration tools (e.g., Airflow, ADF).
- Understanding of data warehousing, big data ecosystems, and data modeling concepts.
- Familiarity with CI/CD processes in a data engineering context.

Nice to Have:
- Experience with real-time data processing using tools like Kafka or Kinesis.
- Familiarity with machine learning model deployment in data pipelines.
- Experience working in an Agile environment.
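A Delta Live Tables pipeline of the kind this posting describes could be sketched as below: a bronze ingestion table plus a cleansed silver table with a data-quality expectation. This only runs inside a Databricks DLT pipeline (where `spark` is predefined), and the paths, table names, and columns are assumptions for illustration.

```python
# Minimal Delta Live Tables sketch: bronze ingestion + cleansed silver table.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders landed from cloud storage")
def bronze_orders():
    return (
        spark.readStream.format("cloudFiles")      # spark is provided by DLT
        .option("cloudFiles.format", "json")
        .load("s3://example-bucket/landing/orders/")
    )

@dlt.table(comment="Cleansed orders for analytics")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # data-quality rule
def silver_orders():
    return (
        dlt.read_stream("bronze_orders")
        .withColumn("order_date", F.to_date("order_ts"))
        .dropDuplicates(["order_id"])
    )
```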
Posted 1 day ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
- Experience: 5–8 years of in-depth, hands-on expertise with ETL tools and logic, with a strong preference for IDMC (Informatica Cloud).
- Application Development/Support: Demonstrated success in either application development or support roles.
- Python Proficiency: Strong understanding of Python, with practical coding experience.
- AWS: Comprehensive knowledge of AWS services and their applications.
- Airflow: Creating and managing Airflow DAG scheduling.
- Unix & SQL: Solid command of Unix commands, shell scripting, and writing efficient SQL scripts.
- Analytical & Troubleshooting Skills: Exceptional ability to analyze data and resolve complex issues.
- Development Tasks: Proven capability to execute a variety of development activities with efficiency.
- Insurance Domain Knowledge: Familiarity with the Insurance sector is highly advantageous.
- Production Data Management: Significant experience in managing and processing production data.
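The Airflow DAG scheduling called out above might look roughly like this sketch: a daily DAG (Airflow 2.4+ style) with an extract step followed by a load step. The schedule, script paths, and task commands are illustrative assumptions, not details from the posting.

```python
# Minimal Airflow DAG sketch: daily extract followed by a SQL load step.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-eng",
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="daily_policy_load",
    start_date=datetime(2025, 6, 1),
    schedule="0 2 * * *",          # run at 02:00 every day
    catchup=False,
    default_args=default_args,
) as dag:
    extract = BashOperator(
        task_id="extract_from_source",
        bash_command="python /opt/etl/extract_policies.py --date {{ ds }}",
    )
    load = BashOperator(
        task_id="load_to_warehouse",
        bash_command="bash /opt/etl/run_sql.sh load_policies.sql {{ ds }}",
    )

    extract >> load   # load runs only after extraction succeeds
```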
Posted 1 day ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Requisition Description

Responsibilities
- Develop and execute a data engineering roadmap that aligns with company strategy.
- Provide strategic and operational leadership within the data domain and across the engineering leadership team.
- Build and manage a team of data engineers to design and deliver data pipeline solutions.
- Work closely with the BA team and business to deliver on major data initiatives.
- Leverage technical knowledge to improve the effectiveness of data pipelines and architectures.
- Design and develop data pipelines for structured, semi-structured, and unstructured data sources.
- Oversee the movement of large amounts of data into the data lake and manage data integration with multiple systems.

Required Skills And Experience
- Experience with large-scale data engineering pipelines and data visualization tools.
- Knowledge of CI/CD, data architectures, pipelines, quality, and code management.
- Experience in data science, including predictive modeling and machine learning models.
- Familiarity with SQL and NoSQL databases.
- Proven track record of designing and developing data lake, data warehouse, ETL, and task orchestrating systems.
- Strong leadership, communication, time management, and interpersonal skills.
Posted 1 day ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role: SDE 2 - Data
Website: www.trademo.com
Location: Onsite - Gurgaon

What will you be doing here?
- Responsible for the maintenance and growth of a 50TB+ data pipeline serving global SaaS products for businesses, including onboarding new data and collaborating with pre-sales to articulate technical solutions.
- Solve complex problems across large datasets by applying algorithms, particularly within the domains of Natural Language Processing (NLP) and Large Language Models (LLM).
- Leverage bleeding-edge technology to work with large volumes of complex data.
- Be hands-on in development - Python, Pandas, NumPy, ETL frameworks.
- Preferred exposure to distributed computing frameworks like Apache Spark, Kafka, Airflow.
- Along with individual data engineering contributions, actively help peers and junior team members on architecture and code to ensure the development of scalable, accurate, and highly available solutions.
- Collaborate with teams, share knowledge via tech talks, and promote tech and engineering best practices within the team.

Requirements
- B-Tech/M-Tech in Computer Science from IIT or equivalent Tier 1 colleges.
- 2+ years of relevant work experience in data engineering or related roles.
- Proven ability to efficiently work with a high variety and volume of data (50TB+ pipeline experience is a plus).
- Solid understanding of, and preferred exposure to, NoSQL databases, including Elasticsearch, MongoDB, and GraphDB.
- Basic understanding of working within cloud infrastructure and cloud-native apps (AWS, Azure, IBM, etc.).
- Exposure to core data engineering concepts and tools: data warehousing, ETL processes, SQL, and NoSQL databases.
- Great problem-solving ability over larger sets of data and the ability to apply algorithms, with experience using NLP and LLM a plus.
- Willingness to learn and apply new techniques and technologies to extract intelligence from data, with prior exposure to Machine Learning and NLP being a significant advantage.
- Sound understanding of Algorithms and Data Structures.
- Ability to write well-crafted, readable, testable, maintainable, and modular code.

Desired Profile:
- A hard-working, humble disposition.
- Desire to make a strong impact on the lives of millions through your work.
- Capacity to communicate well with stakeholders as well as team members and be an effective interface between the Engineering and Product/Business teams.
- A quick thinker who can adapt to a fast-paced startup environment and work with minimum supervision.
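As a loose illustration of the Pandas/NumPy ETL work mentioned, the sketch below processes a large CSV export in chunks so it never has to fit in memory at once. The file names, columns, and cleaning rules are hypothetical, not taken from the posting.

```python
# Minimal sketch of chunked Pandas ETL: clean a large export in pieces and
# append the cleaned rows to an output file.
import numpy as np
import pandas as pd

def normalize_chunk(chunk: pd.DataFrame) -> pd.DataFrame:
    chunk["hs_code"] = chunk["hs_code"].astype(str).str.strip()
    chunk["value_usd"] = pd.to_numeric(chunk["value_usd"], errors="coerce")
    # Log-scale a heavy-tailed column for downstream modeling.
    chunk["log_value"] = np.log1p(chunk["value_usd"].clip(lower=0))
    return chunk.dropna(subset=["hs_code", "value_usd"])

def run(input_csv: str, output_csv: str, chunksize: int = 500_000) -> None:
    first = True
    for chunk in pd.read_csv(input_csv, chunksize=chunksize):
        normalize_chunk(chunk).to_csv(
            output_csv, mode="w" if first else "a", header=first, index=False
        )
        first = False

if __name__ == "__main__":
    run("shipments_export.csv", "shipments_clean.csv")
```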
Posted 1 day ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
- Be able to align data models with business goals and enterprise architecture
- Collaborate with Data Architects, Engineers, Business Analysts, and Leadership teams
- Lead data modelling, governance discussions and decision-making across cross-functional teams
- Proactively identify data inconsistencies, integrity issues, and optimization opportunities
- Design scalable and future-proof data models
- Define and enforce enterprise data modelling standards and best practices
- Experience working in Agile environments (Scrum, Kanban)
- Identify impacted applications, size capabilities, and create new capabilities
- Lead complex initiatives with multiple cross-application impacts, ensuring seamless integration
- Drive innovation, optimize processes, and deliver high-quality architecture solutions
- Understand business objectives, review business scenarios, and plan acceptance criteria for proposed solution architecture
- Discuss capabilities with individual applications, resolve dependencies and conflicts, and reach agreements on proposed high-level approaches and solutions
- Participate in Architecture Review, present solutions, and review other solutions
- Work with Enterprise Architects to learn and adopt standards and best practices
- Design solutions adhering to applicable rules and compliances
- Stay updated with the latest technology trends to solve business problems with minimal change or impact
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Undergraduate degree or equivalent experience
- 8+ years of proven experience in a similar role, leading and mentoring a team of architects and technical leads
- Extensive experience with Relational, Dimensional, and NoSQL data modelling
- Experience in driving innovation, optimizing processes, and delivering high-quality solutions
- Experience in large-scale OLAP, OLTP, and hybrid data processing systems
- Experience in complex initiatives with multiple cross-application impacts
- Expert in Erwin for Conceptual, Logical, and Physical data modelling
- Expertise in relational databases, SQL, indexing and partitioning for databases like Teradata, Snowflake, Azure Synapse or traditional RDBMS
- Expertise in ETL/ELT architecture, data pipelines, and integration strategies
- Expertise in data normalization, denormalization and performance optimization
- Exposure to cloud platforms, tools, and AI-based solutions
- Solid knowledge of 3NF, Star schema, Snowflake schema, and Data Vault
- Exposure to Java, Python, Spring, Spring Boot framework, SQL, MongoDB, Kafka, React JS, Dynatrace, and Power BI
- Knowledge of Azure Platform as a Service (PaaS) offerings (Azure Functions, App Service, Event Grid)
- Good knowledge of the latest happenings in the technology world
- Advanced SQL skills for complex queries, stored procedures, indexing, partitioning, macros, recursive queries, query tuning and OLAP functions
- Understanding of Data Privacy Regulations, Master Data Management, and Data Quality
- Proven excellent communication and leadership skills
- Proven ability to think from a long-term perspective and arrive at intentional and strategic architecture
- Proven ability to provide consistent solutions across Lines of Business (LOB)

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Posted 1 day ago
5.0 years
0 Lacs
India
On-site
Responsibilities:
- Create, manage, and optimize data pipelines for ingesting, processing, and transforming data using AWS services like AWS Glue, AWS Data Pipeline, and AWS Lambda, Databricks for advanced data processing, and Informatica IDMC for data integration and quality.
- Integrate data from various sources, both internal and external, into AWS and Databricks environments, ensuring data consistency and quality, while leveraging Informatica IDMC for data integration, transformation, and governance.
- Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and enrich data, making it suitable for analytical purposes, using Databricks' Spark capabilities and Informatica IDMC for data transformation and quality.
- Monitor and optimize data processing and query performance in both AWS and Databricks environments, making necessary adjustments to meet performance and scalability requirements. Utilize Informatica IDMC for optimizing data workflows.
- Implement security best practices and data encryption methods to protect sensitive data in both AWS and Databricks, while ensuring compliance with data privacy regulations. Employ Informatica IDMC for data governance and compliance.
- Implement automation for routine tasks, such as data ingestion, transformation, and monitoring, using AWS services like AWS Step Functions and AWS Lambda, Databricks Jobs, and Informatica IDMC for workflow automation.
- Maintain clear and comprehensive documentation of data infrastructure, pipelines, and configurations in both AWS and Databricks environments, with metadata management facilitated by Informatica IDMC.
- Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and deliver appropriate solutions across AWS, Databricks, and Informatica IDMC.
- Identify and resolve data-related issues and provide support to ensure data availability and integrity in AWS, Databricks, and Informatica IDMC environments.
- Optimize AWS, Databricks, and Informatica resource usage to control costs while meeting performance and scalability requirements.
- Stay up to date with AWS, Databricks, and Informatica IDMC services and data engineering best practices to recommend and implement new technologies and techniques.

Requirements:
- Bachelor's or master's degree in computer science, data engineering, or a related field.
- Minimum 5 years of experience in data engineering, with expertise in AWS services, Databricks, and/or Informatica IDMC.
- Proficiency in programming languages such as Python, Java, or Scala for building data pipelines.
- Ability to evaluate potential technical solutions and make recommendations to resolve data issues, especially on performance assessment for complex data transformations and long-running data processes.
- Strong knowledge of SQL and NoSQL databases.
- Familiarity with data modeling and schema design.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- AWS certifications (e.g., AWS Certified Data Analytics - Specialty), Databricks certifications, and Informatica certifications are a plus.
- Experience with big data technologies like Apache Spark and Hadoop on Databricks.
- Knowledge of containerization and orchestration tools like Docker and Kubernetes.
- Familiarity with data visualization tools like Tableau or Power BI.
- Understanding of DevOps principles for managing and deploying data pipelines.
- Experience with version control systems (e.g., Git) and CI/CD pipelines.
- Knowledge of data governance and data cataloguing tools, especially Informatica IDMC.
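An AWS Glue PySpark job of the sort described might be sketched as follows: read a catalogued table, apply a simple cleansing transform, and write partitioned Parquet to S3. The database, table, and bucket names are placeholders rather than details from the posting.

```python
# Minimal AWS Glue PySpark job sketch: catalog read -> transform -> S3 write.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, then drop to a Spark DataFrame to transform.
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="customer_events"
)
df = (
    source.toDF()
    .filter(F.col("event_type").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
)

# Write curated Parquet back to S3, partitioned for efficient querying.
(df.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-curated-bucket/customer_events/"))

job.commit()
```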
Posted 1 day ago
5.0 years
0 Lacs
India
Remote
Location: India – Remote
Duration: 6+ Months (Contract)

Job Description
We are seeking an experienced Data Analyst for a long-term remote opportunity. The ideal candidate should have a proven background either in top-tier consultancy firms (such as Wipro, Accenture, etc.) or in Oil & Gas industry projects. The role involves working with complex datasets to extract meaningful insights and support business decision-making.

Key Responsibilities
- Collect, clean, and analyze large volumes of data from multiple sources.
- Identify trends, patterns, and correlations in large datasets.
- Design and build reports and dashboards using tools such as Power BI, Tableau, or similar.
- Work closely with cross-functional teams including stakeholders, business users, and IT teams.
- Translate business requirements into data models and actionable insights.
- Prepare visualizations and presentations for management and client reporting.
- Ensure data quality, integrity, and governance compliance.

Required Skills & Qualifications
- 5+ years of professional experience as a Data Analyst.
- Hands-on expertise with data visualization tools (e.g., Power BI, Tableau).
- Strong SQL skills and experience with relational databases (e.g., SQL Server, Oracle).
- Proficiency in Excel (including advanced formulas, pivot tables, etc.).
- Solid understanding of statistical methods and data analysis techniques.
- Experience working with consultancy firms (like Wipro, Accenture, TCS, Infosys, etc.) OR experience in Oil & Gas sector projects is mandatory.
- Strong analytical thinking and problem-solving skills.
- Excellent communication and stakeholder management abilities.

Preferred Qualifications
- Experience with Python or R for data analysis.
- Familiarity with data warehousing concepts (e.g., Snowflake, Redshift, BigQuery).
- Knowledge of ETL tools and processes.

Skills: data visualization, tableau, python, data analysis, etl, statistical methods, r, problem-solving, sql, powerbi, communication, excel, stakeholder management, analytical thinking
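As a small, hedged example of the Python-based analysis this role prefers, the sketch below derives a monthly production trend per site with pandas; the CSV name and the site/barrels columns are invented for illustration only.

```python
# Minimal pandas sketch: monthly production trend per site from a CSV extract,
# pivoted into the shape a dashboard or management report would consume.
import pandas as pd

df = pd.read_csv("well_production.csv", parse_dates=["reading_date"])

monthly = (
    df.assign(month=df["reading_date"].dt.to_period("M"))
      .groupby(["site", "month"], as_index=False)["barrels"]
      .sum()
)

trend_table = monthly.pivot(index="month", columns="site", values="barrels")
print(trend_table.tail(6))
```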
Posted 1 day ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Toyota Connected
If you want to change the way the world works, transform the automotive industry and positively impact others on a global scale, then Toyota Connected is the right place for you! Within our collaborative, fast-paced environment we focus on continual improvement and work in a highly iterative way to deliver exceptional value in the form of connected products and services that wow and delight our customers and the world around us.

What you will do
- Design, develop, and maintain automation test frameworks and scripts specifically for API-level testing.
- Validate and verify complex Machine Learning applications, ensuring models and data pipelines perform accurately in production environments.
- Conduct extensive testing on large-scale data engineering pipelines, including ETL processes, data warehousing solutions, and big data processing frameworks.
- Identify, document, and track software defects, proactively working with developers and product teams for resolution.
- Execute load and performance testing on APIs and data pipelines to ensure scalability and efficiency.
- Collaborate closely with data engineers, ML engineers, software developers, and product teams to define comprehensive test strategies.
- Participate actively in agile ceremonies, contributing to sprint planning, reviews, and retrospectives.

You are a successful candidate if you have
- Bachelor's degree in Computer Science, Information Technology, or related fields.
- 3+ years of hands-on experience in automation testing, especially focused on API testing using tools such as Postman, REST Assured, or similar.
- Experience working with automation frameworks like Cucumber/Karate for API automation and Selenium/Cypress for web automation.
- Demonstrable experience testing Machine Learning applications, including model validation, accuracy assessments, and production-readiness.
- Proven expertise in testing large-scale data engineering pipelines involving technologies such as Apache Spark, Hadoop, AWS EMR, Kafka, or similar.
- Strong scripting/programming skills in Python, Java, or JavaScript.
- Familiarity with containerization and CI/CD tools (Docker, Kubernetes, Jenkins, GitLab CI/CD).
- Excellent analytical and problem-solving abilities, with strong attention to detail.
- Effective communication skills and the ability to clearly document test cases, defects, and technical findings.

What is in it for you?
- Top of the line compensation! You'll be treated like the professional we know you are and left to manage your own time and workload.
- Yearly gym membership reimbursement & free catered lunches.
- No dress code! We trust you are responsible enough to choose what's appropriate to wear for the day.
- Opportunity to build products that improve the safety and convenience of millions of customers.
- Cool office space and other awesome benefits!

Our Core Values: EPIC
- Empathetic: We begin making decisions by looking at the world from the perspective of our customers, teammates, and partners.
- Passionate: We are here to build something great, not just for the money. We are always looking to improve the experience of our millions of customers.
- Innovative: We experiment with ideas to get to the best solution. Any constraint is a challenge, and we love looking for creative ways to solve them.
- Collaborative: When it comes to people, we think the whole is greater than its parts and that everyone has a role to play in the success!

To know more about us, check out our Glassdoor page: https://www.glassdoor.co.in/Reviews/TOYOTA-Connected-Corporation-Reviews-E3305334.htm
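The listing names Postman and REST Assured for API-level testing; as a language-consistent illustration of the same idea, here is a minimal pytest + requests sketch. The endpoint, payload, and expected response fields are hypothetical.

```python
# Minimal API-level test sketch with pytest + requests: functional check plus
# a crude latency assertion (real load testing would use a dedicated tool).
import requests

BASE_URL = "https://api.example.com"

def test_create_vehicle_event_returns_201_and_echoes_id():
    payload = {"vehicle_id": "VIN123", "event_type": "door_unlock"}
    resp = requests.post(f"{BASE_URL}/v1/events", json=payload, timeout=10)

    assert resp.status_code == 201
    body = resp.json()
    assert body["vehicle_id"] == payload["vehicle_id"]
    assert "event_id" in body            # server-generated identifier

def test_events_endpoint_meets_basic_latency_budget():
    resp = requests.get(f"{BASE_URL}/v1/events", params={"limit": 10}, timeout=10)
    assert resp.status_code == 200
    assert resp.elapsed.total_seconds() < 1.0
```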
Posted 1 day ago
2.0 years
0 Lacs
Haryana, Haryana
Remote
You Will:
- Development of iOS/Android test automation on mobile test harnesses
- Develop and enhance the existing automation scripts, tools and framework using Java, TestNG and Appium
- Execute automated test plans and regression tests for iOS/Android applications
- Define testing strategies and scope for user stories and technical development tasks
- Provide estimates on testing efforts to Product and Engineering team members
- Maintain and improve the test coverage and ratio of automated tests
- Advocate Automated Testing and CI/CD methodology, review and advise on testing methods and best practices
- Identify, investigate, report, and track defects
- Deliver high-quality features and infrastructure to production
- Continuously learn new tools, technologies, and testing disciplines
- Able to work under minimal supervision and quick to adopt new technologies
- Work collaboratively across multiple teams
- Communicate all concerns and status with the SQA manager in a timely manner

Qualifications
- Bachelor's degree in computer science or a related field, or equivalent work experience
- A track record of improving quality
- 2+ years of test automation experience with expertise in iOS/Android app testing
- Experience with TestNG, Java, Appium, and XCUITest, with strong programming skills in Java
- Experience with Selenium WebDriver
- Experience using Xcode Instruments
- Experience using BrowserStack or similar for app automation
- Expertise in software QA methodologies, tools, and processes
- Expertise in test design, test automation frameworks, and scripting tests
- Experience with MongoDB
- Experience with Git, DevOps CI/CD pipelines
- Good knowledge of data warehouses, data lakes and ETL pipelines (AWS, Athena, Redshift Spectrum, Postgres, SQL, etc.) is a plus
- API automation testing experience using JEST, Mocha, REST Assured or similar frameworks is a plus
- Excellent communication skills, both oral and written, a must
- Experience with Scrum methodologies and remote teams is a plus!

Job Types: Full-time, Permanent
Pay: ₹600,000.00 - ₹1,100,000.00 per year
Schedule: Day shift
Work Location: Hybrid remote in Haryana, Haryana
Posted 1 day ago
0.0 - 2.0 years
0 Lacs
Kochi M.G.Road, Kochi, Kerala
On-site
Data Engineer
Experience: 2-4 Years
Location: Kochi, Kerala (Work From Office)

Key Responsibilities:
- Build and manage data lakes and data warehouses using services like Amazon S3, Redshift, and Athena
- Design and build secure, scalable, and efficient ETL/ELT pipelines on AWS using services like Glue, Lambda, Step Functions
- Work on SAP Datasphere to build and maintain Spaces, Data Builders, Views, and Consumption Layers
- Support data integration between AWS, Datasphere, and various source systems (SAP S/4HANA, non-SAP apps, flat files, etc.)
- Develop and maintain scalable data models and optimize queries for performance
- Monitor and optimize data workflows to ensure reliability, performance, and cost-efficiency
- Collaborate with Data Analysts and BI teams to provide clean, validated, and well-documented datasets
- Monitor, troubleshoot, and enhance data workflows and pipelines
- Ensure data quality, integrity, and governance policies are met

Required Skills
- Strong SQL skills and experience with relational databases like MySQL or SQL Server
- Proficient in Python or Scala for data transformation and scripting
- Familiarity with cloud platforms like AWS (S3, Redshift, Glue), Datasphere, Azure

Good-to-Have Skills:
- AWS Certification – AWS Certified Data Analytics
- Exposure to modern data stack tools like Snowflake
- Experience in cloud-based projects and working in an Agile environment
- Understanding of data governance, security best practices, and compliance standards

Job Types: Full-time, Permanent
Pay: Up to ₹1,500,000.00 per year

Application Question(s): Willing to take up Work from Office mode in Kochi location?

Experience:
- Data Engineer / ETL Developer: 2 years (Required)
- AWS: 2 years (Required)
- SQL and (Python OR Scala): 2 years (Required)
- Datasphere OR "SAP BW" OR "SAP S/4HANA": 2 years (Required)
- AWS (S3, Redshift, Glue), Datasphere, Azure: 2 years (Required)
Posted 1 day ago
5.0 years
0 Lacs
Bina, Madhya Pradesh
On-site
Job Information
- Job Opening ID: OTSI_2214_JOB
- Industry: Government/Military
- Date Opened: 06/18/2025
- Job Type: Full time
- Work Experience: 5+ years
- Required Skills: Python, SQL, +2
- City: Bina
- State/Province: Madhya Pradesh
- Country: India
- Zip/Postal Code: 470113

About Us
OTSI is a leading global technology company offering solutions, consulting, and managed services for businesses worldwide since 1999. OTSI serves clients from its 15 offices across 6 countries around the globe with a "Follow-the-Sun" model. Headquartered in Overland Park, Kansas, we have a strong presence in North America, Central America, and Asia-Pacific with a Global Delivery Center based in India. These strategic locations offer our customers the competitive advantages of onshore, nearshore, and offshore engagement and delivery options, with 24/7 support. OTSI works with 100+ enterprise customers, many of which are Fortune ranked. OTSI focuses on industry segments such as Banking, Financial Services & Insurance, Healthcare & Life Sciences, Energy & Utilities, Communications & Media Entertainment, Engineering & Telecom, Retail & Consumer Services, Hi-tech, Manufacturing, Engineering, Transport & Logistics, Government, Defense & PSUs.

Our focused technologies are:
- Data & Analytics (Traditional EDW, BI, Big Data, Data Engineering, Data Management, Data Modernization, Data Insights)
- Digital Transformation (Cloud Computing, Mobility, Micro Services, RPA, DevOps)
- QA & Automation (Manual Testing, Nonfunctional Testing, Test Automation, Digital Testing)
- Enterprise Applications (SAP, Java Full Stack, Microsoft, Custom Development)
- Disruptive Technologies (Edge Computing/IoT, Blockchain, AR/VR, Biometric)

Job Description
- The resources placed at respective work locations should be punctual and regular in attending the office.
- BPCL's development requirements would vary during different phases; hence exact requirements would vary from time to time.
- The Developers will understand the functional requirements and undertake application development as per specifications given by the BPCL project leader.
- The developers will carry out coding on the identified platform, carry out unit testing, and interact with BPCL team members for implementing and rolling out the solution.
- They will adhere to standards laid down by BPCL for development, inline documentation, testing, etc.
- Create and maintain proper technical documentation of all developments.
- Knowledge transfer to the in-house development team along with documentation.
- The source code developed by the developers will be the property of BPCL.
- Should be available on Sundays/holidays as per BPCL requirement on a need basis.

Requirements
- Minimum 5 years of work experience, of which 3+ years' experience is working on data analytics project(s). The project preferably should be related to the manufacturing/process industry.
- Certification in Machine Learning-based courses through certified agencies.
- Understanding of data modeling, data preparation, ETL, and data warehousing.
- Knowledge of scripting languages like PowerShell/Python for automation and familiarity with ML libraries like scikit-learn, statsmodels, etc.
- Experience in working with any SQL databases (Oracle, Microsoft, etc.)
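As a hedged sketch of the ML work implied by the requirements (scikit-learn on process-industry data), the snippet below fits a simple regression model. The CSV name and the feature/target columns are invented placeholders, not details from the posting.

```python
# Minimal scikit-learn sketch: fit a ridge regression on process-sensor data
# and report mean absolute error on a held-out split.
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("unit_sensor_readings.csv")
features = ["inlet_temp", "pressure", "flow_rate"]
target = "energy_consumption"

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df[target], test_size=0.2, random_state=42
)

model = Ridge(alpha=1.0).fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```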
Posted 1 day ago
0.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Chennai, Tamil Nadu, India
Department: Operations - Product Quality
Job posted on: Jun 18, 2025
Employment type: FTE
About Toyota Connected
If you want to change the way the world works, transform the automotive industry and positively impact others on a global scale, then Toyota Connected is the right place for you! Within our collaborative, fast-paced environment we focus on continual improvement and work in a highly iterative way to deliver exceptional value in the form of connected products and services that wow and delight our customers and the world around us.
What you will do
Design, develop, and maintain automation test frameworks and scripts specifically for API-level testing (a minimal API test sketch follows this posting).
Validate and verify complex Machine Learning applications, ensuring models and data pipelines perform accurately in production environments.
Conduct extensive testing on large-scale data engineering pipelines, including ETL processes, data warehousing solutions, and big data processing frameworks.
Identify, document, and track software defects, proactively working with developers and product teams for resolution.
Execute load and performance testing on APIs and data pipelines to ensure scalability and efficiency.
Collaborate closely with data engineers, ML engineers, software developers, and product teams to define comprehensive test strategies.
Participate actively in agile ceremonies, contributing to sprint planning, reviews, and retrospectives.
You are a successful candidate if you have
A bachelor’s degree in Computer Science, Information Technology, or related fields.
3+ years of hands-on experience in automation testing, especially focused on API testing using tools such as Postman, REST Assured, or similar.
Experience working with automation frameworks like Cucumber/Karate for API automation and Selenium/Cypress for web automation.
Demonstrable experience testing Machine Learning applications, including model validation, accuracy assessments, and production readiness.
Proven expertise in testing large-scale data engineering pipelines involving technologies such as Apache Spark, Hadoop, AWS EMR, Kafka, or similar.
Strong scripting/programming skills in Python, Java, or JavaScript.
Familiarity with containerization and CI/CD tools (Docker, Kubernetes, Jenkins, GitLab CI/CD).
Excellent analytical and problem-solving abilities, with strong attention to detail.
Effective communication skills and the ability to clearly document test cases, defects, and technical findings.
What is in it for you?
Top-of-the-line compensation! You'll be treated like the professional we know you are and left to manage your own time and workload.
Yearly gym membership reimbursement & free catered lunches.
No dress code! We trust you are responsible enough to choose what’s appropriate to wear for the day.
Opportunity to build products that improve the safety and convenience of millions of customers.
Cool office space and other awesome benefits!
Our Core Values: EPIC
Empathetic: We begin making decisions by looking at the world from the perspective of our customers, teammates, and partners.
Passionate: We are here to build something great, not just for the money. We are always looking to improve the experience of our millions of customers.
Innovative: We experiment with ideas to get to the best solution. Any constraint is a challenge, and we love looking for creative ways to solve them.
Collaborative: When it comes to people, we think the whole is greater than its parts and that everyone has a role to play in the success!
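To make the API-level automation testing described above concrete, here is a minimal pytest sketch using the requests library. The endpoint URL and response fields are assumptions for illustration, not Toyota Connected's actual API.

```python
import requests

BASE_URL = "https://api.example.com"  # hypothetical service, not from the posting

def test_vehicle_status_endpoint_returns_ok():
    # Basic contract check: status code and expected fields in the JSON body
    response = requests.get(f"{BASE_URL}/v1/vehicles/123/status", timeout=10)
    assert response.status_code == 200
    body = response.json()
    assert "vehicle_id" in body
    assert body["vehicle_id"] == "123"

def test_unknown_vehicle_returns_404():
    # Negative case: a non-existent resource should be rejected cleanly
    response = requests.get(f"{BASE_URL}/v1/vehicles/does-not-exist/status", timeout=10)
    assert response.status_code == 404
```

Run with `pytest`; in a real framework these checks would sit behind fixtures for authentication, environment selection, and test data setup.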
Posted 1 day ago
0.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Position Summary
Company: Fives India Engineering & Projects Pvt. Ltd.
Job Title: Data Analyst/Senior Data Analyst (BI Developer)
Job Location: Chennai, Tamil Nadu, India
Job Department: IT
Educational Qualification: BE/B.Tech/MCA from a reputed institute in Computer Science or a related field
Work Experience: 4–8 years
Job Description:
Fives is a global industrial engineering group based in Paris, France, that designs and supplies machines, process equipment and production lines for the world’s largest industrial sectors including aerospace, automotive, steel, aluminium, glass, cement, logistics and energy. Headquartered in Paris, Fives is located in about 25 countries with more than 9000 employees. Fives is seeking a Data Analyst/Senior Data Analyst for their office located in Chennai, India. The position is an integral part of the Group IT development team working on custom software solutions for the Group IT requirements. We are looking for an analyst specialized in BI development.
Required Skills
Applicants should have skills/experience in the following areas:
4–8 years of experience in Power BI development
Good understanding of data visualization concepts
Proficiency in writing DAX expressions and Power Query
Knowledge of SQL and database-related technologies
Source control such as Git
Proficiency in building REST APIs to interact with data sources (a minimal sketch follows this posting)
Familiarity with ETL/ELT concepts and tools such as Talend is a plus
Good knowledge of programming, algorithms and data structures
Ability to use Agile collaboration tools such as Jira
Good communication skills, both verbal and written
Willingness to learn new technologies and tools
Position Type: Full-Time/Regular
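As an illustration of the "REST APIs to interact with data sources" requirement above, here is a minimal read-only endpoint sketched with FastAPI. The framework choice, route, and data are assumptions for demonstration only.

```python
from fastapi import FastAPI, HTTPException

app = FastAPI()

# In-memory stand-in for a real data source (a database in practice)
SALES_BY_REGION = {"north": 1250, "south": 980, "east": 1430, "west": 1105}

@app.get("/sales/{region}")
def get_sales(region: str):
    # Return the sales figure for one region, or 404 if it is unknown
    if region not in SALES_BY_REGION:
        raise HTTPException(status_code=404, detail="Unknown region")
    return {"region": region, "sales": SALES_BY_REGION[region]}
```

Served with `uvicorn main:app`, such an endpoint can then be consumed from Power BI via its web connector.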
Posted 1 day ago
4.0 years
0 Lacs
Hyderabad, Telangana
Remote
Data Engineer II
Hyderabad, Telangana, India + 2 more locations
Date posted: Jun 18, 2025
Job number: 1829143
Work site: Up to 50% work from home
Travel: 0-25%
Role type: Individual Contributor
Profession: Software Engineering
Discipline: Data Engineering
Employment type: Full-Time
Overview
Microsoft is a company where passionate innovators come to collaborate, envision what can be and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world. Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture.
Within Azure Data, the Microsoft Fabric platform team builds and maintains the operating system and provides customers a unified data stack to run an entire data estate. The platform provides a unified experience, unified governance, enables a unified business model and a unified architecture. The Fabric Data Analytics, Insights, and Curation team is leading the way at understanding the Microsoft Fabric composite services and empowering our strategic business leaders. We work with very large and fast-arriving data and transform it into trustworthy insights. We build and manage pipelines, transformations, platforms, models, and so much more that empowers the Fabric product. As an Engineer on our team your core function will be Data Engineering, with opportunities in Analytics, Science, Software Engineering, DevOps, and Cloud Systems. You will be working alongside other Engineers, Scientists, Product, Architecture, and Visionaries bringing forth the next generation of data democratization products. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.
Qualifications
Required/Minimum Qualifications
Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 4+ years' experience in business analytics, data science, software development, data modeling or data engineering work; OR Master's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 2+ years' experience in business analytics, data science, software development, or data engineering work; OR equivalent experience
2+ years of experience in software or data engineering, with proven proficiency in C#, Java, or equivalent
2+ years in one scripting language for data retrieval and manipulation (e.g., SQL or KQL)
2+ years of experience with ETL and data cloud computing technologies, including Azure Data Lake, Azure Data Factory, Azure Synapse, Azure Logic Apps, Azure Functions, Azure Data Explorer, and Power BI or equivalent platforms
Preferred/Additional Qualifications
1+ years of demonstrated experience implementing data governance practices, including data access, security and privacy controls and monitoring to comply with regulatory standards.
Other Requirements
Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings:
Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.
Equal Opportunity Employer (EOP)
#azdat #azuredata #fabricdata #dataintegration #azure #synapse #databases #analytics #science
Responsibilities
You will develop and maintain data pipelines, including solutions for data collection, management, transformation, and usage, ensuring accurate data ingestion and readiness for downstream analysis, visualization, and AI model training.
You will review, design, and implement end-to-end software life cycles, encompassing design, development, CI/CD, service reliability, recoverability, and participation in agile development practices, including on-call rotation.
You will review and write code to implement performance monitoring protocols across data pipelines, building visualizations and aggregations to monitor pipeline health (a small monitoring sketch follows this posting). You’ll also implement solutions and self-healing processes that minimize points of failure across multiple product features.
You will anticipate data governance needs, designing data modeling and handling procedures to ensure compliance with all applicable laws and policies.
You will plan, implement, and enforce security and access control measures to protect sensitive resources and data.
You will perform database administration tasks, including maintenance and performance monitoring.
You will collaborate with Product Managers, Data and Applied Scientists, Software and Quality Engineers, and other stakeholders to understand data requirements and deliver phased solutions that meet test and quality programs' data needs, and support AI model training and inference.
You will become an SME of our team's products and provide inputs for strategic vision.
You will champion process, engineering, architecture, and product best practices in the team.
You will work with other team Seniors and Principals to establish best practices in our organization.
Embody our culture and values.
Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work.
Industry leading healthcare
Educational resources
Discounts on products and services
Savings and investments
Maternity and paternity leave
Generous time away
Giving programs
Opportunities to network and connect
Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
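As a hedged sketch of the pipeline-health aggregation mentioned in the responsibilities above, the snippet below rolls up hypothetical pipeline run logs into per-day success rates and average durations with pandas. The log schema is an assumption for illustration, not part of the role description.

```python
import pandas as pd

# Hypothetical pipeline run log: one row per pipeline execution
runs = pd.DataFrame({
    "pipeline": ["ingest_orders", "ingest_orders", "curate_sales", "curate_sales"],
    "run_date": pd.to_datetime(["2025-06-17", "2025-06-18", "2025-06-17", "2025-06-18"]),
    "status": ["Succeeded", "Failed", "Succeeded", "Succeeded"],
    "duration_sec": [420, 510, 180, 200],
})

# Daily health summary per pipeline: success rate and average duration
summary = (
    runs.assign(success=runs["status"].eq("Succeeded"))
        .groupby(["pipeline", "run_date"])
        .agg(success_rate=("success", "mean"), avg_duration_sec=("duration_sec", "mean"))
        .reset_index()
)
print(summary)
```

In production the same aggregation would typically be expressed in KQL or SQL against the pipeline telemetry store and surfaced in a Power BI dashboard with alerting thresholds.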
Posted 1 day ago
0.0 - 8.0 years
0 Lacs
Gurugram, Haryana
On-site
About the Role:
Grade Level (for internal use): 10
Position summary
Our proprietary software-as-a-service helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Its micro-marketing engine then delivers the right message at the right time to those customers, ensuring higher conversion rates and a stronger ROI.
What You'll Do
You will be part of our Data Platform & Product Insights data engineering team. As part of this agile team, you will work in our cloud-native environment to:
Build & support data ingestion and processing pipelines in the cloud. This will entail extraction, load and transformation of 'big data' from a wide variety of sources, both batch & streaming, using the latest data frameworks and technologies.
Partner with the product team to assemble large, complex data sets that meet functional and non-functional business requirements, and ensure build-out of Data Dictionaries/Data Catalogues and detailed documentation and knowledge around these data assets, metrics and KPIs.
Warehouse this data, build data marts, data aggregations, metrics, KPIs, and business logic that leads to actionable insights into our product efficacy, marketing platform, customer behaviour, retention etc.
Build real-time monitoring dashboards and alerting systems.
Coach and mentor other team members.
Who you are
4 to 8 years of experience in Big Data and Data Engineering.
Strong knowledge of advanced SQL, data warehousing concepts and data mart design.
Strong programming skills in SQL, Python/PySpark etc.
Experience in design and development of data pipelines and ETL/ELT processes, on-premises and in the cloud.
Experience with one of the cloud providers – GCP, Azure, AWS.
Experience with relational SQL and NoSQL databases, including Postgres and MongoDB.
Experience with workflow management tools: Airflow, AWS Data Pipeline, Google Cloud Composer etc. (a minimal Airflow sketch follows this posting).
Experience with distributed version control environments such as Git, Azure DevOps.
Building Docker images and fetching/promoting and deploying them to production. Integrating Docker container orchestration using Kubernetes by creating pods, ConfigMaps, and deployments using Terraform.
Should be able to convert business queries into technical documentation.
Strong problem solving and communication skills.
Bachelor's or an advanced degree in Computer Science or a related engineering discipline.
Good to have some exposure to:
Business Intelligence (BI) tools like Tableau, Dundas, Power BI etc.
Agile software development methodologies.
Working in multi-functional, multi-location teams.
Grade: 09 / 10
Location: Gurugram
Hybrid Model: twice a week work from office
Shift Time: 12 pm to 9 pm IST
What You'll Love About Us – Do ask us about these!
Total Rewards. Monetary, beneficial and developmental rewards!
Work Life Balance. You can't do a good job if your job is all you do!
Prepare for the Future. Academy – we are all learners; we are all teachers!
Employee Assistance Program. Confidential and Professional Counselling and Consulting.
Diversity & Inclusion. HeForShe!
Internal Mobility. Grow with us!
About automotiveMastermind:
Who we are:
Founded in 2012, automotiveMastermind is a leading provider of predictive analytics and marketing automation solutions for the automotive industry and believes that technology can transform data, revealing key customer insights to accurately predict automotive sales.
Through its proprietary automated sales and marketing platform, Mastermind, the company empowers dealers to close more deals by predicting future buyers and consistently marketing to them. automotiveMastermind is headquartered in New York City. For more information, visit automotivemastermind.com.
At automotiveMastermind, we thrive on high energy at high speed. We’re an organization in hyper-growth mode and have a fast-paced culture to match. Our highly engaged teams feel passionately about both our product and our people. This passion is what continues to motivate and challenge our teams to be best-in-class. Our cultural values of “Drive” and “Help” have been at the core of what we do, and how we have built our culture through the years. This cultural framework inspires a passion for success while collaborating to win.
What we do:
Through our proprietary automated sales and marketing platform, Mastermind, we empower dealers to close more deals by predicting future buyers and consistently marketing to them. In short, we help automotive dealerships generate success in their loyalty, service, and conquest portfolios through a combination of turnkey predictive analytics, proactive marketing, and dedicated consultative services.
What’s In It For You?
Our Purpose:
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People:
We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits:
We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert:
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
Job ID: 315748
Posted On: 2025-06-18
Location: Gurgaon, Haryana, India
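The workflow-management requirement in the posting above (Airflow, AWS Data Pipeline, Cloud Composer) can be pictured with this minimal Airflow 2.x DAG sketch. The DAG name, task callables, and schedule are placeholders for illustration, not the company's actual pipelines.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    """Pull raw records from a source system (placeholder)."""

def transform():
    """Clean and aggregate the extracted records (placeholder)."""

def load():
    """Write curated data to the warehouse or data mart (placeholder)."""

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",  # Airflow 2.x parameter
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three steps strictly in order
    extract_task >> transform_task >> load_task
```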
Posted 1 day ago
8.0 years
0 Lacs
Gurugram, Haryana
On-site
Location: Gurugram, Haryana, India
This job is associated with 2 categories
Job Id: GGN00002083
Information Technology
Job Type: Full-Time
Posted Date: 06/18/2025
Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.
Description
United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.
Our Values: At United Airlines, we believe that inclusion propels innovation and is the foundation of all that we do. Our Shared Purpose: "Connecting people. Uniting the world." drives us to be the best airline for our employees, customers, and everyone we serve, and we can only do that with a truly diverse and inclusive workforce. Our team spans the globe and is made up of diverse individuals all working together with cutting-edge technology to build the best airline in the history of aviation. With multiple employee-run "Business Resource Group" communities and world-class benefits like health insurance, parental leave, and space available travel, United is truly a one-of-a-kind place to work that will make you feel welcome and accepted. Come join our team and help us make a positive impact on the world.
Job overview and responsibilities
United's Revenue Management team is growing, and we are seeking a Senior Developer to come join us! The Senior Software Developer will be responsible for the development of critical applications while working with a team of developers. This role will design, develop, document, test, and debug new and existing applications. Additionally, this role will build these applications with a focus on delivering cloud-based solutions. The individual will use groundbreaking technologies and enterprise-grade integration software daily and will be relied upon to help take the team to the next level from a technological standpoint. This individual will utilize effective communication, analytical, and problem-solving skills to help identify, communicate and resolve issues, opportunities, or problems to maximize the benefit of IT and business investments. The Developer is experienced and self-sufficient in performing their responsibilities, requiring little supervision, but general guidance and direction.
Application Software Development
Manages and participates in the full development life cycle, including requirements analysis and design, using Agile methodologies
Serve as technical expert on development projects
Write technical specifications based on conceptual design and stated business requirements
Support, maintain, and document software functionality
Identify and evaluate new technologies for implementation
Analyze code to find causes of errors and revise programs as needed
Manages and participates in software design meetings and analyzes user needs to determine technical requirements
Collaborate with and lead tech teams consisting of employees and vendor company contractors in planning and execution, serving as a technology leader
Collaborate with end users to prototype, refine, test, and debug programs to meet needs
United values diverse experiences and perspectives, and we encourage everyone who meets the minimum qualifications to apply.
While having the “desired” qualifications makes for a stronger candidate, we encourage applicants who may not feel they check ALL of those boxes! We are always looking for individuals who will bring something new to the table!
This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.
Qualifications
What’s needed to succeed (Minimum Qualifications):
Bachelor’s degree in Computer Science, Computer Engineering, Electrical Engineering, Management Information Systems, or related field
Bachelor's degree required in Computer Science, Engineering, or related field
Proven experience leading technical delivery teams (worldwide)
8+ years of IT and business/industry work experience
5+ years of experience developing large-scale applications with Python, SQL, AWS services (such as EC2, S3, RDS, Glue, Batch, Lambda, Step Functions) or a combination, and DevOps (CloudFormation) (a minimal Lambda sketch follows this posting)
Working knowledge of CI/CD tools like Harness, TeamCity, Jenkins
Working knowledge of C/C++
Working knowledge of Unix/Linux operating systems
Working knowledge of ETL tools
Working knowledge of application logging and monitoring tools
Effective communication skills
Effective technical documentation
Team player
Must be self-motivated
Analytical thinker
Excellent knowledge of object-oriented systems design and application development
Successful completion of interview required to meet job qualification
Reliable, punctual attendance is an essential function of the position
What will help you propel from the pack (Preferred Qualifications):
Airline experience
Experience with the design and development of ML/AI models is a plus
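To illustrate the AWS serverless stack named in the minimum qualifications above, here is a minimal sketch of an S3-triggered Lambda handler in Python. The bucket layout and processing step are assumptions for demonstration only.

```python
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Invoked by S3 object-created notifications; reads each new object."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        # Placeholder processing step: count lines in the uploaded file
        line_count = body.decode("utf-8").count("\n")
        print(f"{key}: {line_count} lines")
    return {"status": "ok"}
```

In a larger pipeline, a Step Functions state machine would typically orchestrate handlers like this one alongside Glue and Batch jobs, with the infrastructure declared in CloudFormation.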
Posted 1 day ago
12.0 years
0 Lacs
Pune, Maharashtra
On-site
Job details
Employment Type: Full-Time
Location: Pune, Maharashtra, India
Job Category: Information Systems
Job Number: WD30236720
Job Description
Senior SAP Data Lead & Architect
We are seeking a highly experienced Senior SAP Data Lead & Architect to join our esteemed team in Pune. The ideal candidate will provide strategic leadership in data migration, integration, transformation and overall data management, and will support the data architecture of different value streams, ensuring high-quality data solutions aligned with business objectives. The Enterprise SAP Data Lead will work with business counterparts in delivering solutions based on best practices, business processes & SAP expertise. The Enterprise Data Lead will lead an India-based data team comprising both functional Data Architects and Data Technical (ETL) consultants. The SAP landscape in the scope of this position covers multiple SAP instances across JCI globally.
Role and Responsibilities:
Oversee the entire data migration lifecycle, including planning, execution, validation, and post-migration support.
Design and implement robust data architecture solutions to support SAP landscapes.
Lead data extraction, transformation, and loading (ETL) processes, ensuring accuracy, efficiency, and performance.
Define and implement data governance frameworks, standards, and best practices.
Ensure seamless data migration while maintaining data integrity, consistency, and security.
Lead a team of JCI internal data functional & technical consultants and external resources in SAP areas to deliver multiple SAP data migration projects and continuous improvements, and manage the complete data cycle for SAP rollouts across the globe.
Plan, design, implement/execute, deliver, and maintain JCI Enterprise SAP template data solutions, according to business requirements and best practices.
Review, disposition, prioritize and deliver data issues during mocks & post cutover.
Provide consultation to multiple systems/application owners in JCI, business partners and peer groups regarding long- and short-range data architecture solutions to address business requirements and objectives. Develop solutions and business case alternatives.
Actively participate in complex design and technical discussions related to data migrations and in decision-making processes.
Collaborate and engage with counterparts across SAP pillars such as Tech Services, IT Operations, QA and Business Analytics to consistently deliver SAP application data migration projects with a high level of quality and functionality that meets the business users’ requirements.
Work with the Enterprise ERP Application Architect team & support functionality assessments for data solutions in scope of the overall SAP ERP platform.
Technical Leadership & Functional Support:
Work with SAP data transformation tools like SAP BODS, Python, SQL, and SAP Data Intelligence for efficient data handling.
Support functional teams across RTR (Record to Report), PTF (Plan to Fulfill), OTC (Order to Cash), and PTP (Procure to Pay) to ensure data alignment with business processes.
Develop data validation, cleansing, and enrichment strategies to optimize business decision-making (a small validation sketch follows this posting).
Provide expert guidance on data modeling, database optimization, and performance tuning.
Collaborate with business stakeholders to gather data requirements and define data migration scope.
Project & Stakeholder Management:
Act as the primary liaison between business, IT, and technical teams for all data-related aspects.
Work closely with cross-functional teams, including business analysts, project managers, and SAP consultants.
Drive data-driven decision-making by providing insights on data analytics, BI/BW, and reporting solutions.
Develop comprehensive data migration strategies, ensuring minimal disruption to business operations.
Ensure compliance with data security, regulatory, and audit requirements.
Continuous Improvement & Innovation:
Stay updated on SAP S/4HANA data migration methodologies, SAP BTP data services, and cloud-based data solutions.
Implement automation and best practices to streamline data migration and governance processes.
Introduce AI/ML-based data quality improvements and predictive analytics solutions.
Requirements:
Bachelor’s/Master’s degree in Computer Science, Information Technology, or a related field.
12+ years of experience in SAP data migration, data architecture, and functional support.
Strong hands-on experience with SAP BODS, LSMW, SAP S/4HANA Migration Cockpit, SAP MDG (Master Data Governance), and third-party ETL tools.
Experience in SAP functional modules from a data migration perspective, preferably in streams such as RTR (Record to Report), PTF (Plan to Fulfill), OTC (Order to Cash), and PTP (Procure to Pay).
Proven ability to lead and execute complex data migration projects across multiple SAP implementations.
Expertise in SQL, Python, and scripting languages for data transformation and automation.
Strong knowledge of SAP data structures, data modeling, and integration with functional SAP modules.
Experience in data governance, master data management (MDM), and data lineage tracking.
Ability to develop business cases for data migration and transformation projects.
Strong problem-solving and analytical skills, with the ability to tackle complex data issues effectively.
Excellent communication and stakeholder management skills, capable of engaging with all levels of leadership.
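As a hedged sketch of the data validation and cleansing work described in this posting, the snippet below runs a few pre-load checks on a hypothetical extract of customer master records using pandas. The file name, column names, and rules are illustrative assumptions, not JCI's actual migration objects.

```python
import pandas as pd

# Hypothetical legacy extract of customer master records awaiting migration
records = pd.read_csv("customer_master_extract.csv")  # assumed file name

# Rule 1: mandatory fields must be populated
missing_required = records[records[["customer_id", "country"]].isna().any(axis=1)]

# Rule 2: customer IDs must be unique
duplicates = records[records.duplicated(subset="customer_id", keep=False)]

# Rule 3: country codes must come from an agreed reference list (assumed values)
valid_countries = {"IN", "US", "DE", "FR"}
bad_country = records[~records["country"].isin(valid_countries)]

# Simple cleansing step: trim whitespace and normalise case before load
records["customer_name"] = records["customer_name"].str.strip().str.title()

print(f"{len(missing_required)} rows missing required fields")
print(f"{len(duplicates)} duplicate customer IDs")
print(f"{len(bad_country)} rows with invalid country codes")
```

In an actual migration these rules would be maintained as reusable validations (for example in SAP BODS or the S/4HANA Migration Cockpit) and rerun for every mock load and the final cutover.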
Posted 1 day ago
The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.
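For readers new to the field, here is a minimal sketch of what a single extract-transform-load step looks like in plain Python with SQLite. The file names, table, and transformation are purely illustrative.

```python
import csv
import sqlite3

# Extract: read raw rows from a source file (hypothetical name)
with open("sales_raw.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: normalise text fields and compute a derived column
for row in rows:
    row["region"] = row["region"].strip().upper()
    row["total"] = float(row["quantity"]) * float(row["unit_price"])

# Load: write the cleaned rows into a warehouse-style table
conn = sqlite3.connect("warehouse.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS sales (region TEXT, quantity REAL, unit_price REAL, total REAL)"
)
conn.executemany(
    "INSERT INTO sales (region, quantity, unit_price, total) VALUES (?, ?, ?, ?)",
    [(r["region"], r["quantity"], r["unit_price"], r["total"]) for r in rows],
)
conn.commit()
conn.close()
```

Production ETL replaces the CSV and SQLite pieces with enterprise sources, an ETL tool or orchestrator, and a proper warehouse, but the extract-transform-load shape stays the same.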
India's major tech hubs are known for their thriving technology industries and often have a high demand for ETL professionals.
The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.
In the ETL field, a typical career path may include roles such as:
- Junior ETL Developer
- ETL Developer
- Senior ETL Developer
- ETL Tech Lead
- ETL Architect
As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.
Alongside ETL, professionals in this field are often expected to have skills in:
- SQL
- Data Warehousing
- Data Modeling
- ETL Tools (e.g., Informatica, Talend)
- Database Management Systems (e.g., Oracle, SQL Server)
Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.
Here are 25 interview questions that you may encounter in ETL job interviews:
As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!