6.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Job Title: Tableau Developer

Summary: We are seeking a highly skilled and experienced Tableau Developer to join our Development department. The ideal candidate will have 6-10 years of experience in developing Tableau solutions. As a Tableau Developer, you will be responsible for designing, developing, and maintaining Tableau dashboards and reports that provide valuable insights to our organization.

Roles and Responsibilities:
- Collaborate with business stakeholders to understand their reporting and data visualization requirements.
- Design and develop Tableau dashboards and reports that effectively present complex data in a visually appealing and user-friendly manner.
- Create and maintain data models, data sources, and data connections within Tableau (a minimal publishing sketch follows this listing).
- Perform data analysis and validation to ensure the accuracy and integrity of the data used in Tableau dashboards and reports.
- Optimize Tableau performance by identifying and implementing improvements in data processing and visualization techniques.
- Troubleshoot and resolve issues related to Tableau dashboards and reports.
- Stay up-to-date with the latest Tableau features and functionalities and propose innovative solutions to enhance our reporting capabilities.
- Collaborate with cross-functional teams to integrate Tableau dashboards and reports with other business systems and applications.
- Provide training and support to end-users on Tableau functionality and best practices.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 6-10 years of experience as a Tableau Developer or in a similar role.
- Experience in connecting Tableau to various databases such as SQL Server, Oracle, MySQL, and cloud-based data sources.
- Strong proficiency in Tableau Desktop and Tableau Server.
- In-depth knowledge of data visualization best practices and principles.
- Proficiency in SQL and data modeling.
- Experience with ETL processes and data warehousing concepts.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and manage multiple priorities in a fast-paced environment.
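For context, work like "create and maintain data sources within Tableau" is often scripted against Tableau Server. A minimal sketch using the open-source tableauserverclient library; the server URL, site, access token, project name, and extract file are hypothetical placeholders, not details from this listing:

```python
# Minimal sketch: publish an extract to Tableau Server with tableauserverclient.
# All connection details below are hypothetical placeholders.
import tableauserverclient as TSC

# Personal access tokens are the usual non-interactive auth mechanism.
auth = TSC.PersonalAccessTokenAuth(
    token_name="ci-publisher",        # assumed token name
    personal_access_token="secret",   # assumed token value
    site_id="analytics",              # assumed site
)
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Find a target project by name (assumed to exist).
    all_projects, _ = server.projects.get()
    project = next(p for p in all_projects if p.name == "Sales")

    # Publish (or overwrite) a .hyper extract as a data source.
    datasource = TSC.DatasourceItem(project_id=project.id)
    published = server.datasources.publish(
        datasource, "sales_extract.hyper", mode=TSC.Server.PublishMode.Overwrite
    )
    print(f"Published data source {published.name} ({published.id})")
```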
Posted 1 month ago
2.0 - 7.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: AWS Administration
Minimum 2 year(s) of experience is required
Educational Qualification: BE

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams to ensure seamless integration and functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to ensure successful project delivery.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Ab Initio.
- Good To Have Skills: Experience with AWS Administration.
- Strong understanding of data integration and ETL processes.
- Experience in designing and implementing scalable applications.
- Knowledge of software development best practices.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Ab Initio.
- This position is based at our Hyderabad office.
- A BE degree is required.

Qualification: BE
Posted 1 month ago
12.0 - 15.0 years
18 - 20 Lacs
Pune
Work from Office
Support AI/ML projects by developing data pipelines, managing databases, automating ETL in Python, and collaborating across teams. Skilled in SQL, Python, cloud tools, and data quality. Mail: kowsalya.k@srsinfoway.com
Posted 1 month ago
12.0 - 15.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Microsoft Azure Data Services, Microsoft Azure Analytics Services
Minimum 12 year(s) of experience is required
Educational Qualification: Full Time Education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will also engage in problem-solving activities, providing insights and recommendations to enhance application performance and user experience.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good To Have Skills: Experience with Microsoft Azure Data Services, Microsoft Azure Analytics Services.
- Strong understanding of data integration techniques and ETL processes.
- Experience with cloud-based data storage and processing solutions.
- Familiarity with data governance and compliance standards.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- Full Time Education is required.

Qualification: Full Time Education
Posted 1 month ago
3.0 - 5.0 years
2 - 6 Lacs
Kolhapur, Pune
Work from Office
Bluebenz Digitizations Private Limited is looking for a Data Warehouse Developer to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
- Lead data warehouse initiatives; manage data warehouse infrastructure and architecture.
- Ensure data integrity and availability.
- Develop ETL processes.
- Collaborate with data engineers and analysts.
- Optimize data storage and retrieval processes.
- Support data analysis and reporting efforts.
- Ensure data warehouse scalability and performance.
Posted 1 month ago
6.0 - 8.0 years
6 - 10 Lacs
Chennai
Work from Office
Company: Kiya.ai
Job Title: Functional Consultant - Data Migration
Experience: 6-8 Years
Location: Chennai
Interested candidates may drop their resume to saarumathi.r@kiya.ai

Job Summary: We are looking for an experienced Functional Consultant with deep expertise in Data Migration to join our team in Chennai. The ideal candidate will play a key role in planning, coordinating, and executing data migration activities as part of large-scale enterprise transformation or system implementation projects.

Key Responsibilities:
- Lead and manage end-to-end data migration activities across multiple systems and business domains.
- Collaborate with business stakeholders, data owners, and technical teams to define data migration strategies, scope, and requirements.
- Analyze legacy systems, identify data sources, and map data from source to target systems.
- Support the creation of data cleansing and transformation rules to ensure high data quality.
- Develop test plans and validate migrated data to ensure accuracy, completeness, and consistency.
- Provide functional input for ETL processes and support technical teams with business rules and logic.
- Monitor and troubleshoot data migration issues and ensure timely resolution.
- Document data migration procedures, issues, and outcomes for audit and knowledge transfer.

Essential Skills:
- Proven experience in Data Migration across complex enterprise environments.
- Strong understanding of data structures, data mapping, and transformation logic.
- Hands-on experience with data profiling, cleansing, and validation (a minimal profiling sketch follows this listing).
- Excellent analytical and problem-solving skills.
- Strong communication skills to liaise with technical teams and business stakeholders.

Preferred Qualifications:
- Experience with ERP or CRM system data migration (e.g., SAP, Oracle, Microsoft Dynamics).
- Familiarity with data migration tools (e.g., SQL, Talend, Informatica, SSIS, or similar).
- Understanding of data governance and data quality frameworks.
- Functional knowledge of business domains such as finance, HR, or supply chain.
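For context, the data profiling called out above often starts with a quick pass in Python. A minimal sketch with pandas, assuming a hypothetical legacy extract file, business key, and column names:

```python
# Minimal sketch: profile a legacy extract before migration.
# File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("legacy_customers.csv")  # assumed source extract

# Per-column profile: type, null counts, and cardinality.
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "nulls": df.isna().sum(),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct": df.nunique(),
})
print(profile)

# Common cleansing checks ahead of source-to-target mapping.
dupes = df.duplicated(subset=["customer_id"]).sum()      # assumed business key
bad_email = (~df["email"].str.contains("@", na=False)).sum()
print(f"duplicate keys: {dupes}, malformed emails: {bad_email}")
```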
Posted 1 month ago
3.0 - 5.0 years
15 - 20 Lacs
Gurugram
Work from Office
Job Summary:
We are seeking a skilled and detail-oriented Salesforce CRM Analytics Specialist with 3-5 years of experience in Salesforce CRM Analytics, particularly Einstein Discovery. In this role, you will transform CRM data into actionable insights, enabling strategic business decisions across sales, marketing, and customer service. The ideal candidate will have a strong analytical mindset, hands-on experience in building predictive models, and a deep understanding of Salesforce Analytics Cloud (formerly Tableau CRM).

Key Responsibilities:
- Design, build, and deploy analytics dashboards and data visualizations using CRM Analytics (Tableau CRM).
- Leverage Einstein Discovery to build predictive and prescriptive models that support sales performance, customer behaviour, and marketing effectiveness.
- Collaborate with cross-functional teams including Sales, Marketing, Customer Service, and Business Intelligence to translate business requirements into analytical solutions.
- Conduct data preparation and ETL processes within Salesforce using dataflows, recipes, and connectors.
- Develop and maintain scalable data models and datasets optimized for analytics use cases.
- Monitor model performance and interpret findings, ensuring high accuracy and relevance of insights.
- Create and maintain documentation for analytical workflows, model assumptions, and business logic.
- Drive adoption of analytics solutions through training, presentations, and ongoing support to stakeholders.

Preferred candidate profile:
- 3-5 years of hands-on experience with Salesforce CRM Analytics (Tableau CRM / Einstein Analytics).
- Proven experience with Einstein Discovery: creating predictive models, interpreting results, and implementing model outputs.
- Proficiency in dataflows, SAQL, and recipe-based transformations.
- Strong SQL skills and understanding of relational databases.
- Excellent analytical, problem-solving, and communication skills.

Preferred:
- Salesforce certifications such as Tableau CRM & Einstein Discovery Consultant.
- Experience with Salesforce Sales Cloud or Industries Cloud integration.
Posted 1 month ago
3.0 - 6.0 years
10 - 16 Lacs
Noida
Work from Office
Proficiency in T-SQL, SQL Server Management Studio (SSMS), and SQL Profiler. Experience with SQL Server versions 2019 and above. Strong understanding of database design, normalization, and indexing. Experience with SSIS for ETL processes. Familiarity with SSRS for reporting is a plus.
Posted 1 month ago
3.0 - 8.0 years
0 - 1 Lacs
Hyderabad
Work from Office
Key Responsibilities:

1. Incident Management
- Monitor production systems for issues and respond promptly to incidents.
- Log, categorize, and prioritize incidents for resolution.
- Collaborate with development teams to address and resolve issues.
- Communicate with stakeholders regarding incident status and resolution timelines.

2. Root Cause Analysis (RCA)
- Conduct thorough investigations to identify the underlying causes of recurring issues.
- Implement long-term solutions to prevent future occurrences.
- Document findings and share insights with relevant teams.

3. System Monitoring & Performance Optimization
- Utilize monitoring tools to track system health and performance.
- Identify and address performance bottlenecks or capacity issues.
- Ensure systems meet performance benchmarks and service level agreements (SLAs).

4. Release Management & Application Maintenance
- Assist with the deployment of software updates, patches, and new releases.
- Ensure smooth transitions from development to production environments.
- Coordinate with cross-functional teams to minimize disruptions during releases.

5. User Support & Troubleshooting
- Provide end-user support for technical issues.
- Investigate user-reported problems and offer solutions or workarounds.
- Maintain clear communication with users regarding issue status and resolution.

6. Documentation & Knowledge Sharing
- Maintain detailed records of incidents, resolutions, and system configurations.
- Create and update operational runbooks, FAQs, and knowledge base articles.
- Share knowledge with team members to improve overall support capabilities.

Essential Tools & Technologies:
- Monitoring & Alerting: Nagios, Datadog, New Relic
- Log Management & Analysis: Splunk, Elasticsearch, Graylog
- Version Control: Git, SVN
- Ticketing Systems: JIRA, ServiceNow
- Automation & Scripting: Python, Shell scripting
- Database Management: SQL, Oracle, MySQL

Skills & Competencies:

Technical Skills
- Proficiency in system monitoring and troubleshooting.
- Strong understanding of application performance metrics.
- Experience with database management and query optimization.
- Familiarity with cloud platforms and infrastructure.

Soft Skills
- Analytical Thinking: Ability to diagnose complex issues and develop effective solutions.
- Communication: Clear and concise communication with stakeholders at all levels.
- Teamwork: Collaborative approach to problem-solving and knowledge sharing.
- Adaptability: Flexibility to handle changing priorities and technologies.
Posted 1 month ago
4.0 - 7.0 years
6 - 9 Lacs
Noida, India
Work from Office
ETL and Data Validation:
- Test ETL processes, ensuring accurate extraction, transformation, and loading of data.
- Validate source-to-target mappings, transformations, and business rules.

SQL & Database Testing:
- Write and execute complex SQL queries to validate data integrity (a minimal reconciliation sketch follows this listing).
- Verify data transformations, joins, and aggregations in relational databases (Oracle, SQL Server, PostgreSQL, etc.).

Automation:
- Nice to have: experience in Robot Framework using Python.

Mandatory Competencies:
- ETL - Tester
- Database - SQL
- Database - PostgreSQL
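For context, source-to-target validation of this kind is commonly automated in Python. A minimal sketch with pyodbc comparing row counts and an aggregate checksum; the connection strings, schemas, and table names are assumptions:

```python
# Minimal sketch: source-to-target reconciliation for an ETL load.
# Connection strings and table names are hypothetical.
import pyodbc

SRC = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=src;DATABASE=stage;Trusted_Connection=yes"
TGT = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=tgt;DATABASE=dw;Trusted_Connection=yes"

def scalar(conn_str: str, sql: str):
    """Run a single-value query and return the result."""
    with pyodbc.connect(conn_str) as conn:
        return conn.cursor().execute(sql).fetchone()[0]

# Completeness: row counts must match after the load.
src_rows = scalar(SRC, "SELECT COUNT(*) FROM stage.orders")
tgt_rows = scalar(TGT, "SELECT COUNT(*) FROM dw.fact_orders")
assert src_rows == tgt_rows, f"row count mismatch: {src_rows} vs {tgt_rows}"

# Accuracy: a cheap aggregate acts as a checksum on a measure column.
src_sum = scalar(SRC, "SELECT SUM(order_amount) FROM stage.orders")
tgt_sum = scalar(TGT, "SELECT SUM(order_amount) FROM dw.fact_orders")
assert src_sum == tgt_sum, f"amount checksum mismatch: {src_sum} vs {tgt_sum}"
print("source-to-target validation passed")
```

Aggregate checksums are a cheap first pass; fuller comparisons typically hash whole rows or use EXCEPT/MINUS queries against the mapped columns.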
Posted 1 month ago
5.0 - 9.0 years
7 - 11 Lacs
Noida
Work from Office
Strong in the Data Engineering domain; should be aware of and have handled large data volume testing, Big Data, and ETL testing, and have worked on Hadoop/HDFS-related apps. Python programming is a must.

ETL and Data Validation:
- Test ETL processes, ensuring accurate extraction, transformation, and loading of data.
- Validate source-to-target mappings, transformations, and business rules.
- Perform data completeness, accuracy, and consistency checks (a minimal PySpark sketch follows this listing).

SQL & Database Testing:
- Write and execute complex SQL queries to validate data integrity.
- Verify data transformations, joins, and aggregations in relational databases (Oracle, SQL Server, PostgreSQL, etc.).

Automation:
- Nice to have: experience in Robot Framework using Python.

Mandatory Competencies:
- ETL - Tester
- Database - SQL
- Python - Python
- QE - Automation Testing Approaches
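For context, completeness and consistency checks at large data volumes are usually run in PySpark rather than plain SQL. A minimal sketch, assuming a hypothetical HDFS landing path, target table, and an order_id business key:

```python
# Minimal sketch: data completeness/consistency checks with PySpark.
# Paths, table names, and keys are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-validation").getOrCreate()

source = spark.read.parquet("hdfs:///landing/orders/")  # assumed landing zone
target = spark.table("dw.orders")                        # assumed loaded table

# Completeness: no rows lost or invented during the load.
assert source.count() == target.count(), "row count mismatch"

# Consistency: every business key in the source exists in the target.
missing = source.select("order_id").exceptAll(target.select("order_id"))
assert missing.count() == 0, "keys missing in target"

# Accuracy: no duplicate keys introduced on the target side.
dupes = target.groupBy("order_id").count().filter("count > 1")
assert dupes.count() == 0, "duplicate keys in target"

print("validation passed")
spark.stop()
```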
Posted 1 month ago
3.0 - 7.0 years
5 - 9 Lacs
Pune, Bengaluru
Work from Office
Job Title - StreamSets ETL Developer, Associate
Location - Pune, India

Role Description
Currently DWS sources technology infrastructure, corporate functions systems [Finance, Risk, HR, Legal, Compliance, AFC, Audit, Corporate Services etc.] and other key services from DB. Project Proteus aims to strategically transform DWS to an Asset Management standalone operating platform; an ambitious and ground-breaking project that delivers separated DWS infrastructure and Corporate Functions in the cloud with essential new capabilities, further enhancing DWS' highly competitive and agile Asset Management capability. This role offers a unique opportunity to be part of a high performing team implementing a strategic future state technology landscape for all DWS Corporate Functions globally.
We are seeking a highly skilled and motivated ETL developer (individual contributor) to join our integration team. The ETL developer will be responsible for developing, testing and maintaining robust and scalable ETL processes to support our data integration initiatives. This role requires a strong understanding of database, Unix and ETL concepts, excellent SQL skills and experience with ETL tools and databases.

Your key responsibilities
- Primarily responsible for creating good quality software using standard coding practices, with hands-on code development.
- Thorough testing of developed ETL solutions/pipelines.
- Code review of other team members' work.
- Take E2E accountability and ownership of work/projects and work with the right and robust engineering practices.
- Convert business requirements into technical design.
- Delivery, deployment, review, business interaction and maintaining environments.

Additionally, the role will include other responsibilities, such as:
- Collaborating across teams.
- Sharing information and transferring knowledge and expertise to team members.
- Working closely with stakeholders and other teams like Functional Analysis and Quality Assurance.
- Working with BA and QA to troubleshoot and resolve reported bugs/issues on applications.

Your skills and experience
- Bachelor's Degree from an accredited college or university with a concentration in Science or an IT-related discipline (or equivalent).
- Hands-on experience with StreamSets, SQL Server and Unix.
- Experience developing and optimizing ETL pipelines for data ingestion, manipulation and integration.
- Strong proficiency in SQL, including complex queries, stored procedures and functions.
- Solid understanding of relational database concepts.
- Familiarity with data modeling concepts (conceptual, logical, physical).
- Familiarity with HDFS, Kafka, Microservices, Splunk.
- Familiarity with cloud-based platforms (e.g. GCP, AWS).
- Experience with scripting languages (e.g. Bash, Groovy).
- Experience delivering within an agile delivery framework.
- Experience with distributed version control tools (Git, GitHub, Bitbucket).
- Experience with Jenkins or pipeline-based modern CI/CD systems.
Posted 1 month ago
10.0 - 14.0 years
30 - 35 Lacs
Pune
Work from Office
Job Title - StreamSets ETL Developer, AVP
Location - Pune, India

Role Description
Currently DWS sources technology infrastructure, corporate functions systems [Finance, Risk, HR, Legal, Compliance, AFC, Audit, Corporate Services etc.] and other key services from DB. Project Proteus aims to strategically transform DWS to an Asset Management standalone operating platform; an ambitious and ground-breaking project that delivers separated DWS infrastructure and Corporate Functions in the cloud with essential new capabilities, further enhancing DWS' highly competitive and agile Asset Management capability. This role offers a unique opportunity to be part of a high performing team implementing a strategic future state technology landscape for all DWS Corporate Functions globally.
We are seeking a highly skilled and motivated ETL developer (individual contributor) to join our integration team. The ETL developer will be responsible for developing, testing and maintaining robust and scalable ETL processes to support our data integration initiatives. This role requires a strong understanding of database, Unix and ETL concepts, excellent SQL skills and experience with ETL tools and databases.

Your key responsibilities
- Work with our clients to deliver value through the delivery of high quality software within an agile development lifecycle.
- Development and thorough testing of developed ETL solutions/pipelines.
- Define and evolve the architecture of the components you are working on and contribute to architectural decisions at a department and bank-wide level.
- Take E2E accountability and ownership of work/projects and work with the right and robust engineering practices.
- Convert business requirements into technical design; perform code review of other team members' work.

Additionally, the role will include other responsibilities, such as:
- Leading and collaborating across teams.
- Team management and stakeholder reporting.
- Mentoring/coaching junior team members on both the technical and functional front.
- Bringing deep industry knowledge and best practices into the team.
- Working closely with stakeholders and other teams like Functional Analysis and Quality Assurance.
- Working with BA and QA to troubleshoot and resolve reported bugs/issues in applications.

Your skills and experience
- Bachelor's Degree from an accredited college or university with a concentration in Science or an IT-related discipline (or equivalent).
- 10-14 years of hands-on experience with Oracle / SQL Server and Unix.
- Experience developing and optimizing ETL pipelines for data ingestion, manipulation and integration.
- Strong proficiency in working with complex queries, stored procedures and functions.
- Solid understanding of relational database concepts.
- Familiarity with data modeling concepts (conceptual, logical, physical).
- Familiarity with HDFS, Kafka, Microservices, Splunk.
- Familiarity with cloud-based platforms (e.g. GCP, AWS).
- Experience with scripting languages (e.g. Bash, Groovy).
- Experience delivering within an agile delivery framework.
- Experience with distributed version control tools (Git, GitHub, Bitbucket).
- Experience with Jenkins or pipeline-based modern CI/CD systems.

Nice to have
- Hands-on experience with StreamSets.
- Exposure to Python and GCP cloud.
Posted 1 month ago
10.0 - 15.0 years
14 - 18 Lacs
Bengaluru
Work from Office
About the Role:
We're looking for an experienced Engineering Manager to lead the development of highly scalable, reliable, and secure platform services and database connectors that power mission-critical data pipelines for thousands of enterprise customers. These pipelines connect to modern data warehouses such as Snowflake, BigQuery, and Databricks, as well as Data Lakes like Apache . This is a rare opportunity to own and build foundational systems, solve complex engineering challenges, and lead a high-impact team delivering best-in-class performance at scale. You will play a central role in shaping our platform vision, driving high accountability, and fostering a culture of technical excellence and high performance while working closely with cross-functional stakeholders across product, program, support, and business teams.

What You'll Do:
- Lead, mentor and inspire a team of software engineers who take pride in ownership and delivering impact.
- Ensure operational excellence through proactive monitoring, automated processes, and a culture of continuous improvement with strong accountability.
- Drive a strong quality-first mindset, embedding it into the development lifecycle, from design to deployment.
- Drive technical leadership through architecture reviews, code guidance, and solving critical platform challenges.
- Build and operate multi-tenant, distributed backend systems at scale.
- Act as a technical leader; you've operated at least at Tech Lead, Staff Engineer, or Principal Engineer level in your career.
- Champion a culture of high accountability, clear ownership, and high visibility across engineering and cross-functional stakeholders.
- Collaborate deeply with Product, Program, Support, and Business functions to drive alignment and execution.
- Embed principles of observability, reliability, security, and auditability into all aspects of the platform.
- Inspire the team to pursue engineering excellence, driving best-in-class implementations and visible results.
- Define and track data-driven KPIs to ensure operational efficiency, performance, and team effectiveness.
- Take end-to-end ownership of product lines, ensuring on-time delivery and customer success.
- Contribute to team growth, hiring, and building an inclusive, learning-focused engineering environment.

What We're Looking For:
- 10+ years of experience in backend or systems software development.
- 2+ years in a formal or informal Engineering Manager, Sr. Engineering Manager, or Tech Lead role in a fast-paced engineering environment.
- Progression through senior IC roles like Tech Lead, Staff, or Principal Engineer.
- Strong experience with distributed systems, cloud-native architectures, and multi-tenant platforms.
- Proven ability to drive cross-team collaboration with product, support, business, and program teams.
- Demonstrated ability to drive accountability, set clear goals, and raise the performance bar for the team.
- Expertise in system design, scalability, performance optimization, and cost control.
- Proven track record of mentoring engineers, guiding architecture, and leading impactful initiatives.
- Clear communicator, adept at both strategy and execution.

Bonus Points:
- Experience with data engineering platforms, ETL systems, or database internals.
- Exposure to product-driven companies, especially in infrastructure, SaaS, or backup/data systems.
- Demonstrated history of fast-tracked growth or high-visibility impact.
- Led or contributed to re-architecture or greenfield systems at scale.
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Bangalore Rural
Work from Office
Role & responsibilities
We are seeking a talented and self-motivated Data Science and Analytics Individual Contributor to join our Vehicle Technology team. This role is perfect for a highly skilled professional who excels in data science, data visualization, Python programming, and the creation of data pipelines for ETL (Extract, Transform, Load) processes. The individual in this position will play a pivotal role in shaping our data-driven decision-making processes and contributing to the success of the company.

1. Data Analysis and Exploration:
- Conduct thorough data analysis to uncover actionable insights and trends.
- Collaborate with cross-functional teams to define data requirements and extract relevant data.

2. Data Visualization:
- Create visually compelling and informative data visualizations using tools like Tableau, Power BI, or Python libraries (e.g., Matplotlib, Seaborn).
- Develop interactive dashboards to present insights in an accessible and understandable manner.

3. Machine Learning and Predictive Modeling:
- Build, validate, and deploy machine learning models to solve business problems and predict outcomes.
- Optimize and fine-tune models for accuracy and efficiency.

4. Python Programming:
- Develop and maintain Python scripts and applications for data analysis and modeling.
- Write clean, efficient, and well-documented code for data manipulation and analytics.

5. ETL Pipeline Creation:
- Design and implement ETL pipelines to extract, transform, and load data from various sources into data warehouses (a minimal sketch follows this listing).
- Ensure data quality, consistency, and accuracy within ETL processes.
- Enforce data governance best practices to maintain data quality, security, and compliance with industry regulations.
- Collaborate with IT and Data Engineering teams to optimize data storage and access.
- Work closely with cross-functional teams to understand business requirements and provide data-driven solutions.
- Effectively communicate findings and insights to both technical and non-technical stakeholders.

Preferred candidate profile
- Proven experience in data science, data analysis, and predictive modeling.
- Proficiency in Python programming, data manipulation, and relevant libraries.
- Strong experience with data visualization tools (we prefer Power BI).
- Knowledge of ETL processes and pipeline creation.
- Familiarity with machine learning techniques and frameworks.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
- Experience with SQL and database management.
- Knowledge of CANBus protocol is a plus.
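For context (this listing repeats for several locations below, so one illustration suffices), a minimal extract-transform-load sketch in Python using pandas and SQLAlchemy; the telemetry CSV, warehouse connection string, and column names are hypothetical assumptions:

```python
# Minimal sketch: a small extract-transform-load job in Python.
# Source file, connection string, and column names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

# Extract: pull raw telemetry from a CSV drop.
raw = pd.read_csv("vehicle_telemetry.csv", parse_dates=["event_time"])

# Transform: basic cleansing plus a derived column.
clean = (
    raw.dropna(subset=["vehicle_id", "event_time"])
       .drop_duplicates(subset=["vehicle_id", "event_time"])
       .assign(speed_kmh=lambda df: df["speed_ms"] * 3.6)
)

# Load: append into a staging table in the warehouse.
engine = create_engine("postgresql+psycopg2://etl:secret@warehouse/analytics")
clean.to_sql("telemetry_clean", engine, schema="staging",
             if_exists="append", index=False)
print(f"loaded {len(clean)} rows")
```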
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Mumbai Suburban
Work from Office
Role & responsibilities
We are seeking a talented and self-motivated Data Science and Analytics Individual Contributor to join our Vehicle Technology team. This role is perfect for a highly skilled professional who excels in data science, data visualization, Python programming, and the creation of data pipelines for ETL (Extract, Transform, Load) processes. The individual in this position will play a pivotal role in shaping our data-driven decision-making processes and contributing to the success of the company.

1. Data Analysis and Exploration:
- Conduct thorough data analysis to uncover actionable insights and trends.
- Collaborate with cross-functional teams to define data requirements and extract relevant data.

2. Data Visualization:
- Create visually compelling and informative data visualizations using tools like Tableau, Power BI, or Python libraries (e.g., Matplotlib, Seaborn).
- Develop interactive dashboards to present insights in an accessible and understandable manner.

3. Machine Learning and Predictive Modeling:
- Build, validate, and deploy machine learning models to solve business problems and predict outcomes.
- Optimize and fine-tune models for accuracy and efficiency.

4. Python Programming:
- Develop and maintain Python scripts and applications for data analysis and modeling.
- Write clean, efficient, and well-documented code for data manipulation and analytics.

5. ETL Pipeline Creation:
- Design and implement ETL pipelines to extract, transform, and load data from various sources into data warehouses.
- Ensure data quality, consistency, and accuracy within ETL processes.
- Enforce data governance best practices to maintain data quality, security, and compliance with industry regulations.
- Collaborate with IT and Data Engineering teams to optimize data storage and access.
- Work closely with cross-functional teams to understand business requirements and provide data-driven solutions.
- Effectively communicate findings and insights to both technical and non-technical stakeholders.

Preferred candidate profile
- Proven experience in data science, data analysis, and predictive modeling.
- Proficiency in Python programming, data manipulation, and relevant libraries.
- Strong experience with data visualization tools (we prefer Power BI).
- Knowledge of ETL processes and pipeline creation.
- Familiarity with machine learning techniques and frameworks.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
- Experience with SQL and database management.
- Knowledge of CANBus protocol is a plus.
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Gurugram
Work from Office
Role & responsibilities
We are seeking a talented and self-motivated Data Science and Analytics Individual Contributor to join our Vehicle Technology team. This role is perfect for a highly skilled professional who excels in data science, data visualization, Python programming, and the creation of data pipelines for ETL (Extract, Transform, Load) processes. The individual in this position will play a pivotal role in shaping our data-driven decision-making processes and contributing to the success of the company.

1. Data Analysis and Exploration:
- Conduct thorough data analysis to uncover actionable insights and trends.
- Collaborate with cross-functional teams to define data requirements and extract relevant data.

2. Data Visualization:
- Create visually compelling and informative data visualizations using tools like Tableau, Power BI, or Python libraries (e.g., Matplotlib, Seaborn).
- Develop interactive dashboards to present insights in an accessible and understandable manner.

3. Machine Learning and Predictive Modeling:
- Build, validate, and deploy machine learning models to solve business problems and predict outcomes.
- Optimize and fine-tune models for accuracy and efficiency.

4. Python Programming:
- Develop and maintain Python scripts and applications for data analysis and modeling.
- Write clean, efficient, and well-documented code for data manipulation and analytics.

5. ETL Pipeline Creation:
- Design and implement ETL pipelines to extract, transform, and load data from various sources into data warehouses.
- Ensure data quality, consistency, and accuracy within ETL processes.
- Enforce data governance best practices to maintain data quality, security, and compliance with industry regulations.
- Collaborate with IT and Data Engineering teams to optimize data storage and access.
- Work closely with cross-functional teams to understand business requirements and provide data-driven solutions.
- Effectively communicate findings and insights to both technical and non-technical stakeholders.

Preferred candidate profile
- Proven experience in data science, data analysis, and predictive modeling.
- Proficiency in Python programming, data manipulation, and relevant libraries.
- Strong experience with data visualization tools (we prefer Power BI).
- Knowledge of ETL processes and pipeline creation.
- Familiarity with machine learning techniques and frameworks.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
- Experience with SQL and database management.
- Knowledge of CANBus protocol is a plus.
Posted 1 month ago
2.0 - 5.0 years
4 - 7 Lacs
Ahmedabad
Work from Office
About the Company
e.l.f. Beauty, Inc. stands with every eye, lip, face and paw. Our deep commitment to clean, cruelty free beauty at an incredible value has fueled the success of our flagship brand e.l.f. Cosmetics since 2004 and driven our portfolio expansion. Today, our multi-brand portfolio includes e.l.f. Cosmetics, e.l.f. SKIN, pioneering clean beauty brand Well People, Keys Soulcare, a groundbreaking lifestyle beauty brand created with Alicia Keys, and Naturium, high-performance, biocompatible, clinically-effective and accessible skincare. In our fiscal year 24, we had net sales of $1 billion and our business performance has been nothing short of extraordinary, with 24 consecutive quarters of net sales growth. We are the #2 mass cosmetics brand in the US and are the fastest growing mass cosmetics brand among the top 5. Our total compensation philosophy offers every full-time new hire competitive pay and benefits, bonus eligibility (200% of target over the last four fiscal years), equity, flexible time off, year-round half-day Fridays, and a hybrid 3-day in office, 2-day at home work environment. We believe the combination of our unique culture, total compensation, workplace flexibility and care for the team is unmatched across not just beauty but any industry. Visit our Career Page to learn more about our team: https://www.elfbeauty.com/work-with-us

Position Summary
We are seeking a highly skilled Integration Engineer - SAP CPI DS to lead the design, development, and execution of complex data integration and ETL solutions using SAP Cloud Platform Integration (CPI) and SAP Data Services (DS). This is a senior-level, hands-on technical role that requires deep expertise in building robust, scalable integration frameworks across SAP S/4HANA and non-SAP systems. The ideal candidate will have experience in SAP integration, with a strong command of iFlow development, API integrations, and ETL design using SAP DS. You will be responsible for independently managing end-to-end integration projects, ensuring performance, data integrity, and security. A proactive approach to troubleshooting, performance tuning, and system optimization is critical to success in this role. This role offers the opportunity to work on mission-critical integration solutions in a cloud-first, fast-paced environment while collaborating with cross-functional teams and driving innovation in data and system connectivity.

Responsibilities
- Lead the design and development of SAP CPI and SAP Data Services integration solutions with minimal guidance, ensuring alignment with business requirements and technical specifications.
- Develop, configure, and deploy iFlows within SAP Cloud Platform Integration (CPI) to support business processes and data transformations across SAP and third-party systems.
- Design and implement ETL (Extract, Transform, Load) processes using SAP Data Services for data migration, integration, and transformation across heterogeneous environments.
- Take full ownership of integration solutions, ensuring that data flows are optimized, secure, and reliable, with attention to scalability and performance.
- Troubleshoot, diagnose, and resolve complex issues independently, ensuring continuous uptime and performance optimization of integration processes.
- Ensure best practices are followed for integration design, coding standards, and error handling across SAP CPI and SAP Data Services.
- Collaborate with cross-functional teams (e.g., business analysts, functional consultants, and system architects) to define integration requirements, ensuring seamless and efficient business process automation.
- Independently manage and execute complex integrations for SAP S/4HANA and non-SAP systems with minimal oversight.
- Monitor and optimize integration solutions for performance, scalability, and reliability, ensuring timely resolution of integration issues.
- Continuously explore and recommend new tools and techniques for enhancing SAP CPI and DS integration solutions and improving overall system performance.
- Create comprehensive technical documentation for integration processes, including architecture diagrams, detailed specifications, troubleshooting guides, and knowledge transfer materials.
- Ensure compliance with organizational security and governance standards, particularly regarding data integrity, privacy, and security in integration scenarios.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 12+ years of experience in SAP integration development, including extensive hands-on experience with SAP Cloud Platform Integration (CPI) and SAP Data Services (DS).
- Proven track record of independently managing end-to-end integration projects with SAP CPI and SAP Data Services.
- Strong expertise in developing iFlows, API integration, and handling various integration protocols (SOAP, REST, OData, IDocs, XML, JSON).
- Deep knowledge of SAP Data Services, including ETL processes, job design, and performance optimization.
- Experience with SAP S/4HANA integrations.
- Ability to work independently and take ownership of integration solutions with little to no supervision.
- Expertise in troubleshooting and resolving complex integration issues, identifying bottlenecks, and ensuring smooth operation.
- Strong understanding of cloud architecture and integration patterns, particularly within the SAP Cloud Platform ecosystem.
- Proven ability to deliver integration solutions in a timely and efficient manner, while maintaining high quality and reliability.
- Excellent problem-solving skills with the ability to navigate complex technical challenges autonomously.
- Strong communication skills, with the ability to interact effectively with business stakeholders, technical teams, and management.

Preferred Skills:
- Certification in SAP Cloud Platform Integration (CPI) or SAP Data Services.
- Hands-on experience with SAP Integration Suite (API Management, Open Connectors, etc.).
- Familiarity with SAP Business Technology Platform (BTP) and Cloud Foundry.
- Experience with API management and containerized deployments (e.g., Kubernetes, Docker).
- Knowledge of advanced integration tools and frameworks for large-scale enterprise environments.

Minimum experience: 12 years. Maximum experience: 18 years.

This job description is intended to describe the general nature and level of work being performed in this position. It also reflects the general details considered necessary to describe the principal functions of the job identified, and shall not be considered a detailed description of all the work required inherent in the job. It is not an exhaustive list of responsibilities, and it is subject to changes and exceptions at the supervisors' discretion. e.l.f. Beauty respects your privacy. Please see our Job Applicant Privacy Notice (www.elfbeauty.com/us-job-applicant-privacy-notice) for how your personal information is used and shared.
Posted 1 month ago
3.0 - 8.0 years
3 - 6 Lacs
Bengaluru
Work from Office
We are looking for a skilled SQL PySpark professional with 3 to 8 years of experience to join our team. The ideal candidate will have expertise in developing data pipelines and transforming data using Databricks, Synapse notebooks, and Azure Data Factory.

Roles and Responsibility
- Collaborate with technical architects and cloud solutions teams to design data pipelines, marts, and reporting solutions.
- Code, test, and optimize Databricks jobs for efficient data processing and report generation.
- Set up scalable data pipelines integrating with various data sources and cloud platforms using Databricks.
- Ensure best practices are followed in terms of code quality, data security, and scalability.
- Participate in code and design reviews to maintain high development standards.
- Optimize data querying layers to enhance performance and support analytical requirements.
- Leverage Databricks to set up scalable data pipelines that integrate with a variety of data sources and cloud platforms.
- Collaborate with data scientists and analysts to support machine learning workflows and analytic needs.
- Stay updated with the latest developments in Databricks and associated technologies to drive innovation.

Job Requirements
- Proficiency in PySpark or Scala and SQL for data processing tasks.
- Hands-on experience with Azure Databricks, Delta Lake, Delta Live Tables, Auto Loader, and Databricks SQL (a minimal Auto Loader sketch follows this listing).
- Expertise with Azure Data Lake Storage (ADLS) Gen2 for optimized data storage and retrieval.
- Strong knowledge of data modeling, ETL processes, and data warehousing concepts.
- Experience with Power BI for dashboarding and reporting is a plus.
- Familiarity with Azure Synapse for analytics and integration tasks is desirable.
- Knowledge of Spark Streaming for real-time data stream processing is an advantage.
- MLOps knowledge for integrating machine learning into production workflows is beneficial.
- Familiarity with Azure Resource Manager (ARM) templates for infrastructure as code (IaC) practices is preferred.
- Demonstrated expertise of 4-5 years in developing data ingestion and transformation pipelines using Databricks, Synapse notebooks, and Azure Data Factory.
- Solid understanding and hands-on experience with Delta tables, Delta Lake, and Azure Data Lake Storage Gen2.
- Experience in efficiently using Auto Loader and Delta Live Tables for seamless data ingestion and transformation.
- Proficiency in building and optimizing query layers using Databricks SQL.
- Demonstrated experience integrating Databricks with Azure Synapse, ADLS Gen2, and Power BI for end-to-end analytics solutions.
- Prior experience in developing, optimizing, and deploying Power BI reports.
- Familiarity with modern CI/CD practices, especially in the context of Databricks and cloud-native solutions.
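For context, the Auto Loader plus Delta Lake items above look roughly like the following on a Databricks runtime, where the cloudFiles streaming source is available. The storage paths, schema location, and table name are hypothetical assumptions:

```python
# Minimal sketch: incremental ingestion with Databricks Auto Loader into Delta.
# Intended for a Databricks runtime; all paths and names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

stream = (
    spark.readStream.format("cloudFiles")             # Auto Loader source
    .option("cloudFiles.format", "json")              # raw files are JSON
    .option("cloudFiles.schemaLocation",              # where inferred schema is tracked
            "abfss://lake@account.dfs.core.windows.net/_schemas/orders")
    .load("abfss://lake@account.dfs.core.windows.net/raw/orders")
)

(
    stream.writeStream.format("delta")
    .option("checkpointLocation",
            "abfss://lake@account.dfs.core.windows.net/_checkpoints/orders")
    .trigger(availableNow=True)                       # process backlog, then stop
    .toTable("bronze.orders")                         # target Delta table
)
```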
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
Noida
Work from Office
We are looking for a skilled Senior Azure Data Engineer with 5 to 10 years of experience to design and implement scalable data pipelines using Azure technologies, driving data transformation, analytics, and machine learning. The ideal candidate will have a strong background in data engineering and proficiency in Python, PySpark, and Spark Pools.

Roles and Responsibility
- Design and implement scalable Databricks data pipelines using PySpark.
- Transform raw data into actionable insights through data analysis and machine learning.
- Build, deploy, and maintain machine learning models using MLlib or TensorFlow (a minimal MLlib sketch follows this listing).
- Optimize cloud data integration from Azure Blob Storage, Data Lake, and SQL/NoSQL sources.
- Execute large-scale data processing using Spark Pools and fine-tune configurations for efficiency.
- Collaborate with cross-functional teams to identify business requirements and develop solutions.

Job Requirements
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Minimum 5 years of experience in data engineering, with at least 3 years specializing in Azure Databricks, PySpark, and Spark Pools.
- Proficiency in Python, PySpark, Pandas, NumPy, SciPy, Spark SQL, DataFrames, RDDs, Delta Lake, Databricks Notebooks, and MLflow.
- Hands-on experience with Azure Data Lake, Blob Storage, Synapse Analytics, and other relevant technologies.
- Strong understanding of data modeling, data warehousing, and ETL processes.
- Experience with agile development methodologies and version control systems.
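For context on the MLlib responsibility above, a minimal Spark ML training sketch; the curated feature table and the column names (f1, f2, f3, label) are hypothetical assumptions:

```python
# Minimal sketch: train a Spark ML model on a curated feature table.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

spark = SparkSession.builder.getOrCreate()
df = spark.table("gold.features")          # assumed curated feature table

train, test = df.randomSplit([0.8, 0.2], seed=42)

# Assemble raw columns into a feature vector, then fit a simple regressor.
pipeline = Pipeline(stages=[
    VectorAssembler(inputCols=["f1", "f2", "f3"], outputCol="features"),
    LinearRegression(featuresCol="features", labelCol="label"),
])

model = pipeline.fit(train)
preds = model.transform(test)
preds.select("label", "prediction").show(5)
```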
Posted 1 month ago
7.0 - 8.0 years
3 - 6 Lacs
Noida
Work from Office
We are looking for a skilled SSAS Developer with strong experience in building OLAP and Tabular models using SQL Server Analysis Services (SSAS). The ideal candidate should have advanced knowledge of ETL processes, particularly with tools such as SSIS, Informatica, or Azure Data Factory. This position requires 7 to 8 years of experience.

Roles and Responsibility
- Design, develop, and maintain SSAS OLAP cubes and Tabular models.
- Collaborate with business analysts and data architects to gather requirements.
- Develop complex DAX and MDX queries for reporting and analysis purposes.
- Optimize SSAS models for performance and scalability.
- Create and maintain ETL workflows and pipelines for data extraction, transformation, and loading from various sources.
- Integrate data from relational and non-relational data sources.
- Implement best practices in data modeling, version control, and deployment automation.
- Troubleshoot and resolve issues related to data quality, performance, and availability.
- Work with BI tools such as Power BI, Excel, or Tableau for data visualization and dashboarding.

Job Requirements
- Strong hands-on experience with SSAS, both Tabular and Multidimensional models.
- Expertise in writing DAX and MDX queries.
- Advanced ETL knowledge using tools like SSIS, Informatica, Azure Data Factory, or similar.
- Proficient in SQL and performance tuning.
- Experience with dimensional modeling and star schema design.
- Familiarity with source control tools like Git and CI/CD pipelines for BI projects.
- Knowledge of data warehousing concepts and data governance.
- Strong analytical and problem-solving skills.
- Bachelor's degree in Computer Science, Information Systems, or a related field.
Posted 1 month ago
10.0 - 15.0 years
3 - 7 Lacs
Bengaluru
Work from Office
We are looking for a skilled MDM Engineer with extensive experience in Informatica MDM to join our team. The ideal candidate will be responsible for designing, developing, and maintaining our Master Data Management (MDM) solutions to ensure data accuracy, consistency, and reliability across the organization. This role requires 10-15 years of experience.

Roles and Responsibility
- Design and implement MDM solutions using Informatica MDM, ensuring alignment with business requirements and data governance standards.
- Develop and manage ETL processes to integrate data from various sources into the MDM system.
- Implement data quality rules and processes to ensure the accuracy and consistency of master data.
- Configure Informatica MDM Hub, including data modeling, data mappings, match and merge rules, and user exits.
- Monitor and optimize the performance of MDM solutions, ensuring high availability and reliability.
- Collaborate with data stewards, business analysts, and other stakeholders to gather requirements and ensure the MDM solution meets their needs.
- Create and maintain comprehensive documentation for MDM processes, configurations, and best practices.
- Troubleshoot issues related to MDM processes and systems.

Job Requirements
- Minimum 10 years of hands-on experience in MDM design, development, and support using Informatica MDM.
- Proficiency in Informatica MDM ETL processes and data integration technologies.
- Strong understanding of data governance, data quality, and master data management principles.
- Excellent problem-solving and analytical skills with the ability to troubleshoot complex data issues.
- Strong communication and interpersonal skills with the ability to collaborate effectively with cross-functional teams.
- Experience in the Employment Firms/Recruitment Services Firms industry is preferred.
Posted 1 month ago
6.0 - 8.0 years
4 - 8 Lacs
Noida
Work from Office
We are looking for a skilled Senior Data Warehouse Analyst to join our team at Apptad Technologies Pvt Ltd. The ideal candidate will have 6 to 8 years of experience in data analysis and management, with expertise in working with large datasets.

Roles and Responsibility
- Design, develop, and implement data warehouse solutions to meet business requirements.
- Analyze complex data sets to identify trends, patterns, and insights.
- Develop and maintain databases, data models, and ETL processes.
- Collaborate with cross-functional teams to integrate data from various sources.
- Ensure data quality, integrity, and security.
- Optimize database performance and troubleshoot issues.

Job Requirements
- Strong knowledge of data warehousing concepts, including star schema design.
- Experience with relational databases such as Oracle or SQL Server.
- Proficiency in programming languages like Python or R.
- Excellent analytical and problem-solving skills.
- Ability to work independently and collaboratively as part of a team.
- Strong communication and interpersonal skills.

Ref: 6566417
Posted 1 month ago
4.0 - 8.0 years
4 - 7 Lacs
Noida
Work from Office
We are looking for a skilled SSAS Data Engineer with 4 to 8 years of experience to join our team. The ideal candidate will have a strong background in Computer Science, Information Technology, or a related field.

Roles and Responsibility
- Develop, deploy, and manage OLAP cubes and tabular models.
- Collaborate with data teams to design and implement effective data solutions.
- Troubleshoot and resolve issues related to SSAS and data models.
- Monitor system performance and optimize queries for efficiency.
- Implement data security measures and backup procedures.
- Stay updated with the latest SSAS and BI technologies and best practices.

Job Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Strong understanding of data warehousing, ETL processes, OLAP concepts, and data modeling concepts.
- Proficiency in SQL, MDX, and DAX query languages.
- Experience with data visualization tools like Power BI.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.
- Experience working in an Agile environment.
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Noida
Work from Office
Company: Apptad Technologies Pvt Ltd.
Industry: Employment Firms/Recruitment Services Firms
Experience: 5 to 10 years

Job Title: SQL + ADF
Job Location: Gurgaon
Job Type: Full Time

JD: Strong experience in SQL development, along with experience in cloud (AWS) and good experience in ADF.

Job Summary:
We are looking for a skilled SQL + Azure Data Factory (ADF) Developer to join our data engineering team. The ideal candidate will have strong experience in writing complex SQL queries, developing ETL pipelines using Azure Data Factory, and integrating data from multiple sources into cloud-based data solutions. This role will support data warehousing, analytics, and business intelligence initiatives.

Key Responsibilities:
- Design, develop, and maintain data integration pipelines using Azure Data Factory (ADF).
- Write optimized and complex SQL queries, stored procedures, and functions for data transformation and reporting.
- Extract data from various structured and unstructured sources and load into Azure-based data platforms (e.g., Azure SQL Database, Azure Data Lake).
- Schedule and monitor ADF pipelines, ensuring data quality, accuracy, and availability (a minimal monitoring sketch follows this listing).
- Collaborate with data analysts, data architects, and business stakeholders to gather requirements and deliver solutions.
- Troubleshoot data issues and implement corrective actions to resolve pipeline or data quality problems.
- Implement and maintain data lineage, metadata, and documentation for pipelines.
- Participate in code reviews, performance tuning, and optimization of ETL processes.
- Ensure compliance with data governance, privacy, and security standards.

Required Skills:
- Hands-on experience with T-SQL / SQL Server.
- Experience working with Azure Data Factory (ADF) and Azure SQL.
- Strong understanding of ETL processes, data warehousing concepts, and cloud data architecture.
- Experience working with Azure services such as Azure Data Lake, Blob Storage, and Azure Synapse Analytics (preferred).
- Familiarity with Git/DevOps CI/CD pipelines for ADF deployments is a plus.
- Excellent problem-solving, analytical, and communication skills.

Ref: 6566294
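For context, "schedule and monitor ADF pipelines" is often automated with Azure's Python SDKs. A minimal sketch using azure-identity and azure-mgmt-datafactory; the subscription, resource group, factory, pipeline, and parameter names are hypothetical assumptions:

```python
# Minimal sketch: trigger an ADF pipeline run and poll its status.
# Subscription, resource group, factory, and pipeline names are hypothetical.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUB = "00000000-0000-0000-0000-000000000000"   # assumed subscription id
RG, FACTORY = "rg-data", "adf-prod"            # assumed resource group / factory

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUB)

# Kick off the pipeline with a runtime parameter.
run = adf.pipelines.create_run(
    RG, FACTORY, "pl_load_sales", parameters={"load_date": "2024-01-31"}
)

# Poll until the run reaches a terminal state.
while True:
    status = adf.pipeline_runs.get(RG, FACTORY, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"run {run.run_id} finished with status: {status}")
```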
Posted 1 month ago