244 Data Transformation Jobs - Page 3

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 13.0 years

14 - 19 Lacs

Bengaluru

Work from Office

Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interfaces integrate seamlessly with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.

Roles & Responsibilities:
1. Data Engineer to lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Should have deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Should analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Should translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines (see the sketch below).
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years of experience in SAS or Python data engineering.
Qualification: 15 years full-time education
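To illustrate the kind of SAS-to-Python translation this role involves, here is a minimal sketch of how a simple SAS data-preparation step (a filter, a derived column, and a grouped summary) might be re-engineered in pandas. The file name and columns are hypothetical examples, not taken from the posting.

```python
# Minimal sketch: re-engineering a simple SAS data-preparation job in pandas.
# File names and columns (claims.csv, region, amount) are hypothetical.
import pandas as pd

# SAS equivalent:  data work.clean; set raw.claims; where amount > 0;
#                  net_amount = amount * 0.9; run;
claims = pd.read_csv("claims.csv")                # import / libname step
clean = claims[claims["amount"] > 0].copy()       # WHERE clause
clean["net_amount"] = clean["amount"] * 0.9       # derived column

# SAS equivalent:  proc means data=work.clean sum mean;
#                  class region; var net_amount; run;
summary = clean.groupby("region", as_index=False).agg(
    total=("net_amount", "sum"),
    average=("net_amount", "mean"),
)
summary.to_csv("claims_summary.csv", index=False)  # output dataset / feed
```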

Posted 3 days ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Lucknow

Remote

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce comprehensive, clear technical specification documents for the implementation of data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified, holistic customer view across diverse data sources (see the identity-stitching sketch below).
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica, Unifi).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
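As a rough illustration of the customer ID mapping responsibility above, the sketch below stitches records from two hypothetical sources (CRM and web analytics) into a single customer view keyed on email. The source names, columns, and matching rule are assumptions for illustration, not part of the listing or of any Adobe product API.

```python
# Minimal identity-stitching sketch: unify customer records from two
# hypothetical sources (CRM and web analytics) on a shared email key.
import pandas as pd

crm = pd.DataFrame({
    "crm_id": [101, 102],
    "email": ["a@example.com", "b@example.com"],
    "lifetime_value": [1200.0, 300.0],
})
web = pd.DataFrame({
    "cookie_id": ["ck-1", "ck-2", "ck-3"],
    "email": ["a@example.com", "a@example.com", "c@example.com"],
    "page_views": [14, 3, 7],
})

# Normalize the join key, then collapse web activity to one row per email.
for df in (crm, web):
    df["email"] = df["email"].str.strip().str.lower()

web_agg = web.groupby("email", as_index=False).agg(
    cookie_ids=("cookie_id", list),
    page_views=("page_views", "sum"),
)

# Outer join keeps customers seen in only one source.
unified = crm.merge(web_agg, on="email", how="outer")
print(unified)
```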

Posted 3 days ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Ludhiana

Remote

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce comprehensive, clear technical specification documents for the implementation of data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified, holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica, Unifi).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.

Posted 3 days ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Hyderabad

Remote

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a profound understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Foster seamless collaboration with both onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce comprehensive, clear technical specification documents for the implementation of data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified, holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation processes using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large and complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Profound expertise with SQL/NoSQL databases and a strong understanding of data warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica, Unifi).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics).
- Exceptional communication, time management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.

Posted 3 days ago

Apply

5.0 - 8.0 years

20 - 25 Lacs

Mohali, Pune

Work from Office

Experience with Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, Azure Storage, SQL, Git, CI/CD, Azure DevOps, RESTful APIs, data APIs, event-driven architecture, and data governance, lineage, security, and privacy best practices. Immediate joiners only.

Required candidate profile: Data warehousing, data lake, Azure cloud services, Azure DevOps; ETL with SSIS, ADF, Synapse, SQL Server, Azure SQL; data transformation, modelling, and integration. Microsoft Certified: Azure Data Engineer.

Posted 3 days ago

Apply

6.0 - 11.0 years

11 - 18 Lacs

Noida, Greater Noida, Delhi / NCR

Work from Office

Responsibilities: Design, develop, and maintain data pipelines and ETL processes using Domo. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Implement data transformation and data warehousing solutions to support business intelligence and analytics. Optimize and troubleshoot data workflows to ensure efficiency and reliability. Develop and maintain documentation for data processes and systems. Ensure data quality and integrity through rigorous testing and validation. Monitor and manage data infrastructure to ensure optimal performance. Stay updated with industry trends and best practices in data engineering and Domo.

Mandatory skills: Domo, data transformation layer (SQL, Python), data warehouse layer (SQL, Python).

Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a Data Engineer, with a strong focus on data transformation and data warehousing. Proficiency in Domo and its various tools and functionalities. Experience with SQL, Python, and other relevant programming languages. Strong understanding of ETL processes and data pipeline architecture. Excellent problem-solving skills and attention to detail. Ability to work independently and as part of a team. Strong communication skills to collaborate effectively with stakeholders.

Preferred Qualifications: Knowledge of data visualization and reporting tools. Familiarity with Agile methodologies and project management tools. Data transformation layer (SQL, Python). Data warehouse layer (SQL, Python).

Share your resume with Aarushi.Shukla@coforge.com.

Posted 3 days ago

Apply

5.0 - 7.0 years

12 - 15 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are looking for a skilled Data Engineer with expertise in SSIS, Tableau, SQL, and ETL processes. The ideal candidate should have experience in data modeling, data pipelines, and Agile methodologies. Responsibilities include designing and maintaining data pipelines, implementing ETL processes using SSIS, optimizing data models for reporting, and developing advanced dashboards in Tableau. The role requires proficiency in SQL for complex data transformations, troubleshooting data workflows, and ensuring data integrity and compliance. Strong problem-solving skills, Agile collaboration experience, and the ability to work independently in a remote setup are essential. Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad.

Posted 3 days ago

Apply

6.0 - 10.0 years

13 - 14 Lacs

Jaipur, Delhi / NCR, Bengaluru

Hybrid

Location: Delhi / NCR / Jaipur / Bangalore / Hyderabad. Work Mode: Hybrid (2 days WFO). Working Time: 1:00 PM to 10:00 PM IST.

iSource Services is hiring for one of its US-based clients for the position of Data Integration Specialist.

About the Role: We are seeking a skilled Data Integration Specialist to manage data ingestion, unification, and activation across Salesforce Data Cloud and other platforms. You will design and implement robust integration workflows, leveraging APIs and ETL tools to enable seamless data flow and support a unified customer experience.

Key Responsibilities:
- Design and implement data ingestion workflows into Salesforce Data Cloud (see the sketch below for the general pattern)
- Unify data from multiple sources to create a 360-degree customer view
- Develop integrations using APIs, ETL tools, and middleware (e.g., MuleSoft)
- Collaborate with cross-functional teams to gather and fulfil data integration requirements
- Monitor integration performance and ensure real-time data availability
- Ensure compliance with data privacy and governance standards
- Enable data activation across Salesforce Marketing, Sales, and Service Clouds

Must-Have Skills:
- Experience with cloud data platforms (e.g., Snowflake, Redshift, BigQuery)
- Salesforce certifications (e.g., Data Cloud Consultant, Integration Architect)
- Hands-on experience with Salesforce Data Cloud (CDP)
- Proficiency in ETL, data transformation, and data mapping
- Strong knowledge of REST/SOAP APIs and integration tools
- Solid understanding of data modeling and customer data platforms
- Familiarity with data privacy regulations (e.g., GDPR, CCPA)
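For orientation only, here is a minimal sketch of the general pattern an API-based ingestion workflow tends to follow: obtain an OAuth token, then post JSON records to an ingestion endpoint with basic error handling. The URLs, credentials, and field names below are placeholders, not the actual Salesforce Data Cloud Ingestion API contract; the platform documentation defines the real endpoints and payload schema.

```python
# Generic REST ingestion sketch. Endpoints, credentials, and fields are
# hypothetical placeholders, not a real Salesforce Data Cloud contract.
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"        # placeholder
INGEST_URL = "https://ingest.example.com/api/v1/records"   # placeholder


def get_token(client_id: str, client_secret: str) -> str:
    """Exchange client credentials for a bearer token (OAuth 2.0 pattern)."""
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }, timeout=30)
    resp.raise_for_status()
    return resp.json()["access_token"]


def ingest(records: list[dict], token: str) -> None:
    """Post a small batch of JSON records and fail loudly on HTTP errors."""
    resp = requests.post(
        INGEST_URL,
        json={"data": records},
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    token = get_token("my-client-id", "my-client-secret")  # placeholders
    ingest([{"email": "a@example.com", "source": "crm"}], token)
```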

Posted 6 days ago

Apply

4.0 - 7.0 years

10 - 15 Lacs

Mumbai

Work from Office

We are seeking a skilled Business Intelligence Manager to build and maintain analytics and reporting solutions that convert data into actionable insights. The BI Manager role is pivotal, involving the conversion of provided data into meaningful insights through user-friendly dashboards and reports. An ideal BI Manager possesses proficiency in Business Intelligence tools and technology, oversees the creation and administration of BI tools with comprehensive knowledge of the BI system, and manages stakeholder expectations while ensuring the team delivers to those expectations. This role demands a grasp of business concepts, strong problem-solving abilities, and prior experience in data and business analysis. Analytical prowess and effective communication skills are highly valued attributes for this position.

BI Responsibilities. The day-to-day responsibilities include but are not limited to:
- Develop actionable insights that can be used to make business decisions by building reports and dashboards.
- Understand business stakeholders' objectives, the metrics that are most important to them, and how they measure performance.
- Translate data into highly leveraged and effective visualizations.
- Share knowledge and skills with your teammates to grow analytics impact.
- Define an overall design strategy for all analytics that improves the user experience.
- Influence and educate stakeholders on the appropriate data, tools, and visualizations.
- Review all analytics for quality before final outputs are delivered to stakeholders.
- Be responsible for version control and creating technical documentation.
- Partner with IT to provide different ways of improving on existing processes.
- Contribute to delivery through the development and implementation of best-in-class data visualization and insights.
- Build strong relationships with business stakeholders to ensure understanding of business needs.
- Improve performance of all visualizations through optimized code; experience with custom and third-party visuals.
- Design, implement, and maintain scalable data pipelines and architectures.

Essential Traits. Qualifications/Skills:
- Graduate or equivalent level qualification, preferably in a related discipline; Master's degree preferred.
- 6-8 years of analytical experience in data and analytics, building reports and dashboards.
- 6-8 years of experience with visualization tools such as Power BI.
- Hands-on experience in DAX, Power Query, and SQL, and the ability to build data models that can generate meaningful insights.
- Experience working with and creating analytics that enable stakeholders to make data-driven decisions.
- 4+ years of experience with requirements gathering.
- Expert-level proficiency in data transformation/configuration and connecting data to Power BI dashboards.
- Exposure to implementing row-level security and bookmarks.

Competencies:
- Highly motivated and influential team player with a proven track record of driving results.
- Strong communicator and collaborator with exceptional interpersonal skills.
- Analytical problem-solver with a passion for innovation and continuous improvement.
- Teachable, embraces best practices, and leverages feedback as a means of continuous improvement.
- Consistently high achiever marked by perseverance, humility, and a positive outlook in the face of challenges.
- Strong problem-solving, quantitative, and analytical abilities.
- Solid written and verbal communication skills and the knowledge to build strong relationships.

Preferred: Microsoft or other BI certification.
About Kroll. In a world of disruption and increasingly complex business challenges, our professionals bring truth into focus with the Kroll Lens. Our sharp analytical skills, paired with the latest technology, allow us to give our clients clarity, not just answers, in all areas of business. We value the diverse backgrounds and perspectives that enable us to think globally. As part of One team, One Kroll, you'll contribute to a supportive and collaborative work environment that empowers you to excel. Kroll is the premier global valuation and corporate finance advisor with expertise in complex valuation, disputes and investigations, M&A, restructuring, and compliance and regulatory consulting. Our professionals balance analytical skills, deep market insight, and independence to help our clients make sound decisions. As an organization, we think globally and encourage our people to do the same. Kroll is committed to equal opportunity and diversity, and recruits people based on merit. In order to be considered for a position, you must formally apply via careers.kroll.com.

Posted 6 days ago

Apply

3.0 - 4.0 years

11 - 14 Lacs

Mumbai

Work from Office

AEP Data Architect.

Requirements & Qualifications:
- 10+ years of strong experience with data transformation and ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, call center, marketing, offline, point of sale).
- 5+ years of data modeling experience (relational, dimensional, columnar, big data).
- 5+ years of complex SQL or NoSQL experience.
- Experience in advanced data warehouse concepts.
- Experience with industry ETL tools (e.g., Informatica, Unifi).
- Experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal and written communication skills to interface with the sales team and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer focused.
- Degree in Computer Science, Information Systems, Data Science, or a related field.

Posted 6 days ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Chennai, Bengaluru

Work from Office

ODI Developer - Chennai/Bangalore. WFO only. 5-8 years of experience as an ETL Developer, with hands-on expertise in Oracle Data Integrator (ODI). ODI expertise is a must; profiles with Informatica or other ETL tools but without ODI will be rejected. Proficiency in Oracle Database and MySQL, with strong skills in SQL and PL/SQL development. Experience in data integration, transformation, and loading from heterogeneous data sources. Strong understanding of data modeling concepts and ETL best practices. Familiarity with performance tuning and troubleshooting of ETL processes. Knowledge of scripting languages (e.g., Python, JavaScript) for automation is a plus. Excellent analytical and problem-solving skills. Strong communication skills to work effectively with cross-functional teams. Please call Varsha at 7200847046 for more information. Regards, Varsha (7200847046).

Posted 1 week ago

Apply

5.0 - 8.0 years

18 - 25 Lacs

Pune

Work from Office

We are seeking an experienced Modern Microservice Developer to join our team and contribute to the design, development, and optimization of scalable microservices and data processing workflows. The ideal candidate will have expertise in Python, containerization, and orchestration tools, along with strong skills in SQL and data integration. Key Responsibilities: Develop and optimize data processing workflows and large-scale data transformations using Python. Write and maintain complex SQL queries in Snowflake to support efficient data extraction, manipulation, and aggregation. Integrate diverse data sources and perform validation testing to ensure data accuracy and integrity. Design and deploy containerized applications using Docker, ensuring scalability and reliability. Build and maintain RESTful APIs to support microservices architecture. Implement CI/CD pipelines and manage orchestration tools such as Kubernetes or ECS for automated deployments. Monitor and log application performance, ensuring high availability and quick issue resolution. Requirements Mandatory: Bachelor's degree in Computer Science, Engineering, or a related field. 5-8 years of experience in Python development, with a focus on data processing and automation. Proficiency in SQL, with hands-on experience in Snowflake. Strong experience with Docker and containerized application development. Solid understanding of RESTful APIs and microservices architecture. Familiarity with CI/CD pipelines and orchestration tools like Kubernetes or ECS. Knowledge of logging and monitoring tools to ensure system health and performance. Preferred Skills: Experience with cloud platforms (AWS, Azure, or GCP) is a plus.
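As a small, hedged illustration of the microservice pattern described above, the sketch below exposes a health probe and one REST endpoint that performs a trivial data aggregation. The service name, route, and payload shape are invented for the example; in the role described, the transformation logic would more likely sit behind Snowflake queries and be packaged in a Docker image.

```python
# Minimal REST microservice sketch using FastAPI. Routes and payload shapes
# are hypothetical; real workloads would typically query Snowflake rather
# than aggregate an in-memory payload.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="transform-service")


class Order(BaseModel):
    order_id: str
    amount: float
    currency: str = "USD"


class OrderSummary(BaseModel):
    count: int
    total_amount: float


@app.get("/health")
def health() -> dict:
    """Liveness probe for container orchestrators (Kubernetes/ECS)."""
    return {"status": "ok"}


@app.post("/orders/summary", response_model=OrderSummary)
def summarize(orders: list[Order]) -> OrderSummary:
    """Aggregate a batch of orders - a stand-in for a real transformation step."""
    return OrderSummary(
        count=len(orders),
        total_amount=sum(o.amount for o in orders),
    )

# Run locally (assumes uvicorn is installed and this file is service.py):
#   uvicorn service:app --reload
```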

Posted 1 week ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate, We are hiring a Cloud Data Scientist to build and scale data science solutions in cloud-native environments. Ideal for candidates who specialize in analytics and machine learning using cloud ecosystems. Key Responsibilities: Design predictive and prescriptive models using cloud ML tools Use BigQuery, SageMaker, or Azure ML Studio for scalable experimentation Collaborate on data sourcing, transformation, and governance in the cloud Visualize insights and present findings to stakeholders Required Skills & Qualifications: Strong Python/R skills and experience with cloud ML stacks (AWS, GCP, or Azure) Familiarity with cloud-native data warehousing and storage (Redshift, BigQuery, Data Lake) Hands-on with model deployment, CI/CD, and A/B testing in the cloud Bonus: Background in NLP, time series, or geospatial analysis Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Reddy Delivery Manager Integra Technologies

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 9 Lacs

Mumbai Suburban

Work from Office

Job Title: Data Processing (DP) Executive
Location: MIDC, Andheri East, Mumbai
Work Mode: Work From Office (WFO)
Work Days: Monday to Friday
Work Hours: 9:00 PM to 6:00 AM IST (Night Shift)

Job Summary: We are seeking a highly skilled and detail-oriented Data Processing (DP) Executive to join our team. The ideal candidate will have a solid background in data analysis and processing, strong proficiency in industry-standard tools, and the ability to manage large data sets efficiently. This role is critical in ensuring data integrity and delivering accurate insights for business decision-making.

Key Responsibilities: Manage and process data using tools like SPSS and Q programming. Perform data cleaning, transformation, and statistical analysis. Collaborate with research and analytics teams to interpret and format data for reporting. Create reports and dashboards; experience with Tableau or similar visualization tools is an advantage. Utilize SQL for data querying and validation. Ensure accuracy and consistency of data deliverables across projects. Handle multiple projects simultaneously with a keen eye for detail and timelines.

Technical Skills: Proficiency in SPSS and Q programming. Strong understanding of data processing techniques and statistical methods. Familiarity with Tableau or other data visualization tools (preferred). Basic working knowledge of SQL.

Educational Qualifications: Bachelor's degree in Statistics, Computer Science, Data Science, or a related field.

Experience: Minimum 3 years of experience in data processing or a similar analytical role.

Soft Skills: Excellent analytical and problem-solving abilities. Strong attention to detail and accuracy. Good communication skills and the ability to work in a team-oriented environment. Self-motivated with the ability to work independently and manage multiple tasks effectively.

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Pune

Work from Office

What You'll Do: We are seeking a highly skilled and motivated Senior Data Engineer to join our Data Operations team. The ideal candidate will have deep expertise in Python, Snowflake SQL, modern ETL tools, and business intelligence platforms such as Power BI. This role also requires experience integrating SaaS applications such as Salesforce, Zuora, and NetSuite using REST APIs. You will be responsible for building and maintaining data pipelines, developing robust data models, and ensuring seamless data integrations that support business analytics and reporting. The role requires flexibility to collaborate in US time zones as needed.

What Your Responsibilities Will Be:
- Design, develop, and maintain scalable data pipelines and workflows using modern ETL tools and Python (a minimal orchestration sketch follows this listing).
- Build and optimize SQL queries and data models on Snowflake to support analytics and reporting needs.
- Integrate with SaaS platforms such as Salesforce, Zuora, and NetSuite using APIs or native connectors.
- Develop and support dashboards and reports using Power BI and other reporting tools.
- Work closely with data analysts, business users, and other engineering teams to gather requirements and deliver high-quality solutions.
- Ensure data quality, accuracy, and consistency across systems and datasets.
- Write clean, well-documented, and testable code with a focus on performance and reliability.
- Participate in peer code reviews and contribute to best practices in data engineering.
- Be available for meetings and collaboration in US time zones as required.

What You'll Need To Be Successful:
- 5+ years of experience in the data engineering field, with deep SQL knowledge.
- Strong experience in Snowflake SQL, Python, AWS services, Power BI, and ETL tools (dbt, Airflow) is a must.
- Proficiency in Python for data transformation and scripting.
- Proficiency in writing complex SQL queries and stored procedures.
- Strong experience in data warehouse, data modeling, and ETL design concepts.
- Should have integrated SaaS systems like Salesforce, Zuora, and NetSuite along with relational databases, REST APIs, FTP/SFTP, etc.
- Knowledge of AWS technologies (EC2, S3, RDS, Redshift, etc.).
- Excellent communication skills, with the ability to translate technical issues for non-technical stakeholders.
- Flexibility to work during US business hours as required for team meetings and collaboration.

How We'll Take Care Of You: Total Rewards: In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses. Health & Wellness: Benefits vary by location but generally include private medical, life, and disability insurance. Inclusive culture and diversity: Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship.

What You Need To Know About Avalara: We're Avalara. We're defining the relationship between tax and tech. We've already built an industry-leading cloud compliance platform, processing nearly 40 billion customer API calls and over 5 million tax returns a year, and this year we became a billion-dollar business. Our growth is real, and we're not slowing down until we've achieved our mission to be part of every transaction in the world. We're bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we've designed, which empowers our people to win. Ownership and achievement go hand in hand here. We instill passion in our people through the trust we place in them. We've been different from day one. Join us, and your career will be too.

We're An Equal Opportunity Employer: Supporting diversity and inclusion is a cornerstone of our company; we don't want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national orientation, disability, sexual orientation, US veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
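Purely as an illustration of the pipeline-orchestration skills listed above, here is a minimal Airflow-style DAG sketch (Airflow 2.4+ syntax assumed) with one extract task feeding one load task. The DAG id, schedule, and task bodies are placeholders, not part of the posting.

```python
# Minimal Airflow DAG sketch: one extract task feeding one load task.
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    """Pretend to pull rows from a source system (e.g., a SaaS API)."""
    rows = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
    context["ti"].xcom_push(key="rows", value=rows)


def load(**context):
    """Pretend to write the extracted rows to the warehouse (e.g., Snowflake)."""
    rows = context["ti"].xcom_pull(task_ids="extract", key="rows")
    print(f"would load {len(rows)} rows")


with DAG(
    dag_id="example_saas_to_warehouse",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```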

Posted 1 week ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate, We are hiring a Data Engineering Manager to lead a team building data pipelines, models, and analytics infrastructure. Ideal for experienced engineers who can manage both technical delivery and team growth. Key Responsibilities: Lead development of ETL/ELT pipelines and data platforms Manage data engineers and collaborate with analytics/data science teams Architect systems for data ingestion, quality, and warehousing Define best practices for data architecture, testing, and monitoring Required Skills & Qualifications: Strong experience with big data tools (Spark, Kafka, Airflow) Proficiency in SQL, Python, and cloud data services (e.g., Redshift, BigQuery) Proven leadership and team management in data engineering contexts Bonus: Experience with real-time streaming and ML pipeline integration Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies

Posted 1 week ago

Apply

4.0 - 7.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Job Description Summary The Data Scientist will work in teams addressing statistical, machine learning and data understanding problems in a commercial technology and consultancy development environment. In this role, you will contribute to the development and deployment of modern machine learning, operational research, semantic analysis, and statistical methods for finding structure in large data sets. Job Description Site Overview Established in 2000, the John F. Welch Technology Center (JFWTC) in Bengaluru is GE Aerospaces multidisciplinary research and engineering center. Pushing the boundaries of innovation every day, engineers and scientists at JFWTC have contributed to hundreds of aviation patents, pioneering breakthroughs in engine technologies, advanced materials, and additive manufacturing. Role Overview: As a Data Scientist, you will be part of a data science or cross-disciplinary team on commercially-facing development projects, typically involving large, complex data sets. These teams typically include statisticians, computer scientists, software developers, engineers, product managers, and end users, working in concert with partners in GE business units. Potential application areas include remote monitoring and diagnostics across infrastructure and industrial sectors, financial portfolio risk assessment, and operations optimization. In this role, you will: Develop analytics within well-defined projects to address customer needs and opportunities. Work alongside software developers and software engineers to translate algorithms into commercially viable products and services. Work in technical teams in development, deployment, and application of applied analytics, predictive analytics, and prescriptive analytics. Perform exploratory and targeted data analyses using descriptive statistics and other methods. Work with data engineers on data quality assessment, data cleansing and data analytics Generate reports, annotated code, and other projects artifacts to document, archive, and communicate your work and outcomes. Share and discuss findings with team members. Required Qualifications: Bachelor's Degree in Computer Science or STEM Majors (Science, Technology, Engineering and Math) with basic experience. Desired Characteristics: - Expertise in one or more programming languages and analytic software tools (e.g., Python, R, SAS, SPSS). Strong understanding of machine learning algorithms, statistical methods, and data processing techniques. - Exceptional ability to analyze large, complex data sets and derive actionable insights. Proficiency in applying descriptive, predictive, and prescriptive analytics to solve real-world problems. - Demonstrated skill in data cleansing, data quality assessment, and data transformation. Experience working with big data technologies and tools (e.g., Hadoop, Spark, SQL). - Excellent communication skills, both written and verbal. Ability to convey complex technical concepts to non-technical stakeholders and collaborate effectively with cross-functional teams - Demonstrated commitment to continuous learning and staying up-to-date with the latest advancements in data science, machine learning, and related fields. Active participation in the data science community through conferences, publications, or contributions to open-source projects. - Ability to thrive in a fast-paced, dynamic environment and adapt to changing priorities and requirements. Flexibility to work on diverse projects across various domains. 
Preferred Qualifications:
- Awareness of feature extraction and real-time analytics methods.
- Understanding of analytic prototyping, scaling, and solutions integration.
- Ability to work with large, complex data sets and derive meaningful insights.
- Familiarity with machine learning techniques and their application in solving real-world problems.
- Strong problem-solving skills and the ability to work independently and collaboratively in a team environment.
- Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.

Domain Knowledge:
- Demonstrated awareness of industry and technology trends in data science.
- Demonstrated awareness of customer and stakeholder management and business metrics.

Leadership:
- Demonstrated awareness of how to function in a team setting.
- Demonstrated awareness of critical thinking and problem-solving methods.
- Demonstrated awareness of presentation skills.

Personal Attributes:
- Demonstrated awareness of how to leverage curiosity and creativity to drive business impact.
- Humble: respectful, receptive, agile, eager to learn.
- Transparent: shares critical information, speaks with candor, contributes constructively.
- Focused: quick learner, strategically prioritizes work, committed.
- Leadership ability: strong communicator, decision-maker, collaborative.
- Problem solver: analytical-minded, challenges existing processes, critical thinker.

Whether we are manufacturing components for our engines, driving innovation in fuel and noise reduction, or unlocking new opportunities to grow and deliver more productivity, our GE Aerospace teams are dedicated to making a global impact. Join us and help move the aerospace industry forward.

Additional Information: Relocation Assistance Provided: No

Posted 1 week ago

Apply

4.0 - 6.0 years

12 - 18 Lacs

Noida, Greater Noida

Work from Office

Role & responsibilities Utilize Python (specifically Pandas) to clean, transform, and analyze data, automate repetitive tasks, and create custom reports and visualizations. Analyze and interpret complex datasets, deriving actionable insights to support business decisions. Write and optimize advanced SQL queries for data extraction, manipulation, and analysis from various sources, including relational databases and cloud-based data storage. Collaborate with cross-functional teams to understand data needs and deliver data-driven solutions. Create and maintain dashboards and reports that visualize key metrics and performance indicators. Identify trends, patterns, and anomalies in data to support business intelligence efforts and provide strategic recommendations. Ensure data integrity and accuracy by developing and implementing data validation techniques. Support data migration, transformation, and ETL processes within cloud environments. Requirements 3 - 5 years of experience as a Data analyst or equivalent role. Good experience in Python, with hands-on experience using Pandas for data analysis and manipulation. Expertise in analytical SQL, including writing complex queries for data extraction, aggregation, and transformation. Knowledge of cloud platforms, particularly AWS (Amazon Web Services). Strong analytical thinking, problem-solving, and troubleshooting abilities. Familiarity with data visualization tools (e.g., Tableau, Power BI, Quicksight , Superset etc.) is a plus. Excellent communication skills, with the ability to explain complex data insights in a clear and actionable manner. Detail-oriented with a focus on data quality and accuracy. Preferred Qualifications: Experience working in a cloud-based data analytics environment. Familiarity with additional cloud services and tools (e.g. Snowflake , Athena). Experience working in an Agile environment or with data-oriented teams.
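To make the Pandas-based cleaning and validation responsibilities above concrete, here is a brief, hedged sketch that loads a hypothetical CSV, normalizes types, applies simple validation checks, and produces a small analysis for reporting. The file name, columns, and checks are invented for illustration only.

```python
# Hedged example: clean a hypothetical orders extract and validate it with
# simple assertions before downstream analysis. Columns are placeholders.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Basic cleaning: trim identifiers, coerce amounts, drop exact duplicates.
orders["customer_id"] = orders["customer_id"].astype(str).str.strip()
orders["amount"] = pd.to_numeric(orders["amount"], errors="coerce")
orders = orders.drop_duplicates()

# Lightweight validation checks; failures stop the pipeline early.
assert orders["order_id"].is_unique, "duplicate order ids"
assert orders["amount"].notna().all(), "non-numeric amounts found"
assert (orders["amount"] >= 0).all(), "negative order amounts"

# Simple analysis: monthly revenue trend for a dashboard or report.
monthly = (
    orders.set_index("order_date")["amount"]
          .resample("MS").sum()
          .rename("revenue")
)
print(monthly.tail())
```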

Posted 1 week ago

Apply

5.0 - 10.0 years

18 - 25 Lacs

Bengaluru

Remote

Job Title: Data Engineer - ETL & Spatial Data Expert
Locations: Bengaluru / Gurugram / Nagpur / Remote
Department: Data Engineering / GIS / ETL
Experience: As per requirement (CTC capped at 3.5x of experience in years)
Notice Period: Max 30 days

Role Overview: We are looking for a detail-oriented and technically proficient Data Engineer with strong experience in FME, spatial data handling, and ETL pipelines. The role involves building, transforming, validating, and automating complex geospatial datasets and dashboards to support operational and analytical needs. Candidates will work closely with internal teams, local authorities (LA), and HMLR specifications.

Key Responsibilities:
1. Data Integration & Transformation: Build ETL pipelines using FME to ingest and transform data from Idox/CCF systems. Create custom transformers in FME to apply reusable business rules. Use Python (standalone or within FME) for custom transformations, date parsing, and validations. Conduct data profiling to assess completeness, consistency, and accuracy.
2. Spatial Data Handling: Manage and query spatial datasets using PostgreSQL/PostGIS. Handle spatial formats like GeoPackage, GML, GeoJSON, and Shapefiles. Fix geometry issues like overlaps or invalid polygons using FME or SQL (a minimal Python sketch follows this listing). Ensure proper coordinate system alignment (e.g., EPSG:27700).
3. Automation & Workflow Orchestration: Use FME Server/FME Cloud to automate and monitor ETL workflows. Schedule batch processes via CI/CD, cron, or Python. Implement audit trails and logs for all data processes and rule applications.
4. Dashboard & Reporting Integration: Write SQL views and aggregations to support dashboard visualizations. Optionally integrate with Power BI, Grafana, or Superset. Maintain metadata tagging for each data batch.
5. Collaboration & Communication: Interpret validation reports and collaborate with Analysts/Ops teams. Translate business rules into FME logic or SQL queries. Map data to LA/HMLR schemas accurately.

Preferred Tools & Technologies:
- ETL: FME (Safe Software), Talend (optional), Python
- Spatial DB: PostGIS, Oracle Spatial
- GIS Tools: QGIS, ArcGIS
- Scripting: Python, SQL
- Validation: FME Testers, AttributeValidator, SQL views
- Formats: CSV, JSON, GPKG, XML, Shapefiles
- Collaboration: Jira, Confluence, Git

Ideal Candidate Profile: Strong hands-on experience with FME workflows and spatial data transformation. Proficient in scripting with Python and working with PostGIS. Demonstrated ability to build scalable data automation pipelines. Effective communicator capable of converting requirements into technical logic. Past experience with LA or HMLR data specifications is a plus.

Required Qualifications: B.E./B.Tech. (Computer Science, IT, or ECE), B.Sc. (IT/CS), or full-time MCA.

Strict Screening Criteria: No employment gaps over 4 months. Do not consider candidates from Jawaharlal Nehru University. Exclude profiles from Hyderabad or Andhra Pradesh (education or employment). Reject profiles with BCA, B.Com, Diploma, or open university backgrounds. Projects must detail technical tools/skills used clearly. Max CTC is 3.5x of total years of experience. No flexibility on notice period or compensation. No candidates from Noida for the Gurugram location.
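As an illustration of the geometry-validation work mentioned above, the following sketch uses Shapely (version 1.8+ assumed, alongside the FME/PostGIS toolchain the listing names) to detect and repair an invalid self-intersecting polygon. The coordinates are invented; a production pipeline would more likely do this inside FME or with PostGIS ST_MakeValid.

```python
# Hedged example: detect and repair an invalid (self-intersecting) polygon
# with Shapely. Coordinates are made up; production pipelines might instead
# use FME transformers or PostGIS ST_MakeValid.
from shapely.geometry import Polygon
from shapely.validation import explain_validity, make_valid

# A "bowtie" polygon whose edges cross each other - invalid geometry.
bowtie = Polygon([(0, 0), (2, 2), (2, 0), (0, 2), (0, 0)])

print(bowtie.is_valid)            # False
print(explain_validity(bowtie))   # reports the self-intersection

repaired = make_valid(bowtie)     # splits it into valid parts
print(repaired.is_valid)          # True
print(repaired.geom_type)         # typically a MultiPolygon
```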

Posted 1 week ago

Apply

10.0 - 15.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Novo Nordisk Global Business Services (GBS) India. Department: Global Data & Artificial Intelligence. Are you passionate about building scalable data pipelines and optimising data workflows? Do you want to work at the forefront of data engineering, collaborating with cross-functional teams to drive innovation? If so, we are looking for a talented Data Engineer to join our Global Data & AI team at Novo Nordisk. Read on and apply today for a life-changing career! The Position: As a Senior Data Engineer, you will play a key role in designing, developing, and maintaining data pipelines and integration solutions to support analytics, Artificial Intelligence workflows, and business intelligence. This includes: Design, implement, and maintain scalable data pipelines and integration solutions aligned with the overall data architecture and strategy. Implement data transformation workflows using modern ETL/ELT approaches while establishing best practices for data engineering, including testing methodologies and documentation. Optimize data workflows by harmonizing and securely transferring data across systems, while collaborating with stakeholders to deliver high-performance solutions for analytics and Artificial Intelligence. Monitor and maintain data systems to ensure their reliability. Support data governance by ensuring data quality and consistency, while contributing to architectural decisions shaping the data platform's future. Mentor junior engineers and foster a culture of engineering excellence. Qualifications: Bachelor's or master's degree in Computer Science, Software Development, or Engineering. Over 10 years of overall professional experience, including more than 4 years of specialized expertise in data engineering. Experience in developing production-grade data pipelines using Python, Databricks, and Azure cloud, with a strong foundation in software engineering principles. Experience in the clinical data domain, with knowledge of standards such as CDISC SDTM and ADaM (good to have). Experience working in a regulated industry (good to have). About the department: You will be part of the Global Data & AI team. Our department is globally distributed and has as its mission to harness the power of Data and Artificial Intelligence, integrating it seamlessly into the fabric of Novo Nordisk's operations. We serve as the vital link, weaving together the realms of Data and Artificial Intelligence throughout the whole organization, empowering Novo Nordisk to realize its strategic ambitions through our pivotal initiatives. The atmosphere is fast-paced and dynamic, with a strong focus on collaboration and innovation. We work closely with various business domains to create actionable insights and drive commercial excellence.

Posted 1 week ago

Apply

2.0 - 4.0 years

3 - 5 Lacs

Bengaluru

Work from Office

Description of the position/role: Design, develop, and maintain interactive dashboards and reports in Power BI to showcase key performance indicators and business trends. Write complex SQL queries to extract, manipulate, and analyze data from relational databases, ensuring data accuracy and integrity. Develop and implement Macros and VBA scripts to automate repetitive tasks and streamline data processing workflows. Collaborate with cross-functional teams to gather requirements, understand business needs, and translate them into technical specifications for reporting and analytics. Perform data analysis to identify trends, patterns, and anomalies, and provide recommendations based on findings. Ensure timely delivery of reports and analyses to meet business objectives and support decision-making processes. Troubleshoot and resolve any data-related issues, ensuring high-quality data for reporting and analysis. Stay updated on industry trends and best practices related to data visualization and analytics. Requirements: Bachelor's degree/diploma in Data Science, Information Technology, Business Analytics, or a related field. 2 to 5 years of experience as a Data or Business Analyst. Night shift (06:00 PM to 03:00 AM). Proven experience in data analysis and business intelligence, specifically using Power BI, SQL, and Macros/VBA. Strong understanding of database management systems and ETL processes. Proficient in writing SQL queries for data extraction and manipulation. Experience with Power BI, including DAX functions, data modeling, and report design; building relationships, parameters, and measures, and back-end formatting. Knowledge of Macros and VBA programming to automate tasks within Excel and other applications. Excellent analytical and problem-solving skills, with the ability to interpret data and offer actionable insights. Strong attention to detail and the ability to work independently as well as in a team environment. Good communication skills to effectively present findings and collaborate with stakeholders.

Posted 1 week ago

Apply

6.0 - 9.0 years

27 - 42 Lacs

Pune

Work from Office

Job Summary: We are seeking a Developer with 4 to 9 years of experience to join our team. The ideal candidate will have strong technical skills and experience in integration development using Workato. Key Responsibilities: Design and implement robust, reusable, and scalable integrations using Workato Recipes, Connectors, and Workbot. Work closely with business stakeholders, architects, and product teams to understand integration needs and translate them into technical requirements. Develop custom connectors and scripts using JavaScript, HTTP connectors, and Webhook listeners within Workato. Maintain and enhance existing integrations, troubleshoot issues, and ensure high availability and performance. Implement data mapping, transformation, and error-handling best practices. Leverage the Workato SDK (if needed) to create reusable components and extend platform capabilities. Monitor and optimize recipe performance and perform root cause analysis for failed jobs. Mentor junior developers and contribute to integration governance frameworks and best practices. Participate in agile ceremonies, provide input on story estimations, and contribute to technical documentation. Required Skills: 2+ years of experience in integration development using Workato. Deep understanding of Workato platform features: Recipes, Recipe Functions, Collections, Lookup Tables, Connections, Jobs, and Logs. Strong experience in REST/SOAP API consumption, authentication (OAuth 2.0, API keys), and data formats (JSON, XML). Proficiency in SQL, JavaScript, and data transformation logic within integrations. Experience building custom connectors using the Workato Connector SDK (preferred). Solid understanding of error handling, logging, and retry mechanisms. Workato Automation Pro certifications (e.g., Level 1, 2, or Workato Partner Certification).
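Workato recipes are built in the platform's own recipe builder rather than in Python, but the underlying API-consumption concepts this listing names (authenticated REST calls, error handling, retries, logging) can be sketched generically as below. The endpoint and API key are placeholders, not any vendor's real API.

```python
# Hedged sketch of the API-consumption concepts the listing names: an
# authenticated REST GET with retries, backoff, and logging. The endpoint
# and key are placeholders; Workato implements the same ideas with its
# own connectors rather than Python.
import logging

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("integration")

API_URL = "https://api.example.com/v1/invoices"   # placeholder
API_KEY = "replace-me"                            # placeholder

session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=Retry(
    total=3,                      # retry transient failures up to 3 times
    backoff_factor=1.0,           # increasing pause between attempts
    status_forcelist=[429, 500, 502, 503, 504],
)))


def fetch_invoices() -> list[dict]:
    """Call the upstream API, retrying transient errors, and log the result."""
    resp = session.get(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    resp.raise_for_status()
    payload = resp.json()
    log.info("fetched %d invoices", len(payload))
    return payload
```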

Posted 1 week ago

Apply

6.0 - 11.0 years

13 - 22 Lacs

Hyderabad, Bengaluru

Work from Office

AWS Glue - mandatory. AWS S3 and AWS Lambda - should have some experience. Must have used Snowpipe to build integration pipelines and know how to build stored procedures from scratch. Able to write complex SQL queries. Python - NumPy and pandas.
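To ground the Snowpipe requirement, here is a hedged sketch that uses the Snowflake Python connector to create an external stage and an auto-ingest pipe over S3. Account details, the storage integration, and stage/table names are placeholders, and the exact DDL options should be checked against Snowflake's documentation before use.

```python
# Hedged sketch: create a stage and an auto-ingest Snowpipe over S3 using the
# Snowflake Python connector. All identifiers and credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="etl_user",           # placeholder
    password="***",            # placeholder
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

ddl_statements = [
    # External stage pointing at the S3 landing bucket (placeholder URL).
    """
    CREATE STAGE IF NOT EXISTS raw_orders_stage
      URL = 's3://my-landing-bucket/orders/'
      STORAGE_INTEGRATION = my_s3_integration
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """,
    # Pipe that auto-ingests new files from the stage into the target table.
    """
    CREATE PIPE IF NOT EXISTS raw_orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_orders FROM @raw_orders_stage
    """,
]

cur = conn.cursor()
try:
    for stmt in ddl_statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```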

Posted 1 week ago

Apply

6.0 - 10.0 years

20 - 35 Lacs

Noida

Work from Office

Job Title: Solutions Architect. Type: Full-Time. Location: Sec 63, Noida. Salary: Best in market.

About the Role: We are building a Microsoft-focused IT consulting firm serving SMB clients across cloud solutions, cybersecurity, and digital transformation. We are seeking a client-facing, hands-on Solutions Architect who will not only design and deliver technical solutions but also lead projects and manage technical teams. This is a dual-role opportunity: you'll be the lead technical consultant on client calls and pre-sales discussions, and also the technical delivery lead, responsible for creating roadmaps and ensuring successful execution using internal and external resources. Prior experience leading technical teams and projects is a must.

Key Responsibilities:
Client-Facing & Business Consulting: Join client discovery meetings with the sales team to identify pain points and technical needs. Present technical solutions clearly to both technical and non-technical stakeholders. Collaborate with sales on developing SOWs, proposals, and solution estimates. Build and maintain long-term relationships as the trusted technical advisor.
Solution Architecture & Delivery Leadership: Design and deploy Microsoft-based solutions, including Azure infrastructure, SharePoint, Intune, MFA, and Office 365. Conduct cybersecurity assessments and penetration tests; implement Microsoft Defender and Azure Security Center. Plan and lead seamless Office 365 migrations with a strong emphasis on user experience and uptime. Lead technical project teams, allocate tasks, and oversee execution from kickoff through post-implementation. Develop and maintain project roadmaps, timelines, and delivery milestones aligned with client goals. Be hands-on when needed and help resolve complex issues directly or by guiding team members. Build and manage a trusted network of independent technical experts for flexible delivery capacity.
Documentation & Continuous Improvement: Produce architecture diagrams, SOWs, security reports, test results, and audit-ready deliverables. Stay up to date with Microsoft technologies and bring new ideas to evolve our offerings.

Qualifications:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Required Experience: Minimum 2+ years leading technical teams and IT projects from planning through execution; this is mandatory. 5+ years in solution architecture or IT consulting, with hands-on Microsoft ecosystem experience. Proven success in Azure (IaaS/PaaS), Intune, SharePoint, MFA, and Office 365 migrations. Experience conducting penetration tests and implementing security/compliance frameworks (HIPAA, CMMC, etc.). Previous experience working in or with an IT consulting or managed services firm.
Certifications (Preferred): Microsoft Certified: Azure Solutions Architect Expert. Microsoft 365 Certified: Enterprise Administrator Expert. CEH / OSCP / CISSP / CompTIA Security+.
Skills: Strong leadership and team coordination experience. Ability to translate technical requirements into business-focused solutions. Excellent communication: clear, confident, and able to represent the company to executive-level stakeholders. Proficient with project management tools (Jira, MS Project, Asana, etc.). Capable of independently managing project delivery and motivating technical contributors.
Nice to Have: A strong network of IT professionals and contractors for rapid scaling. Experience with SMB clients in regulated industries (healthcare, finance, etc.).
Who You Are:- A natural team leader with strong client-facing presence. A technical expert who thrives in a fast-paced, high-responsibility role. A builder , excited about contributing to the foundation of a growing consulting firm. To speed up processing, you might also send a copy of your profile along with a brief write-up supporting your case to :- vinod@apetanco.com, rajni@apetan.com , riya@apetan.com

Posted 1 week ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Pune

Work from Office

Key Responsibilities: Develop and maintain supply chain analytics to monitor operational performance and trends. Lead and participate in Six Sigma and supply chain improvement initiatives. Ensure data integrity and consistency across all analytics and reporting platforms. Design and implement reporting solutions for key supply chain KPIs. Analyze KPIs to identify improvement opportunities and develop actionable insights. Build and maintain repeatable, scalable analytics using business systems and BI tools. Conduct scenario modeling and internal/external benchmarking. Provide financial analysis to support supply chain decisions. Collaborate with global stakeholders to understand requirements and deliver impactful solutions.

External Qualifications and Competencies
Qualifications: Bachelor's degree in Engineering, Computer Science, Supply Chain, or a related field. Relevant certifications in BI tools, Agile methodologies, or cloud platforms are a plus. This position may require licensing for compliance with export controls or sanctions regulations.

Additional Responsibilities Unique to this Position
Experience: 8-10 years of total experience, with at least 6 years in a relevant analytics or supply chain role. Proven experience in leading small teams and managing cross-functional projects.

Technical Skills: Expertise in SQL, SQL Server, SSIS, SSAS, and Power BI. Advanced DAX development for complex reporting needs. Performance optimization for SQL and SSAS environments. Cloud and data engineering: Azure Synapse, Azure Data Factory (ADF), Python, Snowflake. Agile methodology: experience working in Agile teams and sprints.

Posted 1 week ago

Apply