
1055 ETL Processes Jobs - Page 29

JobPe aggregates listings so they are easy to find, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are a highly skilled Microsoft Fabric + Power BI Developer sought to support a part-time project engagement. Your expertise in Microsoft Fabric and proficiency in Power BI report development and data modeling will be essential for this remote opportunity, which is suitable for offshore candidates.

In this role, you will design and develop robust data solutions using Microsoft Fabric components such as Lakehouse, Data Warehouse, and Real-Time Analytics. Your tasks will include creating insightful and interactive Power BI reports and dashboards, performing data modeling, and writing efficient DAX queries and Power Query (M language) scripts. You will also build and maintain data pipelines using Fabric Data Factory or Azure Data Factory, collaborate with stakeholders to understand reporting requirements, and deliver actionable insights. Ensuring performance optimization and best practices in report development and data architecture will also be your responsibility.

To excel in this position, you must have proven experience with Microsoft Fabric technologies and strong hands-on experience in Power BI development, encompassing dashboards, reports, DAX, and Power Query. In-depth knowledge of data modeling, ETL processes, and data visualization best practices, along with experience in Fabric Data Factory or Azure Data Factory, will be crucial. Excellent analytical and problem-solving skills, as well as strong communication and collaboration abilities, are also required.
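As a rough illustration of the kind of aggregation a simple DAX measure performs, here is a Python sketch; the column names and sample rows are hypothetical, and the real work in this role would be written in DAX and Power Query inside Power BI, not Python.

```python
from collections import defaultdict

# Hypothetical fact-table rows, standing in for a Power BI data model.
sales = [
    {"region": "South", "amount": 120.0},
    {"region": "North", "amount": 80.0},
    {"region": "South", "amount": 50.0},
]

def total_sales_by_region(rows):
    """Rough equivalent of a DAX measure summing amount, grouped by region."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

print(total_sales_by_region(sales))  # {'South': 170.0, 'North': 80.0}
```

The same grouping in DAX would typically be a `SUM` measure evaluated against a region column on a report visual.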

Posted 2 months ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Senior Associate in PwC's Data, Analytics & Specialist Managed Service tower in India, you will be part of a team of problem solvers dedicated to resolving complex business issues ranging from strategy to execution. At this management level, your responsibilities will include using feedback and reflection to develop self-awareness, build on personal strengths, and address development areas. You will need to be flexible in taking on stretch opportunities and assignments, and demonstrate critical-thinking skills to bring order to unstructured problems. You will also be responsible for reviewing ticket quality and deliverables, status reporting for the project, and adherence to SLAs, as well as incident management, change management, and problem management practices.

In this role, you will be expected to seek out and embrace opportunities that provide exposure to different situations, environments, and perspectives. Effective communication skills will be essential as you interact with clients, lead engagements, and collaborate within a team environment. Upholding the firm's code of ethics and business conduct, demonstrating leadership capabilities, and contributing to cross-competency work are integral aspects of this position.

Key skills required for this role include expertise in Azure cloud engineering. You should possess extensive knowledge and a proven track record in developing advanced data warehousing solutions on leading cloud platforms, experience building scalable and secure data structures and pipelines, and experience designing and implementing ETL processes using tools such as Informatica, Talend, SSIS, AWS, Azure, Spark, SQL, and Python. Hands-on experience with data analytics tools and technologies such as Informatica, Collibra, Hadoop, Spark, Snowflake, and Azure Data Factory is also essential.

Your role will involve creating and maintaining data pipelines, data storage solutions, and ETL operations in Azure, as well as collaborating with data scientists and analysts to understand data needs and workflows. Monitoring and troubleshooting data pipelines, ensuring data quality and integrity, and implementing data security measures will be part of your responsibilities. Strong SQL knowledge, experience with relational databases, and familiarity with a variety of database systems are required, along with experience in data governance solutions and ITIL processes, and strong communication, problem-solving, quantitative, and analytical abilities. Azure certification is a nice-to-have qualification.

As part of PwC's Managed Services for Data, Analytics & Insights, you will play a crucial role in delivering integrated services and solutions grounded in industry experience and powered by top talent. The platform aims to provide scalable solutions that add value to clients through technology and human-enabled experiences, enabling them to focus on accelerating priorities and achieving sustainable outcomes. By leveraging deep industry insights, world-class talent, and cutting-edge technology, PwC's Managed Services facilitate transformational journeys that drive client success in today's dynamic business environment.
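The extract-transform-load and data-quality duties described above can be sketched in miniature; this toy example uses an in-memory SQLite database in place of Azure SQL, and the table, column names, and quality rule are all illustrative assumptions, not anything from the posting.

```python
import sqlite3

def run_etl(raw_rows):
    """Toy ETL step: load only rows passing a basic data-quality rule."""
    conn = sqlite3.connect(":memory:")  # stands in for Azure SQL
    conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT NOT NULL)")
    # Transform: normalize emails, drop rows with a missing email.
    clean = [(r["id"], r["email"].strip().lower()) for r in raw_rows if r.get("email")]
    conn.executemany("INSERT INTO customers VALUES (?, ?)", clean)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]

loaded = run_etl([
    {"id": 1, "email": " A@Example.com "},
    {"id": 2, "email": None},           # rejected by the quality rule
    {"id": 3, "email": "b@example.com"},
])
print(loaded)  # 2
```

In a production Azure pipeline the same validate-then-load shape would typically live in a Data Factory or Spark job rather than a single function.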

Posted 2 months ago

Apply

5.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Technical Lead and Senior Developer with 10+ years of experience as a lead and 5+ years as a senior developer, you will leverage your expertise in .NET, Azure, API development, and middleware frameworks to build highly scalable, secure, and robust systems. Your role will involve developing APIs using .NET 8 with a focus on REST, JWT tokens, CSRF tokens, and Single Sign-On (SSO), applying multithreading and asynchronous programming techniques for performance optimization, and building secure middleware frameworks from scratch on the Azure platform.

You will work with SQL Server, optimizing relational database programming and developing efficient SQL queries to support application requirements and reporting needs. Troubleshooting and resolving performance bottlenecks in queries and stored procedures will also be part of your responsibilities, as will developing ETL processes to integrate data from various sources into Azure SQL databases.

Your expertise in Azure services will be crucial as you design and implement solutions using services such as Function Apps, Container Apps (with Docker), Application Insights, Azure CI/CD pipelines, Git, and DevOps practices. You will centralize error-handling mechanisms and raise notifications using Application Insights for monitoring and tracing issues across the UI, APIs, and other Azure services.

Ensuring code quality and testing will be a key aspect of your role: you will test all code changes to deliver high-quality, bug-free results, maintain a strong commitment to writing clean, maintainable, and efficient code, and address OWASP Top 10 and SANS Top 25 vulnerabilities in API and UI development. Excellent communication and documentation abilities, strong problem-solving and analytical skills, and the ability to lead teams collaboratively in Agile/Scrum environments will be essential for success in this role.

You will also conduct code reviews to ensure adherence to best practices and frameworks, providing mentorship, guidance, and support to the team. If you are excited about working in a dynamic and rewarding environment where you can contribute your technical expertise to drive innovation and excellence, this role at Hitachi Solutions India Pvt Ltd in Bangalore may be the perfect fit for you.
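The query-tuning work described above can be demonstrated in a small, hedged sketch. SQLite stands in for SQL Server here, and the table and index names are invented, but the principle carries over: an appropriate index turns a full table scan into an index search.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, float(i)) for i in range(1000)])

def plan(sql):
    """Return the query plan text (SQLite's analogue of an execution plan)."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # full scan of the orders table
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # search using idx_orders_customer
print(before, "->", after)
```

On SQL Server the equivalent investigation would read the actual execution plan and look for the same scan-versus-seek distinction.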

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

At PwC, the infrastructure team focuses on designing and implementing secure IT systems that support business operations. The primary goal is to ensure the smooth functioning of networks, servers, and data centers to enhance performance and minimize downtime. In the infrastructure engineering role at PwC, you will create robust and scalable technology infrastructure solutions for clients, working on network architecture, server management, and cloud computing.

As a Data Modeler, you should have a solid background in data modeling, metadata management, and data system optimization. Your responsibilities will include analyzing business requirements, developing long-term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise for this role include:

- Analyzing and translating business needs into long-term data model solutions.
- Evaluating existing data systems and suggesting enhancements.
- Defining rules for translating and transforming data across various models.
- Collaborating with the development team to create conceptual data models and data flows.
- Establishing best practices for data coding to maintain consistency within the system.
- Reviewing modifications of existing systems for cross-compatibility.
- Implementing data strategies and developing physical data models.
- Updating and optimizing local and metadata models.
- Utilizing canonical data modeling techniques to improve data system efficiency.
- Evaluating implemented data systems for variances, discrepancies, and efficiency.
- Troubleshooting and optimizing data systems for optimal performance.
- Strong expertise in relational and dimensional modeling (OLTP, OLAP).
- Effective use of data modeling tools such as Erwin, ER/Studio, Visio, and PowerDesigner.
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Understanding of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Experience with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Familiarity with ETL processes, data integration, and data governance frameworks.
- Strong analytical, problem-solving, and communication skills.

Qualifications for this position include:

- A Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred skills:

- Experience with cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
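The dimensional-modeling expertise this role asks for can be illustrated with a minimal star schema; the table and column names below are hypothetical, with SQLite standing in for whatever warehouse the modeler actually targets.

```python
import sqlite3

# A toy star schema: one fact table keyed to a date dimension (OLAP pattern).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
CREATE TABLE fact_sales (
    sale_id  INTEGER PRIMARY KEY,
    date_key INTEGER REFERENCES dim_date(date_key),
    amount   REAL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
conn.execute("INSERT INTO fact_sales VALUES (1, 20240101, 99.5)")
conn.execute("INSERT INTO fact_sales VALUES (2, 20240101, 0.5)")

# A typical analytical query: roll the fact table up along the dimension.
yearly = conn.execute("""
    SELECT d.year, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d USING (date_key)
    GROUP BY d.year
""").fetchall()
print(yearly)  # [(2024, 100.0)]
```

The design choice being shown is the separation of descriptive attributes (dimension) from measurable events (fact), which keeps analytical rollups simple.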

Posted 2 months ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

Our Enterprise Technology division at Macquarie delivers cutting-edge solutions for global operations. We are currently seeking a Vice President who will be responsible for driving strategic direction and operational excellence. The ideal candidate will lead a talented team of engineers, fostering a culture of innovation and collaboration.

At Macquarie, we take pride in bringing together diverse individuals and empowering them to explore a wide range of possibilities. As a global financial services group operating in 34 markets with 55 years of unbroken profitability, we offer a supportive and friendly team environment where everyone's ideas contribute to driving outcomes.

In this key leadership position, you will lead and mentor a high-performing team of engineers, cultivating a culture of innovation and continuous improvement. Your responsibilities will include developing and executing the strategic roadmap for enterprise data platforms, ensuring alignment with business objectives and timely project delivery. Collaborating with cross-functional teams to deliver effective data solutions, maintaining technical excellence, and embracing innovative technologies will be essential.

The successful candidate should possess:

- Extensive experience in data engineering and managing complex data platform projects
- Demonstrated leadership skills in managing and developing engineering teams
- Proficiency in data architecture, data warehousing, ETL processes, big data technologies (Hadoop, Spark, Kafka), AWS services, Kubernetes, and Docker
- Strong analytical and problem-solving abilities for data-driven decision-making
- Excellent communication and interpersonal skills for engaging and influencing stakeholders

If you are inspired to contribute to building a better future and are excited about the role or about working at Macquarie, we encourage you to apply.

About Technology: Technology plays a crucial role in every aspect of Macquarie, for our people, customers, and communities. We are a global team passionate about accelerating the digital enterprise, connecting people and data, building platforms and applications, and designing tomorrow's technology solutions.

Our Commitment to Diversity, Equity, and Inclusion: We are dedicated to providing reasonable adjustments to individuals who may require support during the recruitment process and in their working arrangements. If you need additional assistance, please let us know during the application process.

Posted 2 months ago

Apply

0.0 - 4.0 years

0 Lacs

Kolkata, West Bengal

On-site

At PwC, our finance consulting team specializes in providing consulting services related to financial management and strategy. You will analyze client needs, develop financial solutions, and offer guidance and support to help clients optimize their financial performance, improve decision-making, and achieve their financial goals. As a finance consulting generalist at PwC, you will possess a broad understanding of the various aspects of finance consulting, providing comprehensive guidance and recommendations tailored to specific business requirements.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:

- Apply a learning mindset and take ownership of your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect on, act on, and give feedback.
- Gather information from a range of sources to analyze facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards, upholding the Firm's code of conduct and independence requirements.

We are seeking a highly motivated Data Engineer - Associate to join our dynamic team. The ideal candidate will have a strong foundation in data engineering, particularly with Python and SQL, and exposure to cloud technologies and data visualization tools such as Power BI, Tableau, or QuickSight. The Data Engineer will work closely with data architects and business stakeholders to support the design and implementation of data pipelines and analytics solutions. This role offers an opportunity to grow technical expertise in cloud and data solutions, contributing to projects that drive business insights and innovation.

Key Responsibilities:

Data Engineering:
- Develop, optimize, and maintain data pipelines and workflows to ensure efficient data integration from multiple sources.
- Use Python and SQL to design and implement scalable data processing solutions.
- Ensure data quality and consistency throughout data transformation and storage processes.
- Collaborate with data architects and senior engineers to build data solutions that meet business and technical requirements.

Cloud Technologies:
- Work with cloud platforms (e.g., AWS, Azure, or Google Cloud) to deploy and maintain data solutions.
- Support the migration of on-premise data infrastructure to the cloud environment when needed.
- Assist in implementing cloud-based data storage solutions, such as data lakes and data warehouses.

Data Visualization:
- Provide data to business stakeholders for visualizations using tools such as Power BI, Tableau, or QuickSight.
- Collaborate with analysts to understand their data needs and optimize data structures for reporting.

Collaboration and Support:
- Work closely with cross-functional teams, including data scientists and business analysts, to support data-driven decision-making.
- Troubleshoot and resolve issues in the data pipeline and ensure timely data delivery.
- Document processes, data flows, and infrastructure for team knowledge sharing.

Required Skills and Experience:
- 0+ years of experience in data engineering, working with Python and SQL.
- Exposure to cloud platforms such as AWS, Azure, or Google Cloud is preferred.
- Familiarity with data visualization tools (e.g., Power BI, Tableau, QuickSight) is a plus.
- Basic understanding of data modeling, ETL processes, and data warehousing concepts.
- Strong analytical and problem-solving skills, with attention to detail.

Qualifications:
- Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field.
- Basic knowledge of cloud platforms and services is advantageous.
- Strong communication skills and the ability to work in a team-oriented environment.
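The associate-level Python + SQL pipeline work described above might look like the following in miniature: read raw CSV, validate each record, quarantine bad rows, and load the good ones. The data, schema, and validation rule are all made up for illustration, and SQLite stands in for the cloud warehouse.

```python
import csv
import io
import sqlite3

raw = io.StringIO("id,revenue\n1,10.5\n2,not-a-number\n3,4.5\n")

def validate(row):
    """Return a typed tuple, or None if the record fails validation."""
    try:
        return int(row["id"]), float(row["revenue"])
    except ValueError:
        return None  # quarantine bad records instead of failing the load

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (id INTEGER, revenue REAL)")
good = [v for v in (validate(r) for r in csv.DictReader(raw)) if v]
conn.executemany("INSERT INTO revenue VALUES (?, ?)", good)
total = conn.execute("SELECT SUM(revenue) FROM revenue").fetchone()[0]
print(total)  # 15.0
```

Quarantining rather than crashing on bad rows is a common pipeline design choice, since it keeps the load running while preserving the bad records for investigation.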

Posted 2 months ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Engineer at our organization, your primary responsibility will be to design and implement data warehousing solutions, ETL processes, and big data technologies to support our business needs. Your expertise will ensure maximum usability and efficiency in handling large volumes of data. Success in this position involves effectively gathering and analyzing user requirements to develop robust data solutions.

You will be expected to demonstrate strong communication skills to collaborate with cross-functional teams and stakeholders, ensuring that data engineering projects are aligned with business objectives. This role is pivotal in maintaining and optimizing our data infrastructure, allowing us to make data-driven decisions and drive business growth. Your proficiency in SQL, Python, and other relevant tools will be essential in driving innovation and maintaining high data quality standards within the organization.

In addition to technical skills, a successful candidate should possess excellent verbal and written communication skills. A solid educational background and relevant experience in data engineering are preferred qualifications. If you are passionate about working with data and thrive in a collaborative environment that values innovation and excellence, we welcome you to apply for the position of Data Engineer.

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Workday Prism Analytics and Reporting Consultant in HR IT at Deutsche Bank, located in Pune, India, you will focus on HR data and the Workday domain (Prism and Reporting). Your role will involve understanding HR data transformation using Workday Prism, Reporting, and core Workday HCM modules. You will manage technical resources, solution financials, and staff development, ensuring quality deliverables across HR IT projects.

In this role, you will develop a strong understanding of user reporting needs and recommend implementation strategies using Workday tools. You will design, develop, and tune data visualization tools and reports aligned with business requirements. Additionally, you will create and configure metadata objects, collaborate with ETL developers on report design strategies, and recommend innovative reporting solutions based on cost, effectiveness, and data availability.

Your responsibilities will also include building prototypes for demonstrations to stakeholders and senior leaders, providing Subject Matter Expert (SME) support for troubleshooting and BI-driven problem-solving, managing security setup and maintenance for data visualization tools, developing project timelines, documentation, and training materials, and offering post-implementation support and process fine-tuning. You will maintain communication with management and users during development cycles and coordinate user activities to ensure data and system integrity.

To excel in this role, you should have experience in designing, building, and maintaining data pipelines and transformations in Workday Prism Analytics, translating business requirements into scalable Prism solutions, optimizing Prism workloads for performance and efficiency, and integrating data from diverse sources into Workday Prism with accuracy. You should also have experience developing ETL processes for reporting and analytics, building reports, dashboards, and analytics using Workday tools, and collaborating with cross-functional teams to address data needs.

At Deutsche Bank, you will receive training and development opportunities to help you excel in your career, coaching and support from experts on your team, and a culture of continuous learning to aid progression. You will also have access to a range of flexible benefits that you can tailor to suit your needs. If you are looking to join a team that values empowerment, responsibility, commercial thinking, initiative, and collaboration, and celebrates the successes of its people, Deutsche Bank welcomes your application. We promote a positive, fair, and inclusive work environment where all individuals are encouraged to thrive and contribute to the success of the Deutsche Bank Group.

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Senior Data Engineer specializing in Databricks, you will play a critical role within our remote consulting team based in Mexico. Your primary responsibility will be to design, develop, and enhance large-scale data processing systems using Databricks and modern data engineering technologies, collaborating with data scientists, analysts, and technical teams to deliver robust and scalable data platforms.

Your duties will encompass designing, building, and maintaining resilient data pipelines for both structured and unstructured data, and creating and administering data lakes and data warehouses optimized for efficient analytics. You will streamline data workflows for performance, scalability, and cost-effectiveness, work closely with stakeholders to translate data requirements into scalable solutions aligned with the organization's objectives, implement data governance, data quality standards, and security protocols, and migrate legacy data processes, such as those from SAS, to contemporary platforms.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines for processing structured and unstructured data.
- Build and manage data lakes and data warehouses optimized for analytics.
- Optimize data workflows for enhanced performance, scalability, and cost-efficiency.
- Collaborate with stakeholders to gather data requirements and translate them into scalable solutions.
- Implement data governance, data quality, and security best practices.
- Migrate legacy data processes to modern platforms.
- Document architecture, data models, and pipelines.

Required Qualifications:
- Minimum of 5 years of experience in data engineering or related fields.
- At least 3 years of hands-on experience with Databricks.
- Proficiency in SQL and hands-on experience with PostgreSQL, MySQL, or NoSQL databases.
- Programming skills in Python, Java, or Scala.
- Experience with ETL processes, orchestration frameworks, and data pipeline automation.
- Familiarity with Spark, Kafka, or similar big data tools.
- Experience working on cloud platforms such as AWS, Azure, or GCP.
- Prior experience migrating from SAS would be advantageous.
- Excellent communication skills in English.
- Must be based in Mexico and eligible to work as a contractor.
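The group-and-aggregate transformations at the heart of Databricks pipelines can be sketched without Spark; this standard-library version shows the shape of the logic only, with hypothetical device readings as data. In PySpark the same step would typically be a `groupBy` with an average aggregation on a DataFrame.

```python
from itertools import groupby
from operator import itemgetter

events = [
    {"device": "a", "reading": 1.0},
    {"device": "b", "reading": 3.0},
    {"device": "a", "reading": 2.0},
]

def mean_by_device(rows):
    """Mean reading per device: the logic of a groupBy/avg aggregation."""
    rows = sorted(rows, key=itemgetter("device"))  # Spark shuffles; we sort
    out = {}
    for device, grp in groupby(rows, key=itemgetter("device")):
        vals = [g["reading"] for g in grp]
        out[device] = sum(vals) / len(vals)
    return out

print(mean_by_device(events))  # {'a': 1.5, 'b': 3.0}
```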

Posted 2 months ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As a SAP BW Data Engineer, you will leverage your 7+ years of hands-on experience in SAP BW and ABAP development, with a focus on data modeling, ETL processes, and architecture. Your proficiency in SAP BW/4HANA and BW on HANA will be essential in this role, as will your ABAP programming skills, including user exits, BADIs, and enhancements. In-depth knowledge of SAP BW architecture, data modeling, and ETL processes will be crucial to the success of your projects.

You will work with SAP analytics tools such as SAP Datasphere, troubleshoot complex technical issues, and deliver effective solutions in a timely manner. Strong communication and collaboration skills will enable you to work effectively in a team environment. Experience with Embedded Analytics and certification in SAP BW, SAP SAC, or related SAP modules will be beneficial, as will familiarity with Agile or Scrum methodologies and strong analytical and problem-solving skills. You will also have the opportunity to mentor and guide junior team members.

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

Thane, Maharashtra

On-site

As the BI/BW Lead at DMart, you will lead and manage a dedicated SAP BW team to ensure timely delivery of reports, dashboards, and analytics solutions. Your role will focus on managing the team effectively, ensuring that all SAP BW operational support tasks and development projects are completed with high quality and efficiency. You will also be responsible for the stability and performance of the SAP BW environment, overseeing daily support activities and ensuring seamless data flow and reporting across the organization. Acting as the bridge between business stakeholders and your technical team, you will play a pivotal role in maintaining and enhancing DMart's data ecosystem.

Your educational qualifications should include a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. SAP BW certifications are preferred but not mandatory.

Key Responsibilities:
- Lead and manage the SAP BW & BOBJ team to ensure efficient workload distribution and timely completion of tasks.
- Oversee the daily operational support of the SAP BW & BOBJ environment, ensuring stability and performance.
- Provide direction and guidance to the team for issue resolution, data loads, and reporting accuracy.
- Act as the primary point of contact for business users and internal teams regarding SAP BW support and enhancements.
- Ensure the team follows best practices in monitoring, error handling, and performance optimization.
- Drive continuous improvement of support processes, tools, and methodologies.
- Proactively identify potential risks and bottlenecks in data flows and take corrective actions.
- Ensure timely delivery of data extracts, reports, and dashboards for business-critical decisions.
- Provide leadership in system upgrades, patching, and data model improvements.
- Facilitate knowledge sharing and skill development within the SAP BW team.
- Maintain high standards of data integrity and security in the BW environment.

Professional Skills:
- Strong functional and technical understanding of SAP BW / BW on HANA and BOBJ.
- At least 5 years of working experience in SAP analytics.
- Solid understanding of ETL processes and data extraction.
- Experience with data lakes and warehouses such as Snowflake, BigQuery, or Databricks, and dashboard tools such as Power BI or Tableau, would be an added advantage.
- Experience in Retail, CPG, or SCM would be an added advantage.
- Experience managing SAP BW support activities and coordinating issue resolution.
- Strong stakeholder management skills, with the ability to translate business needs into technical actions.
- Excellent problem-solving and decision-making abilities under pressure.

Posted 2 months ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

Nasdaq Technology is seeking a dedicated Integration Technical Specialist with expertise in building custom integrations for financial systems to join the Bangalore technology center in India. If you are driven by innovation and effectiveness, this opportunity is for you. Nasdaq is at the forefront of market revolution and technology transformation, constantly striving to develop innovative solutions by embracing new technologies. As a senior technical analyst, you will deliver complex technical systems to new and existing customers and explore new technologies in the FinTech industry.

The Enterprise Solutions team in Bangalore is looking for a Technical Integration Specialist to drive central initiatives across Nasdaq's software products and services portfolio. Candidates who are passionate about delivering top technology solutions to today's markets are encouraged to apply.

Key Responsibilities:
- Engage in cross-functional work globally to deliver solutions and services for Nasdaq's finance processes.
- Interact with internal customers to design solutions, build relationships, and establish trust with key stakeholders.
- Collaborate with colleagues locally and in other countries to deliver sophisticated technology solutions.
- Receive, analyze, and address technical user inquiries and requests.
- Provide technical solutions aligned with business requirements and configure applications accordingly.
- Build and maintain integrations with internal systems and third-party vendors.
- Conduct end-to-end testing and develop test cases.
- Participate in various phases of the software development process.
- Ensure the quality of work by following established processes.
- Collaborate with multiple IT and business groups.

Requirements:
- 10 to 13 years of experience in integration development.
- Expertise in web services, including REST and SOAP API programming.
- Experience with Informatica Cloud and ETL processes.
- Strong understanding of AWS services such as S3, Lambda, and Glue.
- Bachelor's or Master's degree in computer science or a related engineering field.

Desired Skills:
- Proficiency in Workday integration tools, Report Writer, and calculated fields.
- Knowledge of finance organization processes, including Billing, Accounts Receivable, GL accounting, and Planning & Forecasting.
- Experience in multinational, multi-geographic companies.

Nasdaq offers a vibrant and entrepreneurial work environment where individuals are encouraged to take initiative and intelligent risks. The company values work-life balance, well-being, and a culture of connectedness and support. Benefits include an annual monetary bonus, opportunities to become a Nasdaq shareholder, health insurance, flexible working schedules, and various employee programs for growth and development. If this opportunity aligns with your expertise and career goals, submit your application in English at your earliest convenience. Nasdaq will provide updates on the selection process within 2-3 weeks.
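A common pattern in the REST integration work this role describes is a call wrapper that retries transient failures. The sketch below is a generic illustration, not Nasdaq's actual integration code: the transport is injected so the pattern can be shown (and tested) without a real endpoint, and the payload is invented.

```python
import json

def call_with_retry(transport, payload, attempts=3):
    """POST a JSON payload via `transport`, retrying transient failures."""
    last_error = None
    for _ in range(attempts):
        try:
            return json.loads(transport(json.dumps(payload)))
        except ConnectionError as exc:
            last_error = exc  # transient failure: try again
    raise last_error

# A fake transport that fails twice, then succeeds, to exercise the retry.
calls = {"n": 0}
def flaky_transport(body):
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return json.dumps({"status": "ok", "echo": json.loads(body)})

result = call_with_retry(flaky_transport, {"invoice": 42})
print(result["status"])  # ok
```

Injecting the transport is the design choice worth noting: it keeps retry policy separate from HTTP details and makes the integration testable offline.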

Posted 2 months ago

Apply

6.0 - 10.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Staff Data Engineer at Trimble, a leading provider of advanced positioning solutions, you will play a key role in designing and implementing robust, scalable, and secure cloud-based data pipelines and architectures in MS Azure. Your expertise in AWS and Azure, along with your strong technical background in cloud platforms, data architecture, and engineering best practices, will be essential in leading our Data and Cloud Engineering team to success. Your responsibilities will include providing technical direction and mentorship to a team of engineers, ensuring best practices in code quality, architecture, and design, and designing secure, scalable, and high-performance cloud infrastructure. You will also be responsible for managing cloud resources, optimizing costs, ensuring high availability and disaster recovery, and automating infrastructure provisioning and deployment processes using tools such as Terraform, CloudFormation, and ARM templates. Collaboration with cross-functional teams to understand data needs and deliver comprehensive cloud solutions will be a key aspect of your role. You will oversee cloud infrastructure management, including monitoring, maintenance, and scaling of cloud resources, and ensure compliance with industry standards and regulatory requirements. Implementing data governance policies and practices to ensure high data quality, integrity, and security across all cloud platforms will also be part of your responsibilities. Staying current with emerging technologies and industry trends, identifying and implementing process improvements to enhance efficiency, quality, and scalability of data engineering and cloud operations, and troubleshooting and resolving complex technical issues related to data pipelines and cloud infrastructure will be crucial in driving innovation in the role of a Staff Data Engineer at Trimble. 
Your qualifications should include a minimum of 6 years of proven experience as a senior data and cloud engineer or similar role, extensive experience with AWS and Azure cloud platforms, strong understanding of ETL processes, data warehousing, and big data technologies, proficiency in SQL, Python, and other relevant programming languages, and experience with infrastructure as code (IaC) tools such as Terraform, CloudFormation, or ARM templates. Knowledge of containerization and orchestration, cloud cost management and optimization strategies, CI/CD pipelines, and DevOps practices will be beneficial. Excellent leadership, communication, and interpersonal skills, along with the ability to work in a fast-paced, dynamic environment, strong problem-solving and analytical skills, and familiarity with data visualization tools will be valuable assets in this role.

Posted 2 months ago

Apply

3.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

As a Business Intelligence Developer, your responsibilities will include designing and implementing SSAS Tabular Models to support reporting and analytics needs. You will be developing complex DAX queries to enhance data models and improve performance. Collaboration with stakeholders to gather requirements and translate them into technical specifications will be a key aspect of your role. It is essential to ensure data integrity and accuracy through rigorous testing and validation processes. Additionally, providing ongoing support and maintenance for existing BI solutions and staying updated with the latest BI trends and technologies to continuously improve solutions will be part of your daily tasks. To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field. You must have 3-8 years of experience in business intelligence development. Proficiency in SSAS Tabular Model and DAX is required. A strong understanding of data warehousing concepts and ETL processes is crucial for success in this role. Experience with BI tools and reporting solutions will be beneficial. Key Skills: - Reporting solutions - Data warehousing - ETL processes - SSAS - SSAS Tabular Models - Tabular Models - BI tools - Reporting and analytics - DAX

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As an Analyst 3 in Data Analytics & Business Intelligence at Comcast, your role is vital in transforming data into actionable insights that drive strategic decision-making. You will be responsible for analyzing trends, interpreting data, and identifying opportunities that directly contribute to the success of Comcast. Working collaboratively across functions, you will craft compelling narratives from numerical data that influence key business decisions and shape the direction of the company. Success in this role requires a set of key traits including being a good listener, problem solver, organized, collaborative, perceptive, and analytical. These attributes will enable you to excel in delivering high-quality, actionable insights to stakeholders through your analytical, communication, and critical thinking skills. Your core responsibilities will involve designing, developing, and maintaining complex SQL queries and scripts for report generation. You will work closely with relational databases, particularly Postgres, to collaborate with cross-functional teams in translating reporting requirements into technical solutions. Additionally, you will develop and maintain Tableau dashboards and visualizations to effectively present data insights while ensuring data privacy and security through proper segmentation and access controls. Moreover, you will optimize and troubleshoot existing reports and data pipelines for performance and accuracy, participate in ETL processes, and analyze data trends to provide valuable insights to stakeholders. Documentation of reporting processes, data models, and workflows will be essential for future reference and continuity. To be successful in this role, you should possess 5 to 7 years of experience in report generation and data analysis, strong proficiency in SQL and relational databases, knowledge of data modeling, privacy, and segmentation techniques, as well as hands-on experience with Tableau. 
Familiarity with ServiceNow and ETL processes is advantageous, along with strong analytical, problem-solving, and communication skills. Preferred qualifications include experience with ServiceNow CMDB, other BI tools like Power BI or Looker, knowledge of scripting languages such as Python or R, and familiarity with cloud platforms like AWS or Azure. Your commitment to understanding Comcast's operating principles, prioritizing customer experience, continuous learning, teamwork, and driving results will be crucial in excelling in this role. Comcast values diversity, inclusion, and integrity in all aspects of its operations and expects its employees to uphold these principles while delivering exceptional service to customers, investors, and communities.
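The report-generation work this role describes boils down to grouped SQL queries over relational tables. A toy sketch, with SQLite standing in for Postgres and an invented ticket table (names and data are illustrative only):

```python
import sqlite3

# Illustrative only: SQLite stands in for Postgres, and the ticket
# table and its columns are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tickets (id INTEGER, team TEXT, resolved INTEGER);
    INSERT INTO tickets VALUES (1, 'network', 1), (2, 'network', 0), (3, 'billing', 1);
""")

# A typical report query: ticket volume and resolution rate per team.
rows = conn.execute("""
    SELECT team,
           COUNT(*) AS total,
           ROUND(AVG(resolved), 2) AS resolution_rate
    FROM tickets
    GROUP BY team
    ORDER BY team
""").fetchall()
print(rows)  # [('billing', 1, 1.0), ('network', 2, 0.5)]
```

The same aggregate would then feed a Tableau data source rather than be printed, but the query shape is the same.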

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

Join Teleparty, a premium remote entertainment service leading the charge in making social TV a global cultural phenomenon bigger than the movies. With over 10 million streamers who have installed our Netflix-focused browser extension - Teleparty, we operate the largest social TV service in the world for millions of streamers, all for free. We are committed to building a workforce that reflects our globally diverse audience & being a technology leader in the new social TV category. We're looking for great people who are creative thinkers, self-motivators and impact-makers with a passion to help us shape the future of TV. About the Role We are seeking an experienced and innovative Data/Product Analyst to join our team. The ideal candidate will play a crucial role in driving product direction through data-driven insights and collaborate closely with various teams to enhance our data pipeline and product offerings. Product Direction - Analyze existing data to uncover trends and opportunities for new product features and improvements - Develop and implement sophisticated analytical models to guide product strategy - Present data-driven recommendations to stakeholders and leadership Data Analysis and Visualization - Utilize advanced SQL skills to query and manipulate complex datasets - Create insightful reports and interactive visualizations using tools like Redash - Develop and maintain dashboards to monitor key performance indicators Data Requirements Elicitation - Serve as a data expert in customer-facing roles, addressing complex data-oriented questions - Translate technical findings into clear, actionable insights for non-technical audiences - Collaborate with customer success teams to identify and resolve data-related issues Data Pipeline Enhancement - Work closely with software development teams to identify opportunities for enriching the existing data pipeline - Propose and implement new data collection methods to support product direction initiatives - Ensure
data quality and integrity throughout the pipeline Qualifications - Bachelor's or Master's degree in Data Science, Statistics, Computer Science, or a related field - 5+ years of experience in data analysis/data science/product analysis roles - Proven track record of using data to drive product decisions and strategy - Expert-level proficiency in SQL and experience with data visualization tools (e.g., Redash, Tableau, Power BI) - Strong programming skills in Python or R - Experience in customer-facing roles and ability to communicate complex data concepts to non-technical audiences - Familiarity with data pipeline architectures and ETL processes - Excellent problem-solving and analytical skills - Strong communication and collaboration abilities Preferred Qualifications - Experience in the entertainment and video streaming sectors - Familiarity with machine learning techniques and their applications in product development - Knowledge of big data technologies (e.g., Hadoop, Spark) - Experience with cloud-based data platforms (e.g., AWS, Google Cloud, Azure) If you're passionate about leveraging data to drive product innovation and enjoy working in a collaborative, fast-paced environment, we'd love to hear from you!
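The KPI-dashboard work a product analyst does usually starts with a funnel metric over raw events. A minimal sketch under invented assumptions (the event names 'install' and 'first_watch' are hypothetical, not Teleparty's actual schema):

```python
from collections import defaultdict

# Hypothetical product-analytics KPI: conversion from 'install' to
# 'first_watch'. Event names and data are invented for illustration.
events = [
    ("u1", "install"), ("u1", "first_watch"),
    ("u2", "install"),
    ("u3", "install"), ("u3", "first_watch"),
]

by_user = defaultdict(set)
for user, event in events:
    by_user[user].add(event)

installed = [u for u, evs in by_user.items() if "install" in evs]
converted = [u for u in installed if "first_watch" in by_user[u]]
conversion_rate = len(converted) / len(installed)
print(f"{conversion_rate:.0%}")  # 67%
```

In practice the same computation would be a SQL query feeding a Redash chart; the Python version just makes the funnel logic explicit.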

Posted 2 months ago

Apply

5.0 - 8.0 years

5 - 15 Lacs

Pune

Work from Office

Our goal is to reduce complexity and enhance transparency for our internal stakeholders, ensuring a scalable foundation for future growth. Our technology stack includes SAP BW, SAP Datasphere, Databricks, Microsoft Fabric, Microsoft Power Platform, and Power BI. Background and Skills To be successful in this role, you should hold a degree in Finance, Sales & Marketing, IT, Business Administration, Engineering, or a related field that enables you to approach tasks with a structured, analytical, and data-driven mindset. Previous experience with Finance processes, ERP systems (particularly SAP), ERP data models, Microsoft tools, and Power BI, combined with strong business acumen and exceptional collaboration skills, is essential. Role & responsibilities Gather, analyze, and document business requirements clearly to guide Data Engineers in developing solutions aligned with business needs and technological capabilities Evaluate existing data processes and solutions, identifying opportunities for optimization and complexity reduction Engage with stakeholders to gather new requirements and clearly communicate process and data gaps, advocating for improvements and corrective actions Perform data modeling exercises to ensure a common data model is achieved across the various source data systems Cleanse and maintain data quality, actively supporting business stakeholders in continuous data quality improvement initiatives Ensure the development and maintenance of high-quality BI logic and reporting standards, delivering scalable, standardized outputs understandable by all stakeholders Desired profile: Strong analytical mindset with excellent data querying and analytical skills Proactive, solution-oriented, and pragmatic approach to problem-solving. Curiosity and openness to learning new methods and tools Independent and efficient work habits, demonstrating persistence and accountability.
Proactive ownership in driving continuous improvements, considering broader organizational impacts Flexibility and dedication to team success, willing to go above and beyond as necessary. Key Qualifications: 5+ years of experience in data analysis, business intelligence, or a similar role Strong proficiency in data analysis tools such as Excel, SQL, Power BI, or Tableau Extensive knowledge of statistical analysis, data modeling, and data visualization techniques. Hands-on experience with ETL processes Ability to write complex DAX measures, KPIs, and perform advanced calculations in Power BI Demonstrated experience with manufacturing analytics.

Posted 2 months ago

Apply

4.0 - 6.0 years

4 - 7 Lacs

Chennai, Tamil Nadu, India

On-site

Key Responsibilities: Develop, enhance, and maintain ETL jobs using IBM DataStage (Parallel and Server Jobs). Analyze business and technical requirements to design scalable ETL solutions. Integrate data from multiple sources including relational databases, flat files, and cloud platforms. Optimize ETL jobs for performance and reliability. Perform data validation and unit testing of ETL processes. Troubleshoot and resolve ETL job failures and performance issues. Work closely with business analysts, data architects, and QA teams. Ensure adherence to data quality, governance, and security standards. Document ETL processes, data flows, and support procedures. Provide production support and participate in on-call rotations if required. Qualifications and Requirements: Bachelor's degree in Computer Science, Information Systems, or related field. 3+ years of hands-on experience with IBM InfoSphere DataStage (v11.x or higher preferred). Strong SQL skills and familiarity with relational databases like DB2, Oracle, SQL Server, or Teradata. Experience with data warehousing concepts, star/snowflake schemas, and dimensional modeling. Solid understanding of ETL performance tuning and debugging techniques. Familiarity with job scheduling tools (e.g., Autosys, Control-M). Desirable Skills: Knowledge of DataStage Administrator functions and DataStage Director. Experience integrating DataStage with big data platforms or cloud (AWS, Azure, GCP). Exposure to metadata management, data lineage, and data governance tools. Experience with other ETL tools or scripting languages (Shell, Python). IBM DataStage certification is a plus.
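The data validation and unit testing this posting asks for can be sketched in plain Python. The checks and record layout below are invented for illustration; in a real DataStage project the equivalent checks would run inside the tool or as SQL against the target database:

```python
# Minimal post-load validation sketch: source/target row counts must
# match and required fields must be populated. Layout is illustrative.

def validate_load(source_rows, target_rows, required_fields):
    """Return a list of human-readable validation failures (empty = pass)."""
    failures = []
    if len(source_rows) != len(target_rows):
        failures.append(
            f"row count mismatch: {len(source_rows)} source vs {len(target_rows)} target")
    for i, row in enumerate(target_rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                failures.append(f"row {i}: missing required field '{field}'")
    return failures

source = [{"id": 1}, {"id": 2}]
target = [{"id": 1, "name": "a"}, {"id": 2, "name": None}]
problems = validate_load(source, target, required_fields=["id", "name"])
print(problems)  # one failure: row 1 is missing 'name'
```

Returning a list of failures rather than raising on the first one makes it easy to log everything wrong with a load in a single run.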

Posted 2 months ago

Apply

2.0 - 6.0 years

3 - 10 Lacs

Mumbai, Maharashtra, India

On-site

Key Responsibilities: Develop and maintain ETL workflows and jobs using Talend Open Studio for Data Integration. Extract data from various sources, transform according to business rules, and load into target systems. Work with stakeholders to gather requirements and convert them into technical ETL designs. Optimize Talend jobs for performance, error handling, and reusability. Ensure data quality and integrity through validation and transformation logic. Collaborate with BI, database, and application teams for seamless data flow across systems. Document ETL processes, data mappings, and technical specifications. Monitor daily ETL jobs and troubleshoot issues in production environments. Key Skills Required: Strong hands-on experience with Talend Open Studio (DI/Big Data preferred) Solid understanding of ETL concepts, data modeling, and data warehousing Proficiency in writing complex SQL queries (Oracle, SQL Server, MySQL, etc.) Familiarity with data integration from APIs, flat files, XML/JSON, and cloud sources Basic understanding of Java (used in Talend jobs) Experience in performance tuning and error handling in Talend jobs Exposure to version control systems (Git, SVN) and scheduling tools (Control-M, Cron)
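Talend itself is a visual ETL tool, but the extract/transform/load shape it implements can be illustrated in a few lines of Python. The source payload and the business rule below are invented purely for illustration:

```python
import json

# Toy end-to-end ETL pass. The order payload and the "shipped orders
# only, amounts in cents" rule are hypothetical.

def extract(raw_json: str):
    return json.loads(raw_json)["orders"]

def transform(orders):
    # Hypothetical business rule: keep shipped orders, normalise amounts to cents.
    return [{"order_id": o["id"], "amount_cents": round(o["amount"] * 100)}
            for o in orders if o["status"] == "shipped"]

def load(rows, target: list):
    target.extend(rows)          # a real job would write to a database table
    return len(rows)

raw = ('{"orders": [{"id": 1, "amount": 9.99, "status": "shipped"},'
       ' {"id": 2, "amount": 5.0, "status": "cancelled"}]}')
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)  # 1 [{'order_id': 1, 'amount_cents': 999}]
```

Keeping extract, transform, and load as separate functions mirrors how Talend components are wired, and makes each stage unit-testable on its own.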

Posted 2 months ago

Apply

4.0 - 6.0 years

8 - 14 Lacs

Mumbai, Maharashtra, India

On-site

We are seeking a skilled Cognos Data Integration Consultant to join our team in India. The ideal candidate will have a strong background in data integration processes, specifically using IBM Cognos Data Integration tools, and will be responsible for designing, developing, and maintaining ETL workflows to support our data management initiatives. Responsibilities Design, develop, and maintain data integration processes using IBM Cognos Data Integration tools. Collaborate with business analysts and stakeholders to gather requirements and translate them into technical specifications. Perform data profiling and data quality assessment to ensure high-quality data integration. Create and manage ETL processes to extract, transform, and load data from various sources into target systems. Optimize and troubleshoot data integration workflows and processes for performance and efficiency. Document data integration processes, technical specifications, and system configurations. Skills and Qualifications 4-6 years of experience in data integration or ETL processes, preferably using IBM Cognos Data Integration tools. Strong knowledge of SQL and experience with relational databases like Oracle, SQL Server, or MySQL. Experience in data modeling and designing data warehouses. Familiarity with data governance and data quality concepts. Strong analytical and problem-solving skills with attention to detail. Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
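The data profiling and quality assessment mentioned above typically starts with simple per-column statistics. A small sketch with invented column names (real profiling would run against the source database, not in-memory rows):

```python
# Per-column profiling sketch: null rate and distinct-value count.
# The rows and column names are invented for illustration.
rows = [
    {"customer_id": 1, "region": "APAC"},
    {"customer_id": 2, "region": None},
    {"customer_id": 3, "region": "APAC"},
]

def profile(rows, column):
    """Return basic quality metrics for one column of a row set."""
    values = [r[column] for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "null_rate": round(1 - len(non_null) / len(values), 3),
        "distinct": len(set(non_null)),
    }

print(profile(rows, "region"))  # {'null_rate': 0.333, 'distinct': 1}
```

Metrics like these feed directly into the data quality assessment the posting describes: a high null rate or an unexpected distinct count flags a column before integration work begins.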

Posted 2 months ago

Apply

4.0 - 6.0 years

4 - 7 Lacs

Mumbai, Maharashtra, India

On-site

Key Responsibilities: Develop, enhance, and maintain ETL jobs using IBM DataStage (Parallel and Server Jobs). Analyze business and technical requirements to design scalable ETL solutions. Integrate data from multiple sources including relational databases, flat files, and cloud platforms. Optimize ETL jobs for performance and reliability. Perform data validation and unit testing of ETL processes. Troubleshoot and resolve ETL job failures and performance issues. Work closely with business analysts, data architects, and QA teams. Ensure adherence to data quality, governance, and security standards. Document ETL processes, data flows, and support procedures. Provide production support and participate in on-call rotations if required. Qualifications and Requirements: Bachelor's degree in Computer Science, Information Systems, or related field. 3+ years of hands-on experience with IBM InfoSphere DataStage (v11.x or higher preferred). Strong SQL skills and familiarity with relational databases like DB2, Oracle, SQL Server, or Teradata. Experience with data warehousing concepts, star/snowflake schemas, and dimensional modeling. Solid understanding of ETL performance tuning and debugging techniques. Familiarity with job scheduling tools (e.g., Autosys, Control-M). Desirable Skills: Knowledge of DataStage Administrator functions and DataStage Director. Experience integrating DataStage with big data platforms or cloud (AWS, Azure, GCP). Exposure to metadata management, data lineage, and data governance tools. Experience with other ETL tools or scripting languages (Shell, Python). IBM DataStage certification is a plus.

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

You are an experienced Oracle Cloud Data Migration Consultant who will be responsible for delivering end-to-end data migration solutions across various Oracle Cloud environments. Your expertise in ETL processes, data transformation, and integration methodologies will be key in ensuring seamless transitions for enterprise systems. In this role, you will develop, test, and support Oracle Cloud data migration processes, including the development of SQL and PL/SQL components for ETL processes, design and execution of data migration strategies, and utilization of Oracle Cloud tools such as FBDI and Cloud-based ETL solutions. Integration using REST and SOAP web services, APIs, and Oracle Cloud-specific communication frameworks will also be part of your responsibilities. You are expected to have at least 3 years of experience in Oracle Cloud data migration projects and possess a strong understanding of Oracle ERP and HCM application processes and technical data dictionaries. Proficiency in PL/SQL, SQL, and ETL processes for Oracle Cloud applications, as well as experience with Oracle FBDI and integration technologies, are required. Additionally, you should have knowledge of data security, governance, and compliance considerations in cloud environments. Your responsibilities will include developing and executing data migration components aligned with functional and technical requirements, conducting unit testing and validation of migrated data, providing support for existing data structures, and collaborating with Oracle Support and stakeholders to resolve technical challenges. You should have excellent communication skills and the ability to work in a fast-paced, multi-tasking environment with periods of high pressure. Furthermore, you should be flexible to support new technologies and evolving data migration approaches. 
If you are looking to work in a dynamic environment where you can leverage your expertise in Oracle Cloud data migration to deliver innovative solutions and maximize the benefits of Oracle investments for clients, this role is perfect for you. Join us in our mission to deliver superior solutions with lasting value throughout our clients' Oracle journey.

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

The Applications Development Technology Lead Analyst position at our organization entails taking charge of establishing and executing new or updated application systems and programs in collaboration with the Technology team. Your main responsibility will be to lead activities related to applications systems analysis and programming. In this role, we are looking for a skilled Database Architect who can contribute to designing, developing, and maintaining our upcoming big data platform. Your key focus will be on shaping our data strategy and ensuring the scalability, performance, and reliability of our data infrastructure. Proficiency in distributed systems, Hadoop, Spark, NoSQL databases, and either Python or Scala is essential for this position. Your responsibilities will include designing and implementing scalable big data solutions, developing and managing data models and ETL processes, optimizing database performance, and ensuring high availability. Collaboration with data engineers, scientists, and analysts to grasp data requirements and offer technical guidance is crucial. Additionally, you will be expected to evaluate new technologies, uphold data security and governance policies, troubleshoot database issues, document database architecture decisions, and mentor junior team members. To qualify for this role, you should hold a Bachelor's degree in Computer Science or a related field, along with a total of 12+ years of experience, including at least 5 years as a Data Architect. Strong knowledge of database design principles, data modeling techniques, and extensive experience with Hadoop, Spark, Kafka, and related technologies are necessary. Proficiency in Python or Scala, experience with NoSQL databases such as Cassandra and MongoDB, as well as excellent communication and collaboration skills are also required. 
Any additional experience with data warehousing, business intelligence tools, data governance, security best practices, or relevant certifications in technologies like Hadoop and Spark would be considered a bonus. A Master's degree is preferred but not mandatory. Please note that this job description is a general overview of the role's responsibilities, and additional duties may be assigned as needed.

Posted 2 months ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

As a Data Migration Specialist at Hitachi Energy, you will play a crucial role in developing and executing comprehensive data migration strategies. Your primary responsibilities will include analyzing legacy systems, designing and implementing ETL processes using SAP BusinessObjects Data Services (BODS), and optimizing BODS jobs for performance and reliability. You will utilize your functional expertise in SAP SD, MM, and PP modules to ensure accurate data alignment and drive data quality improvement initiatives to enhance business process efficiency and analytics. To excel in this role, you should possess a Bachelor's degree in Computer Science, IT, or a related field, along with a minimum of 8 years of experience in SAP data migration, including familiarity with SAP S/4HANA. Proficiency in SAP BusinessObjects Data Services (BODS) and data migration tools is essential, as well as strong functional knowledge of SAP SD, MM, and PP modules. Your skills in data analysis, cleansing, and transformation techniques will be critical in conducting data validation, reconciliation, and testing to ensure data integrity. Successful candidates will demonstrate excellent problem-solving, analytical, and communication skills. The ability to work independently and collaboratively in team environments is paramount. While SAP certification and project management experience are considered advantageous, they are not mandatory. Join Hitachi Energy, a global technology leader dedicated to advancing a sustainable energy future for all. Our innovative solutions and services cater to customers in the utility, industry, and infrastructure sectors, driving the digital transformation necessary to accelerate the energy transition towards a carbon-neutral future. With a diverse workforce of around 45,000 employees across 90 countries, we embrace the power of diversity and collaboration to foster great innovation. 
Apply today to become part of our global team and contribute to shaping a brighter tomorrow.

Posted 2 months ago

Apply

2.0 - 6.0 years

0 Lacs

chennai, tamil nadu

On-site

The company is seeking highly motivated and passionate individuals to join the team with tremendous growth potential. If you are a motivated individual with a minimum of 2 years of experience in database systems design, this opportunity might be for you. Your primary responsibilities will include preparing designs for database systems, recommending improvements for performance, developing physical and logical data models, designing ETL processes, creating data models, and performing tests on data. To excel in this role, you should have experience in data modeling with enterprise applications and a good understanding of user requirements, relational databases, JSON data models, data warehouses, star schema, and basic fact table and dimension table techniques. Additionally, hands-on skills in ETL processes, effective troubleshooting, and handling high-volume data loading processes are essential. If you possess good analytical, planning, and implementation skills along with expertise in MS SQL Server (SSIS, SSMS, SSAS, SSRS), we encourage you to apply for this position. Please submit your updated resume in MS Word format to hr.snss@southnests.com. All personal information collected from unsuccessful applicants will be retained for one year for potential future opportunities and then removed. If you wish to withdraw your consent before the specified timeframe, please contact hr.snss@southnests.com. Location: India, Chennai.
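The fact table and dimension table techniques this posting mentions boil down to joining a narrow fact table to descriptive dimensions and aggregating. A toy star-schema sketch, with SQLite standing in for SQL Server and invented table names:

```python
import sqlite3

# Toy star schema: one fact table plus one dimension, queried with a
# typical fact-to-dimension join. Names and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_key INTEGER, quantity INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales  VALUES (1, 2, 20.0), (1, 1, 10.0), (2, 1, 60.0);
""")

rows = conn.execute("""
    SELECT d.category, SUM(f.quantity) AS units, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # [('books', 3, 30.0), ('games', 1, 60.0)]
```

The fact table stores only keys and measures; descriptive attributes live in the dimension, which is what keeps the fact table compact under high-volume loads.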

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

