Jobs
Interviews

1055 ETL Processes Jobs - Page 24

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

0.0 - 4.0 years

0 Lacs

Hyderabad, Telangana

On-site

A career within Financial Markets Business Advisory services will provide you with the opportunity to contribute to a variety of audit, regulatory, valuation, and financial analyses services to design solutions that address clients' complex accounting and financial reporting challenges, as well as their broader business issues. To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. The PwC Professional, our global leadership development framework, gives us a single set of expectations across our lines, geographies, and career paths. It provides transparency on the skills required as individuals to be successful and progress in our careers, now and in the future.

Responsibilities

As an Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:

- Invite and give in-the-moment feedback in a constructive manner.
- Share and collaborate effectively with others.
- Identify and make suggestions for improvements when problems and/or opportunities arise.
- Handle, manipulate, and analyze data and information responsibly.
- Follow risk management and compliance procedures.
- Keep up to date with developments in the area of specialism.
- Communicate confidently in a clear, concise, and articulate manner, verbally and in the materials produced.
- Build and maintain an internal and external network.
- Seek opportunities to learn about how PwC works as a global network of firms.
- Uphold the firm's code of ethics and business conduct.

We are seeking a highly motivated Data Engineer - Associate to join our dynamic team.
The ideal candidate will have a strong foundation in data engineering, particularly with Python and SQL, and exposure to cloud technologies and data visualization tools such as Power BI, Tableau, or QuickSight. The Data Engineer will work closely with data architects and business stakeholders to support the design and implementation of data pipelines and analytics solutions. This role offers an opportunity to grow technical expertise in cloud and data solutions, contributing to projects that drive business insights and innovation.

Key Responsibilities

Data Engineering:
- Develop, optimize, and maintain data pipelines and workflows to ensure efficient data integration from multiple sources.
- Use Python and SQL to design and implement scalable data processing solutions.
- Ensure data quality and consistency throughout data transformation and storage processes.
- Collaborate with data architects and senior engineers to build data solutions that meet business and technical requirements.

Cloud Technologies:
- Work with cloud platforms (e.g., AWS, Azure, or Google Cloud) to deploy and maintain data solutions.
- Support the migration of on-premises data infrastructure to the cloud environment when needed.
- Assist in implementing cloud-based data storage solutions, such as data lakes and data warehouses.

Data Visualization:
- Provide data to business stakeholders for visualizations using tools such as Power BI, Tableau, or QuickSight.
- Collaborate with analysts to understand their data needs and optimize data structures for reporting.

Collaboration and Support:
- Work closely with cross-functional teams, including data scientists and business analysts, to support data-driven decision-making.
- Troubleshoot and resolve issues in the data pipeline and ensure timely data delivery.
- Document processes, data flows, and infrastructure for team knowledge sharing.

Required Skills and Experience
- 0+ years of experience in data engineering, working with Python and SQL.
- Exposure to cloud platforms such as AWS, Azure, or Google Cloud is preferred.
- Familiarity with data visualization tools (e.g., Power BI, Tableau, QuickSight) is a plus.
- Basic understanding of data modeling, ETL processes, and data warehousing concepts.
- Strong analytical and problem-solving skills, with attention to detail.

Qualifications
- Bachelor's degree in Computer Science, Data Science, Information Technology, or related fields.
- Basic knowledge of cloud platforms and services is advantageous.
- Strong communication skills and the ability to work in a team-oriented environment.
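The pipeline work described in this posting follows a standard extract-transform-load shape. As a minimal illustrative sketch (the table names, columns, and cleaning rules below are invented for the example, not taken from any employer's system), using Python's built-in sqlite3 module:

```python
import sqlite3

# Minimal ETL sketch: extract raw rows, transform (clean and normalize),
# then load into a curated table. Schema and rules are hypothetical.

def run_etl(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()
    # Extract: pull raw order records.
    cur.execute("SELECT id, customer, amount FROM raw_orders")
    rows = cur.fetchall()

    # Transform: drop rows with missing amounts, normalize customer names.
    cleaned = [
        (rid, customer.strip().title(), round(amount, 2))
        for rid, customer, amount in rows
        if amount is not None
    ]

    # Load: idempotent upsert into the curated table.
    cur.executemany(
        "INSERT OR REPLACE INTO orders (id, customer, amount) VALUES (?, ?, ?)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (id INTEGER, customer TEXT, amount REAL)")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, "  alice smith ", 10.0), (2, "BOB JONES", None), (3, "carol", 7.5)],
    )
    print(run_etl(conn))  # row with a NULL amount is dropped, so 2 rows load
```

Real pipelines would add logging, incremental loads, and data-quality checks, but the extract/transform/load separation stays the same.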

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

The AVP, Business and Customer Analytics (L10) at Synchrony plays a crucial role in delivering high-impact projects by collaborating with various analytics teams to solve key business problems using data-driven solutions. As part of the India Analytics Hub, you will work on projects that enable the company's growth and profitability through advanced analytics techniques.

Your responsibilities will include supporting American Eagle business stakeholders with data-driven insights, leading analytics projects from inception to delivery, deriving actionable recommendations from data insights, and ensuring project timelines and accuracy are met. You will also contribute to internal initiatives, handle multiple projects, and demonstrate strong project management skills.

The ideal candidate should hold a degree in Statistics, Mathematics, Economics, Engineering, or a related quantitative field with at least 4 years of hands-on experience in analytics or data science. Proficiency in SQL/SAS programming, business intelligence tools like Tableau and Power BI, Google Cloud Platform, ETL processes, and big data analytics is required. Experience in campaign sizing, customer targeting, and the financial services industry will be beneficial. Desired skills include working with Python/R, big data technologies like Hadoop/Hive/GCP, and report automation. Effective communication skills, the ability to lead strategic projects independently, and the ability to manage competing priorities are essential for this role.

The role offers Enhanced Flexibility and Choice in work timings, requiring availability between 06:00 AM and 11:30 AM Eastern Time for meetings with global teams. Internal applicants should ensure they meet the eligibility criteria, inform their manager, update their professional profile, and upload an updated resume in the application process.
If you are a motivated individual with a passion for analytics and a desire to drive business growth through data-driven solutions, this role provides an exciting opportunity to make a significant impact within the organization.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As an Analytics Manager specializing in Power BI, Python, and Tableau within the Insurance Domain, you will play a crucial role in designing, developing, and implementing Power BI dashboards. Your expertise in Power BI, Python, Tableau, and SQL is essential for this role. You will be responsible for leading, mentoring, and developing a team of data analysts and data scientists.

Your key responsibilities will include providing strategic direction for analytical projects, defining and implementing the company's data analytics strategy, and conducting complex data analysis to identify trends and patterns. You will oversee the development of interactive dashboards, reports, and visualizations to make data insights easily accessible to stakeholders. Ensuring data integrity and consistency across systems, collaborating with cross-functional teams, and staying current with the latest data analytics trends and technologies are also important aspects of this role. Additionally, you will lead data-driven projects from initiation to execution, managing timelines, resources, and risks effectively.

To be successful in this role, you should have a Bachelor's degree in Data Science, Statistics, Computer Science, Engineering, or a related field, with at least 10 years of experience in data analysis and 2 years in a managerial or leadership position. Proficiency in data analysis and visualization tools such as SQL, Python, R, Tableau, and Power BI is required. Strong knowledge of data modeling, ETL processes, and database management, along with exceptional problem-solving and critical thinking skills, is essential. Effective communication of complex technical concepts to non-technical stakeholders, proven experience in managing and growing a team of data professionals, strong project management skills, and domain knowledge in insurance will be advantageous for this role.
If you are looking for a challenging opportunity to lead data analytics projects, collaborate with diverse teams, and drive business insights within the Insurance Domain, this role is ideal for you.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

You are an experienced Oracle EPM Planning Consultant responsible for supporting the implementation, enhancement, and optimization of Oracle EPM solutions. Your expertise lies in Oracle Planning and Budgeting Cloud Service (PBCS) / Enterprise Planning and Budgeting Cloud Service (EPBCS), financial modeling, forecasting, and integration with other enterprise applications. Strong skills in Hyperion Planning, Essbase, data management, and scripting (Groovy, MDX, SQL) are preferred. Your role involves hands-on configuration, stakeholder collaboration, and troubleshooting for seamless planning and reporting processes.

Your responsibilities include leading the design, configuration, and implementation of Oracle EPBCS solutions. You will develop and customize EPBCS modules such as Financials, Workforce, Capital, Projects, and Strategic Modeling. Furthermore, you will collaborate with finance teams to understand planning, budgeting, and forecasting needs, and design and implement driver-based planning, rolling forecasts, and scenario analysis. You will develop and optimize business rules, Groovy scripts, and calculation scripts for dynamic planning processes.

Data integration and management tasks include configuring Data Management (DM/FDMEE) for seamless data loads from ERP and other systems, developing ETL processes, and ensuring data accuracy and consistency across financial planning models. Performance optimization and troubleshooting are key aspects of your role, where you will optimize EPBCS models, Essbase cubes, and calculation scripts for improved performance.

Additionally, you will conduct training sessions, create user documentation, and provide post-implementation support to ensure smooth user adoption and compliance with best practices. Gathering business requirements, translating them into EPBCS functional designs, and collaborating with finance, IT, and business teams to align EPBCS with enterprise financial processes are also part of your responsibilities.
You will manage project deliverables, timelines, and risk mitigation strategies to ensure successful project outcomes.

To qualify for this position, you should have a Bachelor's degree in finance, accounting, business, computer science, information systems, or a related field. Additionally, 3 to 6 years of hands-on experience in implementing Oracle EPBCS / PBCS solutions is required. Strong technical skills in configuring EPBCS modules and designing business rules, Groovy scripts, and calculation scripts, as well as proficiency in Smart View, Forms, Task Lists, Data Management (DM/FDMEE), and Essbase cube optimization, are necessary. Furthermore, you should possess deep functional and domain knowledge in financial planning, budgeting, forecasting, and variance analysis, along with experience in full-cycle EPBCS implementations and project management methodologies. Strong analytical, communication, stakeholder management, and problem-solving skills are essential for this role.

Inoapps focuses on delivering innovative Oracle On-Premises and Oracle Cloud applications and technology solutions to clients. By choosing Inoapps, you will receive support throughout your Oracle journey, working in partnership to deliver superior solutions with lasting value.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

You should have at least 4 years of experience in Power BI and Tableau, specializing in data modeling for reporting and data warehouses. Your expertise should include a thorough understanding of the BI process and excellent communication skills.

Your responsibilities will involve Tableau Server management, including installation, configuration, and administration to maintain consistent availability and performance. You must showcase your ability to design, develop, and optimize data models for large datasets and complex reporting requirements. Strong analytical and debugging skills are essential to identify, analyze, and resolve issues within Power BI reports, SQL code, and data for accurate and efficient performance.

Proficiency in DAX and Power Query, along with advanced knowledge of data modeling concepts, is required. Additionally, you should possess strong SQL skills for querying, troubleshooting, and data manipulation. Security implementation is crucial, as you will be responsible for managing user permissions and roles to ensure data security and compliance with organizational policies. A good understanding of ETL processes and in-depth knowledge of Power BI Service, Tableau Server, and Tableau Desktop are expected. Your familiarity with workspaces, datasets, dataflows, and security configurations will be beneficial in fulfilling your role effectively.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

You are a strategic thinker passionate about driving solutions in financial analysis. You have found the right team.

As a Data Domain Architect Lead - Vice President within the Finance Data Mart team, you will be responsible for overseeing the design, implementation, and maintenance of data marts to support our organization's business intelligence and analytics initiatives. You will collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications. Leading the development of robust data models to ensure data integrity and consistency, you will oversee the implementation of ETL processes to populate data marts with accurate and timely data. Your role will involve optimizing data mart performance and scalability, ensuring high availability and reliability, while mentoring and guiding a team of data mart developers.

Responsibilities:
- Lead the design and development of data marts, ensuring alignment with business intelligence and reporting needs.
- Collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications.
- Develop and implement robust data models to support data marts, ensuring data integrity and consistency.
- Oversee the implementation of ETL processes to populate data marts with accurate and timely data.
- Optimize data mart performance and scalability, ensuring high availability and reliability.
- Monitor and troubleshoot data mart issues, providing timely resolutions and improvements.
- Document data mart structures, processes, and procedures, ensuring knowledge transfer and continuity.
- Mentor and guide a team of data mart developers if needed, fostering a collaborative and innovative work environment.
- Stay updated with industry trends and best practices in data warehousing, data modeling, and business intelligence.
Required qualifications, capabilities, and skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Extensive experience in data warehousing, data mart development, and ETL processes.
- Strong expertise in Data Lake, data modeling, and database management systems (e.g., Databricks, Snowflake, Oracle, SQL Server).
- Leadership experience, with the ability to manage and mentor a team.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to work effectively with cross-functional teams.

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data solutions (e.g., AWS, Azure, Google Cloud).
- Familiarity with advanced data modeling techniques and tools.
- Knowledge of data governance, data security, and compliance practices.
- Experience with business intelligence tools (e.g., Tableau, Power BI).

Candidates must be able to physically work in our Bengaluru office in the evening shift from 2 PM to 11 PM IST. The specific schedule will be determined and communicated by direct management.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Associate Manager, Salesforce Data Cloud will play a crucial role in leveraging Salesforce Data Cloud to transform how the organization uses customer data. This position, located in Hyderabad, within the Data Cloud Business Enablement Team, focuses on building, managing, and optimizing the data unification strategy to empower business intelligence, marketing automation, and customer experience initiatives.

Key Responsibilities:
- Managing data models within Salesforce Data Cloud to ensure optimal data harmonization across multiple sources.
- Maintaining data streams from various platforms into Data Cloud, such as CRM, SFMC, MCP, Snowflake, and third-party applications.
- Developing and optimizing SQL queries to transform raw data into actionable insights.
- Building and maintaining data tables, calculated insights, and segments for organization-wide use.
- Collaborating with marketing teams to translate business requirements into effective data solutions.
- Monitoring data quality and implementing processes to ensure accuracy and reliability.
- Creating documentation for data models, processes, and best practices.
- Providing training and support to business users on leveraging Data Cloud capabilities.

Essential Requirements:
- Bachelor's degree in Computer Science, Information Systems, or related field.
- Salesforce Data Cloud certification preferred.
- 3+ years of experience working with Salesforce platforms.
- Previous work with Customer Data Platforms (CDPs).
- Experience with Tableau CRM or other visualization tools.
- Background in marketing technology or customer experience initiatives.
- Salesforce Administrator or Developer certification.
- Familiarity with Agile ways of working, Jira, and Confluence.

Desired Requirements:
- Advanced knowledge of Salesforce Data Cloud architecture and capabilities.
- Strong SQL skills for data transformation and query optimization.
- Experience with ETL processes and data integration patterns.
- Understanding of data modeling principles and best practices.
- Experience with Salesforce Marketing Cloud, MCI & MCP.
- Familiarity with APIs and data integration techniques.
- Knowledge of data privacy regulations and compliance requirements (GDPR, CCPA, etc.).
- Demonstrated experience with data analysis and business intelligence tools.
- Strong problem-solving abilities and analytical thinking.
- Excellent communication skills to translate technical concepts to business users.
- Ability to work collaboratively in cross-functional teams.
- Familiarity with and adaptability to new-generation technologies and trends (Gen AI and Agentic AI) is an added advantage.

Commitment to Diversity and Inclusion: Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities served.

Accessibility and Accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If you need a reasonable accommodation due to a medical condition or disability, please email diversityandincl.india@novartis.com with details.

Join our Novartis Network: If this role isn't right for you, consider signing up for our talent community to stay connected and learn about suitable career opportunities as soon as they arise.

Benefits and Rewards: Read our handbook to discover how Novartis supports personal and professional growth for its employees.
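The "SQL queries to transform raw data into actionable insights" and segment-building duties above can be sketched generically. The schema, thresholds, and table names below are invented for illustration; Salesforce Data Cloud has its own object model, so this only mirrors the shape of the logic, here demonstrated with Python's sqlite3:

```python
import sqlite3

# Generic illustration of building a customer segment with SQL.
# Columns and thresholds are assumptions for the example only.

SEGMENT_SQL = """
SELECT c.customer_id, c.email
FROM customers AS c
JOIN orders AS o ON o.customer_id = c.customer_id
GROUP BY c.customer_id, c.email
HAVING SUM(o.amount) >= 100               -- high-value threshold (assumed)
   AND MAX(o.order_date) >= '2024-01-01'  -- recent-activity cutoff (assumed)
"""

def build_segment(conn: sqlite3.Connection) -> list:
    """Return (customer_id, email) pairs matching the segment rules."""
    return conn.execute(SEGMENT_SQL).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (customer_id INTEGER, email TEXT)")
    conn.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL, order_date TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)",
                     [(1, "a@example.com"), (2, "b@example.com")])
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                     [(1, 150.0, "2024-03-01"), (2, 40.0, "2024-02-01")])
    print(build_segment(conn))  # [(1, 'a@example.com')]
```

Only customer 1 clears both the spend threshold and the recency cutoff, so the segment contains a single row.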

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Manager, Software Engineering - Data Engineering and Data Science

As a Manager of Software Engineering with a focus on Data Engineering and Data Science, you will be instrumental in shaping and expanding our Authentication Program. Your primary role will involve ensuring the integrity and excellence of our data and enabling our teams to operate efficiently. Leading a team of skilled engineers, you will drive innovation, oversee the successful execution of impactful projects, and contribute significantly to our organization's growth and success.

Your responsibilities will include:
- Leading the development and deployment of scalable data engineering and data science solutions to support the Authentication Program.
- Ensuring the accuracy, quality, and reliability of data across all systems and processes.
- Mentoring and providing technical guidance to a team of engineers and data scientists.
- Making strategic decisions and delivering innovative solutions in collaboration with cross-functional teams.
- Collaborating with product stakeholders to prioritize initiatives and align them with business objectives.
- Establishing and managing data pipelines to ensure efficient and accurate data processing.
- Implementing and advocating best practices in data engineering, data science, and software development.
- Automating and streamlining data processing workflows and development processes.
- Conducting Proof of Concepts (POCs) to assess and introduce new technologies.
- Participating in Agile ceremonies, contributing to team prioritization and planning.
- Developing and presenting roadmaps and proposals to Senior Management and stakeholders.
- Cultivating a culture of continuous improvement and excellence within the team.

Qualifications:

Technical Expertise:
- Proficiency in Data Engineering, Data Science, or related areas.
- Competence in programming languages like Python, Java, or Scala.
- Hands-on experience with data processing frameworks.
- Knowledge of data warehousing solutions.
- Proficiency in data modeling, ETL processes, and data pipeline orchestration.
- Familiarity with machine learning frameworks and libraries.
- Understanding of secure coding practices and data privacy regulations.

Leadership and Communication:
- Demonstrated leadership in managing technical teams.
- Strong problem-solving and decision-making abilities.
- Excellent written and verbal communication skills.
- Effective collaboration with cross-functional teams and stakeholders.
- Experience in Agile methodologies and project management.

Preferred Qualifications:
- Degree in Computer Science, Data Science, Engineering, or related fields.
- Experience with streaming data platforms.
- Familiarity with data visualization tools.
- Experience with CI/CD pipelines and DevOps practices.

If you require accommodations or assistance during the application process or recruitment in the US or Canada, please contact reasonable_accommodation@mastercard.com.

Join us in a culture that values innovation, collaboration, and excellence. Apply now to contribute to shaping the future of our Authentication Program and maintaining the highest quality of our data.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Nagpur, Maharashtra

On-site

As a MicroStrategy Developer at our company located in Nagpur/Pune, you will be responsible for collaborating with clients and stakeholders to gather and understand business requirements. Your role involves translating these requirements into technical specifications for MicroStrategy BI solutions. You will design and develop MicroStrategy reports, dashboards, and interactive visualizations, utilizing MicroStrategy features to create efficient and user-friendly BI solutions.

Data modeling will be a key aspect of your responsibilities, where you will define and implement data models supporting reporting and analytics needs while ensuring data accuracy, integrity, and optimal performance within MicroStrategy. Your expertise will be crucial in optimizing MicroStrategy reports and queries for improved performance, identifying and implementing best practices to enhance system efficiency.

Client collaboration is essential: you will demonstrate MicroStrategy capabilities, gather feedback, and provide training and support to end users for effective utilization of MicroStrategy solutions. Integration of MicroStrategy with various data sources and third-party applications, ensuring seamless data flow between systems, will also be part of your role.

Security implementation within the MicroStrategy environment, including designing and implementing security models, defining user roles, access controls, and data security measures, will be under your purview. Maintaining documentation for MicroStrategy solutions, configurations, and best practices for knowledge transfer and future reference is also vital. Staying updated on the latest MicroStrategy features and updates, and evaluating and recommending new technologies to enhance BI capabilities, are expected from you to excel in this role.
To qualify for this position, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a MicroStrategy Consultant specializing in MicroStrategy architecture and development. A strong understanding of BI concepts, data modeling, and data warehousing, proficiency in SQL for writing complex queries, and excellent problem-solving and analytical skills are required. Strong communication and interpersonal skills for client interactions are also essential.

Preferred skills include MicroStrategy certification, experience with other BI tools like Tableau, Power BI, or QlikView, knowledge of data visualization best practices, and familiarity with ETL processes and tools. Additionally, possessing one of the following certifications will be advantageous: MicroStrategy Certified Master Analyst (MCMA) Certification, MicroStrategy Certified Specialist Developer (MCSD) Certification, MicroStrategy Certified Master Developer (MCSD) Certification, or MicroStrategy Certified Developer (MCD) Certification.

If you are a dynamic individual with energy and passion for your work, we invite you to join our innovation-driven organization. We offer high-impact careers and growth opportunities across global locations, fostering a collaborative work environment for learning and development. NICE provides targeted learning and development programs, generous benefits, and perks to help you thrive, learn, and grow in your career.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and organizational standards, facilitating smooth data integration and accessibility across different systems. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that enhance decision-making processes within the organization.

Expected to be an SME, you will collaborate with and manage the team to perform effectively, making responsible decisions and contributing to key decisions across multiple teams. Providing solutions to problems for your immediate team and other teams, you will facilitate training sessions and workshops to enhance team capabilities. Continuously evaluating and improving data modeling processes to ensure efficiency will be a key part of your role.

In terms of professional and technical skills, proficiency in Data Building Tool is a must-have, along with a strong understanding of data modeling techniques and methodologies. Experience with data integration and ETL processes, familiarity with database management systems and SQL, and the ability to translate business requirements into technical specifications are essential for success in this position. The candidate should have a minimum of 7.5 years of experience in Data Building Tool.

The position is based in Mumbai and requires 15 years of full-time education.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Salesforce Data Cloud Analyst will be instrumental in leveraging Salesforce Data Cloud to revolutionize the utilization of customer data within the organization. This role is part of the Data Cloud Business Enablement Team and is dedicated to constructing, overseeing, and enhancing the data unification strategy to drive business intelligence, marketing automation, and customer experience initiatives.

You will be responsible for managing data models within Salesforce Data Cloud to ensure seamless data harmonization across diverse sources. Additionally, you will maintain data streams from different platforms into Data Cloud, such as CRM, SFMC, MCP, Snowflake, and third-party applications. Developing and refining SQL queries to convert raw data into actionable insights, as well as creating and managing data tables, calculated insights, and segments for organizational use, are integral aspects of this role. Collaboration with marketing teams to interpret business requirements into effective data solutions, monitoring data quality to uphold accuracy and reliability, and providing training and assistance to business users on leveraging Data Cloud capabilities are key responsibilities you will undertake. Furthermore, creating documentation for data models, processes, and best practices will be part of your routine tasks.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, or a related field, along with 3+ years of experience working with Salesforce platforms. Possessing Salesforce Data Cloud certification and prior exposure to Customer Data Platforms (CDPs) are preferred. Proficiency in Tableau CRM or other visualization tools, a background in marketing technology or customer experience initiatives, and Salesforce Administrator or Developer certification are essential requirements for this position.
Desired qualifications include advanced knowledge of Salesforce Data Cloud architecture, strong SQL skills, experience with ETL processes and data integration patterns, and an understanding of data modeling principles and best practices. Familiarity with Salesforce Marketing Cloud, MCI & MCP, APIs and data integration techniques, as well as knowledge of data privacy regulations and compliance requirements (GDPR, CCPA, etc.) are also advantageous. Demonstrated experience in data analysis and business intelligence tools, problem-solving abilities, and excellent communication skills will be beneficial for excelling in this role. Moreover, adaptability to new-generation technologies and trends, such as Gen AI and Agentic AI, is considered an added advantage. Novartis is dedicated to fostering an inclusive work environment and diverse teams that reflect the patients and communities served. The company is committed to collaborating with individuals with disabilities and providing reasonable accommodations during the recruitment process. If you require an accommodation due to a medical condition or disability, please contact diversityandincl.india@novartis.com with your request details and contact information. Remember to include the job requisition number in your message. Novartis offers a rewarding work environment where you can make a difference in the lives of people with diseases and their families. By joining Novartis, you will be part of a community of dedicated individuals working together to achieve breakthroughs that positively impact patients" lives. If you are ready to contribute to creating a brighter future, visit https://www.novartis.com/about/strategy/people-and-culture to learn more about opportunities at Novartis. If this Novartis role is not suitable for you, you can join the Novartis talent community to stay informed about relevant career opportunities as they arise by signing up at https://talentnetwork.novartis.com/network. 
Additionally, you can explore Novartis' handbook to discover the various ways the company supports personal and professional growth at https://www.novartis.com/careers/benefits-rewards.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

We are seeking a Data Migration Specialist with keen attention to detail and a customer-oriented approach. Your role will involve helping customers understand best practices for migrating their data, collaborating with them to map various entities, managing migration projects from start to finish, and executing multiple data migration deliverables. You will become an expert in our data migration tools, document processes, cleanse unstructured data, automate solutions, and migrate customer data across multiple products.

Strong communication skills, analytical thinking, and familiarity with scripting languages such as Python or Ruby are essential. You should have a basic understanding of data types and structures, experience managing data migration projects with large datasets, and experience debugging data elements and working with ETL processes. Experience with REST APIs, Web Services, and AWS is a plus. The ability to work independently in complex environments and communicate technical concepts effectively is crucial for this role.
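To give a concrete flavor of the entity-mapping work a role like this describes, here is a minimal, hypothetical Python sketch. All field names are invented for illustration; actual migrations run through the product's own tooling.

```python
# Hypothetical sketch of the entity-mapping step in a data migration:
# renaming source fields to the target schema and cleansing values.
# FIELD_MAP and all field names are invented for illustration.
FIELD_MAP = {"cust_nm": "customer_name", "ph": "phone", "crt_dt": "created_at"}

def migrate_record(src: dict) -> dict:
    """Map a source record onto the target schema, dropping unmapped fields."""
    out = {}
    for src_key, dst_key in FIELD_MAP.items():
        value = src.get(src_key)  # missing source fields become None
        if isinstance(value, str):
            value = value.strip()  # basic cleansing of messy input
        out[dst_key] = value
    return out

if __name__ == "__main__":
    print(migrate_record({"cust_nm": "  Ada Lovelace ", "ph": "555-0100", "legacy_id": 7}))
    # {'customer_name': 'Ada Lovelace', 'phone': '555-0100', 'created_at': None}
```

A real migration would add per-field validators and type conversions on top of this mapping, but the rename-cleanse-drop shape is the same.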

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

noida, uttar pradesh

On-site

The ideal candidate should have experience designing and implementing business process workflows using Collibra Workflow Designer, along with a good understanding of Collibra Data Governance Center (DGC) and its modules, including Data Catalog, Business Glossary, and Policy Manager. Your expertise should include:

- Metadata harvesting, lineage tracking, and governance to enhance data visibility.
- Proficiency with Collibra REST APIs for workflow automation, data exchange, and custom integrations with other tools.
- Familiarity with Collibra Data Quality & Observability, including setting up data quality rules and configuring DQ workflows.
- Knowledge of Groovy and Java for developing custom workflows and scripts within Collibra.
- Ability to write Python and SQL for data validation, integration scripts, and automation.
- Understanding of ETL processes and integrating Collibra with cloud/on-prem databases (a plus).
- Familiarity with data governance frameworks such as GDPR, CCPA, and HIPAA, and associated best practices.
- Experience effectively managing technical and business metadata, tracking data lineage, and assessing downstream/upstream data impacts.
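As a small illustration of the "Python and SQL for data validation" skill this kind of role calls for, here is a minimal rule-based check sketched against SQLite. Table and rule names are invented; Collibra's own DQ rules are configured in-platform rather than in ad-hoc scripts.

```python
# Minimal sketch of rule-based data validation with Python + SQL.
# Table, column, and rule names are hypothetical.
import sqlite3

def run_quality_checks(conn: sqlite3.Connection) -> dict:
    """Run simple data quality rules; return violation counts per rule."""
    rules = {
        # Rule name -> SQL returning the number of violating rows.
        "email_not_null": "SELECT COUNT(*) FROM customers WHERE email IS NULL",
        "id_unique": (
            "SELECT COUNT(*) FROM (SELECT id FROM customers "
            "GROUP BY id HAVING COUNT(*) > 1)"
        ),
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in rules.items()}

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
    conn.executemany(
        "INSERT INTO customers VALUES (?, ?)",
        [(1, "a@x.com"), (2, None), (2, "b@x.com")],
    )
    print(run_quality_checks(conn))  # {'email_not_null': 1, 'id_unique': 1}
```

In a governance setting, the violation counts would feed a monitoring dashboard or workflow trigger rather than a print statement.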

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

You are a strategic thinker passionate about driving solutions in financial analysis. You have found the right team. As a Data Domain Architect Lead - Vice President within the Finance Data Mart team, you will oversee the design, implementation, and maintenance of data marts that support the organization's business intelligence and analytics initiatives.

Responsibilities:
- Lead the design and development of data marts, ensuring alignment with business intelligence and reporting needs.
- Collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications.
- Develop and implement robust data models to support data marts, ensuring data integrity and consistency.
- Oversee the implementation of ETL (Extract, Transform, Load) processes to populate data marts with accurate and timely data.
- Optimize data mart performance and scalability, ensuring high availability and reliability.
- Monitor and troubleshoot data mart issues, providing timely resolutions and improvements.
- Document data mart structures, processes, and procedures, ensuring knowledge transfer and continuity.
- Mentor and guide a team of data mart developers as needed, fostering a collaborative and innovative work environment.
- Stay updated with industry trends and best practices in data warehousing, data modeling, and business intelligence.
Required qualifications, capabilities, and skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Extensive experience in data warehousing, data mart development, and ETL processes.
- Strong expertise in Data Lake, data modeling, and database management systems (e.g., Databricks, Snowflake, Oracle, SQL Server).
- Leadership experience, with the ability to manage and mentor a team.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to work effectively with cross-functional teams.

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data solutions (e.g., AWS, Azure, Google Cloud).
- Familiarity with advanced data modeling techniques and tools.
- Knowledge of data governance, data security, and compliance practices.
- Experience with business intelligence tools (e.g., Tableau, Power BI).

Candidates must be able to physically work in our Bengaluru office on the evening shift, 2 PM to 11 PM IST. The specific schedule will be determined and communicated by direct management.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

Thoucentric, the Consulting arm of Xoriant, a renowned digital engineering services company with 5,000 employees, is looking for a skilled Integration Consultant with 5 to 6 years of experience to join their team. As part of the Consulting business of Xoriant, you will be involved in Business Consulting, Program & Project Management, Digital Transformation, Product Management, and Process & Technology Solutioning and Execution across functional areas such as Supply Chain, Finance & HR, and Sales & Distribution in the US, UK, Singapore, and Australia.

Your role will involve designing, building, and maintaining data pipelines and ETL workflows using tools such as AWS Glue, CloudWatch, PySpark, APIs, SQL, and Python. You will create and optimize scalable data pipelines, develop ETL workflows, analyze and process data, monitor pipeline health, integrate APIs, and collaborate with cross-functional teams to provide effective solutions.

**Key Responsibilities**
- **Pipeline Creation and Maintenance:** Design, develop, and deploy scalable data pipelines, ensuring data accuracy and integrity.
- **ETL Development:** Create ETL workflows using AWS Glue and PySpark, adhering to data governance and security standards.
- **Data Analysis and Processing:** Write efficient SQL queries and develop Python scripts to automate data tasks.
- **Monitoring and Troubleshooting:** Use AWS CloudWatch to monitor pipeline performance and resolve issues promptly.
- **API Integration:** Integrate and manage APIs to connect external data sources and services.
- **Collaboration:** Work closely with cross-functional teams to understand data requirements and communicate effectively with stakeholders.

**Required Skills and Qualifications**
- **Experience:** 5-6 years.
- **o9 Solutions platform experience is mandatory.**
- Strong expertise in AWS Glue, CloudWatch, PySpark, Python, and SQL.
- Hands-on experience in API integration, ETL processes, and pipeline creation.
- Strong analytical and problem-solving skills.
- Familiarity with data security and governance best practices.

**Preferred Skills**
- Knowledge of other AWS services such as S3, EC2, Lambda, or Redshift.
- Experience with PySpark, APIs, SQL optimization, and Python.
- Exposure to data visualization tools or frameworks.

**Education:**
- Bachelor's degree in Computer Science, Information Technology, or a related field.

In this role at Thoucentric, you will have the opportunity to define your career path, work in a dynamic consulting environment, collaborate with Fortune 500 companies and startups, and be part of a supportive working environment that encourages personal development. Join us in the exciting growth story of Thoucentric in Bangalore, India. (Posting Date: 05/22/2025)
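The pipeline responsibilities above follow the classic extract-transform-load shape. Real AWS Glue jobs are written in PySpark against the Glue Data Catalog and S3; the library-free Python sketch below only illustrates that shape, with all data and field names invented.

```python
# Library-free sketch of the extract -> transform -> load shape an
# AWS Glue / PySpark job follows. All rows and field names are
# hypothetical; a real job would read/write catalog tables and S3.
from typing import Iterable

def extract() -> list[dict]:
    """Stand-in for reading source rows (e.g. from a catalog table)."""
    return [
        {"order_id": 1, "amount": "19.99", "region": "us"},
        {"order_id": 2, "amount": "5.00", "region": "eu"},
        {"order_id": 3, "amount": None, "region": "us"},  # incomplete row
    ]

def transform(rows: Iterable[dict]) -> list[dict]:
    """Drop rows failing validation and normalize types."""
    out = []
    for r in rows:
        if r["amount"] is None:
            continue  # data-integrity check: skip incomplete rows
        out.append({**r, "amount": float(r["amount"]), "region": r["region"].upper()})
    return out

def load(rows: list[dict]) -> dict:
    """Stand-in for writing to a target: aggregate revenue per region."""
    totals: dict[str, float] = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

if __name__ == "__main__":
    print(load(transform(extract())))  # {'US': 19.99, 'EU': 5.0}
```

In PySpark the transform step would be expressed as DataFrame filters and column expressions rather than a Python loop, but the staged structure carries over directly.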

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be responsible for designing and implementing AI and GenAI solutions on Google Cloud Platform (GCP). Your role will involve creating architecture diagrams, developing machine learning models, and optimizing existing solutions for performance and scalability. You will also manage large datasets, stay updated on AI trends and GCP services, and provide technical guidance to engineering teams. Additionally, you will engage with clients to understand their requirements and propose suitable solutions.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with proven experience as a Solution Architect in AI/GenAI. Proficiency in Python, strong knowledge of GCP services, familiarity with Generative AI models, and experience with data technologies and machine learning frameworks are essential, as are excellent problem-solving and communication skills. Preferred skills include certification as a Google Cloud Professional Data Engineer or Professional Cloud Architect, familiarity with DevOps practices and tools, and experience with large-scale data processing.

Posted 1 month ago

Apply

8.0 - 15.0 years

0 Lacs

pune, maharashtra

On-site

You are an experienced Business Analyst specializing in Data Warehousing, and you have the opportunity to join our team. Your role will involve gathering and analyzing business requirements, converting them into technical specifications, and working closely with various teams to implement data-driven solutions. Ideally, you should have a background in the pharmaceutical or life sciences sector.

Your responsibilities will draw on your expertise in Data Warehousing concepts, ETL processes, and reporting tools. You should have a track record of effectively gathering requirements, creating documentation, and managing stakeholders. Your ability to analyze intricate datasets and derive actionable insights will be crucial in this role. While not mandatory, experience in the pharmaceutical or healthcare domain would be considered a valuable asset for this position.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

vellore, tamil nadu

On-site

You are an experienced PowerBuilder Developer with strong Sybase expertise, tasked with leading the migration of legacy PowerBuilder applications and databases to modern platforms. You will analyze, refactor, and migrate existing systems to ensure smooth transitions with minimal downtime, improved performance, and scalability.

Your responsibilities include:
- Analyzing existing PowerBuilder applications and Sybase databases to plan migration strategies.
- Refactoring and optimizing legacy PowerBuilder code to support migration goals.
- Migrating databases from Sybase ASE or SQL Anywhere to new platforms or upgraded Sybase environments.
- Developing and maintaining migration tools, scripts, and utilities to automate data and application migration.
- Collaborating with cross-functional teams to understand business requirements and ensure minimal disruption during migration.
- Testing migrated applications for functionality, performance, and security.
- Troubleshooting and resolving migration-related issues, including data integrity and compatibility.
- Documenting migration processes, risks, and solutions.
- Providing post-migration support and optimization to ensure stable system operation.

To excel in this role, you should have a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, along with a minimum of 4 years of experience in PowerBuilder development and Sybase database management, including migration projects. Expertise in SQL, stored procedures, and database optimization in Sybase ASE or SQL Anywhere is required, as is proficiency in analyzing legacy code, refactoring, and modernizing applications. Familiarity with database migration tools and ETL processes is beneficial, and a good understanding of data integrity, security, and compliance during migrations is essential. Strong problem-solving and communication skills are also key to success in this role.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

coimbatore, tamil nadu

On-site

As a Data Architect, you will play a crucial role in designing, implementing, and maintaining scalable data architecture solutions aligned with business objectives. You will collaborate with cross-functional teams to translate business requirements into data models and architecture, ensuring data quality, integrity, security, and compliance across all systems. Your expertise in data modeling, ETL processes, and big data solutions will be essential in integrating disparate data sources for seamless data flow and access. Your responsibilities will include developing and maintaining data pipelines, creating complex data visualizations, and recommending tools to enhance the overall data strategy.

You must possess a Bachelor's degree in Computer Science or a related field, relevant IT certifications in data management or software architecture, and a minimum of 5 years of hands-on experience in software design and data architecture projects, with a background in roles such as Database Administrator, Data Analyst, or Data Engineer. Proficiency in data modeling techniques, ETL tools, programming languages like Python or Java, and business intelligence tools such as Tableau and Power BI is required, as is familiarity with big data technologies, Agile methodologies, and data security principles. Strong analytical and problem-solving abilities, along with excellent communication and stakeholder management skills, will be key to your success in this role.

You should be ready to work onsite in Bengaluru for an initial 3-month engagement, with the possibility of a 6-month contract extension, working day shifts Monday to Friday, including morning shifts. If you possess the required qualifications, experience, and skills mentioned above and are eager to take on this challenging role, we look forward to receiving your application.

Posted 1 month ago

Apply

1.0 - 5.0 years

0 Lacs

chennai, tamil nadu

On-site

You should have at least 3 years of hands-on experience in data modeling, ETL processes, developing reporting systems, and data engineering using tools such as BigQuery, SQL, Python, or Alteryx, along with advanced knowledge of SQL programming and database management. You must also have a minimum of 3 years of solid experience with Business Intelligence reporting tools such as Power BI, Qlik Sense, Looker, or Tableau, and a good understanding of data warehousing concepts and best practices. Excellent problem-solving and analytical skills are essential for this role, as is being detail-oriented with strong communication and collaboration skills. The ability to work both independently and as part of a team is crucial for success in this position.

Preferred skills include experience with GCP cloud services such as BigQuery, Cloud Composer, Dataflow, CloudSQL, Looker, Looker ML, Data Studio, and QlikSense, along with strong SQL skills and proficiency in various BI/reporting tools to build self-serve reports, analytic dashboards, and ad-hoc packages leveraging enterprise data warehouses. At least 1 year of experience with Python and Hive/Spark/Scala/JavaScript is also preferred. You should have a solid understanding of consuming data models, developing SQL, addressing data quality issues, proposing BI solution architecture, and articulating best practices in end-user visualizations, coupled with development delivery experience. A good grasp of BI tools, architectures, and visualization solutions, an inquisitive and proactive approach to learning new tools and techniques, and strong oral, written, and interpersonal communication skills are necessary, and you should be comfortable working in a dynamic environment where problems are not always well-defined.
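As an illustration of the self-serve reporting SQL this kind of role describes, here is a minimal aggregate query sketched against SQLite with invented table and column names; in practice the same query shape would run on an enterprise warehouse such as BigQuery behind a Looker or Power BI dashboard.

```python
# Minimal sketch of the SQL behind a self-serve report: an aggregate
# over a fact table. Table, columns, and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, product TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [
        ("2024-01-01", "widget", 100.0),
        ("2024-01-01", "gadget", 40.0),
        ("2024-01-02", "widget", 60.0),
    ],
)

# Ad-hoc report: total revenue per product, highest first.
report = conn.execute(
    "SELECT product, SUM(revenue) AS total "
    "FROM sales GROUP BY product ORDER BY total DESC"
).fetchall()
print(report)  # [('widget', 160.0), ('gadget', 40.0)]
```

A BI tool generates essentially this GROUP BY when a user drags "product" and "revenue" onto a chart; writing it by hand is what "developing SQL" for ad-hoc packages amounts to.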

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

hyderabad, telangana

On-site

As a Junior Data Analyst at our company located in Hyderabad, you will play a crucial role in our Data & Analytics team. We are looking for a motivated and intellectually curious individual with a solid foundation in data analysis, business intelligence, and critical thinking. Your responsibilities will include interpreting data, generating insights, and supporting strategic initiatives across various business units. You will collaborate with internal stakeholders to understand business requirements, work with large datasets, build dashboards, and deliver actionable insights that drive informed business decisions. Your key responsibilities will span data extraction and transformation, reporting and dashboarding, data analysis, stakeholder collaboration, data quality and governance, and communication and documentation.

To excel in this role, you must possess a Bachelor's degree in Computer Science, Mathematics, Statistics, Economics, Engineering, or a related field, along with 2-3 years of hands-on experience in data analytics or business intelligence roles. Strong analytical thinking, proficiency in SQL, experience with ETL processes, and knowledge of Excel and data visualization tools like Tableau or Power BI are essential, as are excellent communication skills, attention to detail, and the ability to manage multiple priorities and deadlines.

Preferred or bonus skills include exposure to scripting languages like Python or R, experience with cloud platforms and tools (e.g., AWS, Snowflake, Google BigQuery), prior experience in the financial services or fintech domain, and an understanding of data modeling and warehousing concepts. In return, we offer a collaborative, inclusive, and intellectually stimulating work environment, opportunities for learning and growth through hands-on projects and mentorship, and the chance to work with data that drives real business impact.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

You will be joining Birlasoft as a Genio OpenText ETL Developer. Birlasoft is a global leader in Cloud, AI, and Digital technologies, known for its domain expertise and enterprise solutions. As part of the CKA Birla Group, with over 12,000 professionals, Birlasoft is committed to building sustainable communities and empowering societies worldwide.

As a Genio OpenText ETL Developer, you will play a crucial role in designing, developing, and maintaining ETL workflows to support data integration and migration projects. Your responsibilities will include designing, developing, and maintaining ETL processes using OpenText Genio, and collaborating with business analysts and data architects to understand data requirements and translate them into technical specifications. You will implement data extraction, transformation, and loading processes to integrate data from various sources, optimize ETL workflows for performance and scalability, perform data quality checks, ensure data integrity throughout the ETL process, and troubleshoot and resolve ETL-related issues. Additionally, you will document ETL processes, maintain technical documentation, and provide support and guidance to junior ETL developers.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as an ETL Developer focused on OpenText Genio. Strong knowledge of ETL concepts, data integration, and data warehousing, proficiency in SQL, experience with database management systems, familiarity with data modeling and data mapping techniques, excellent problem-solving skills, attention to detail, and strong communication and teamwork abilities are essential. Preferred qualifications include experience with other ETL tools and technologies and knowledge of Agile development methodologies.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Lead Data Engineer specializing in Data Governance with expertise in Azure Purview, you will design and implement data governance policies, standards, and procedures to guarantee data quality, consistency, and security. You will identify, analyze, and resolve data quality issues across various data sources and platforms, and collaborate with cross-functional teams to enforce data governance structures and ensure adherence to policies and standards.

Your role will also involve implementing and maintaining monitoring systems to track data quality, compliance, and security. Proficiency in data modeling, data warehousing, ETL processes, and data quality tools is essential, and familiarity with data governance tools like Azure Purview will help you execute your duties effectively. Ensuring that data is safeguarded and complies with privacy regulations, through appropriate access controls and security measures, will be a top priority. You will also facilitate data stewardship activities and guide data stewards on best practices in data governance.

Leveraging Azure OneLake and Azure Synapse Analytics, you will design and implement scalable data storage and analytics solutions that support big data processing and analysis. Your expertise in these areas will be instrumental in meeting the organization's data processing and analysis requirements.

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

jalna, maharashtra

On-site

As a SQL Developer at Staff4Me, you will play a pivotal role in the development, maintenance, and optimization of SQL databases and queries. Your primary responsibility will be to ensure efficient and reliable data storage and retrieval by collaborating with our cross-functional team.

Your key responsibilities will include gathering and analyzing database requirements in collaboration with cross-functional teams; designing, developing, and maintaining SQL databases; and writing and optimizing SQL queries for data retrieval and manipulation. You will also debug and resolve database performance and integrity issues and create scripts for database backup, restore, and migration. It will be essential for you to implement and enforce database security best practices, conduct database performance tuning and optimization, and maintain database documentation and data dictionaries. Staying updated with the latest database technologies, trends, and best practices will also be part of your role.

To be successful in this role, you should have at least 2 years of experience as a SQL Developer or in a similar role, along with a Bachelor's degree in Computer Science, Information Technology, or a related field. Strong knowledge of SQL and relational database concepts is crucial, as is experience with database management systems like MySQL, PostgreSQL, or Oracle. Proficiency in writing complex SQL queries and stored procedures, an understanding of database performance tuning and optimization techniques, familiarity with database security and backup protocols, and knowledge of data integration and ETL processes are also required. Excellent problem-solving and analytical skills, along with strong attention to detail and organizational skills, will be beneficial in this position.

Posted 1 month ago

Apply

2.0 - 6.0 years

0 - 0 Lacs

karnataka

On-site

As a Power BI Developer, you will be an integral part of our dynamic team, designing, developing, and implementing advanced Power BI solutions that facilitate data-driven decision-making. You will work closely with business stakeholders to understand their requirements and translate them into visually appealing, high-performance Power BI reports.

Your responsibilities will span several key areas. In data modeling and analysis, you will create robust data models, use advanced DAX for complex calculations, and transform and clean data using Power Query. You will develop interactive Power BI reports with diverse visualizations, optimize report performance, and enforce data access control through Row-Level Security (RLS). You will also oversee Power BI Service administration, managing capacities, licenses, and deployment strategies, while integrating Power BI with other Microsoft tools for enhanced automation and data processing. Your expertise in cloud platforms such as MS Fabric, Data Factory, and Data Lake will be crucial in optimizing data pipelines and scalability.

In addition to your technical responsibilities, you will collaborate with stakeholders to deliver actionable insights, mentor junior team members on best practices, and provide technical leadership by ensuring adherence to standards and deploying reports to production environments.

To qualify for this role, you should have 2 to 6 years of hands-on experience with Power BI and related technologies, demonstrating proficiency in data modeling, DAX, Power Query, visualization techniques, and SQL. Experience with ETL processes and cloud platforms, plus strong problem-solving abilities, are essential, as are excellent communication skills and the ability to work both independently and collaboratively.

Preferred qualifications include experience with R or Python for custom visual development and certification in Power BI or related technologies.

Please note that this position requires working at our South Bangalore office at least 4 out of 5 days, with no remote work option available. Local candidates are preferred, and relocation assistance will not be provided. This is a full-time position based in Bangalore (South), offering a competitive salary range of 500,000-1,200,000 INR per year. If you meet the qualifications and are eager to contribute to our team, we encourage you to apply before the deadline of April 15, 2025.

Posted 1 month ago

Apply