5.0 - 10.0 years
8 - 13 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About Us:
KPI Partners is a leading provider of data analytics and business intelligence solutions. We are committed to helping organizations excel through effective data management. Our innovative team focuses on delivering impactful business insights, and we are looking for talented individuals to join us on this journey.

Job Summary:
We are seeking an experienced ETL Developer with expertise in Oracle Data Integrator (ODI) to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes that extract data from multiple sources and transform it into a format suitable for analysis. You will work closely with business analysts, data architects, and other stakeholders to ensure the successful implementation of data integration solutions.

Key Responsibilities:
- Design and implement ETL processes using Oracle Data Integrator (ODI) to support data warehousing and business intelligence initiatives.
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
- Develop, test, and optimize ETL workflows, mappings, and packages to ensure efficient data loading and processing.
- Perform data quality checks and validations to ensure the accuracy and reliability of transformed data.
- Monitor and troubleshoot ETL processes to resolve issues and ensure timely delivery of data.
- Document ETL processes, technical specifications, and relevant workflows.
- Stay up to date with industry best practices and technology trends related to ETL and data integration.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ETL Developer with a focus on Oracle Data Integrator (ODI).
- Strong understanding of ETL concepts, data warehousing, and data modeling.
- Proficiency in SQL and experience with database systems such as Oracle or SQL Server.
- Familiarity with data integration tools and techniques, including data profiling, cleansing, and transformation.
- Experience in performance tuning and optimization of ETL processes.
- Excellent analytical and problem-solving skills.
- Strong communication and teamwork abilities, with a commitment to delivering high-quality results.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and career advancement.
- A collaborative and innovative work environment.
- The chance to work on exciting projects with leading organizations across various industries.

KPI Partners is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
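Although ODI itself is a GUI-driven tool, the extract-transform-load pattern this role centres on can be sketched in a few lines of plain Python. The tables, columns, and cleansing rules below are invented for illustration only, not taken from any real KPI Partners system:

```python
import sqlite3

# Minimal extract-transform-load sketch against an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER, amount TEXT, region TEXT)")
conn.executemany(
    "INSERT INTO src_orders VALUES (?, ?, ?)",
    [(1, " 120.50", "south"), (2, "80", "NORTH"), (3, None, "south")],
)

def extract(conn):
    return conn.execute("SELECT id, amount, region FROM src_orders").fetchall()

def transform(rows):
    # Cleanse: trim whitespace, cast amounts to float, standardize region case,
    # and drop rows that fail a basic data-quality check (missing amount).
    out = []
    for id_, amount, region in rows:
        if amount is None:
            continue
        out.append((id_, float(amount.strip()), region.strip().upper()))
    return out

def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS dw_orders (id INTEGER, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO dw_orders VALUES (?, ?, ?)", rows)
    conn.commit()

load(conn, transform(extract(conn)))
loaded = conn.execute("SELECT COUNT(*), SUM(amount) FROM dw_orders").fetchone()
print(loaded)  # (2, 200.5)
```

The data-quality check in `transform` mirrors the "accuracy and reliability of transformed data" responsibility above: bad records are filtered out before the load step rather than after.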
Posted 2 months ago
2.0 - 5.0 years
3 - 6 Lacs
Hyderabad
Work from Office
What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for data extraction, transformation, and loading (ETL) processes, ensuring that data flows smoothly between various systems and databases. The role requires you to perform data transformation tasks to ensure data accuracy and integrity, to work closely with product owners, designers, and other engineers to create high-quality, scalable software solutions, and to automate operations, monitor system health, and respond to incidents to minimize downtime.

- Design, develop, and implement Extract, Transform, Load (ETL) processes to move and transform data from various sources to cloud systems, data warehouses, or data lakes.
- Integrate data from multiple sources (e.g., databases, flat files, cloud services, APIs) into target systems.
- Develop complex transformations to cleanse, enrich, filter, and aggregate data during the ETL process to meet business requirements.
- Tune and optimize ETL jobs for better performance and efficient resource usage, minimizing execution time and errors.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team, and other stakeholders.

What we expect of you
Master's degree and 1 to 3 years of Computer Science, IT, or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience.

Basic Qualifications:
- Strong expertise in ETL development, data integration, and managing complex ETL workflows, performance tuning, and debugging.
- Strong proficiency in SQL for querying databases, writing scripts, and troubleshooting ETL processes.
- Understanding of data modeling concepts, various schemas, and normalization.
- Strong understanding of software development methodologies, including Agile and Scrum.
- Experience working in a DevOps environment, designing, developing, and maintaining software applications and solutions that meet business needs.

Preferred Qualifications:
- Extensive experience in Informatica PowerCenter or Informatica Cloud for data integration and ETL development.

Professional Certifications:
- SAFe® for Teams certification (preferred)

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals

Shift Information:
This position requires you to work a later shift and will be assigned to the second shift. Candidates must be willing and able to work evening shifts as required by business needs.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
Posted 2 months ago
2.0 - 5.0 years
3 - 6 Lacs
Hyderabad
Work from Office
What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for data extraction, transformation, and loading (ETL) processes, ensuring that data flows smoothly between various systems and databases. The role requires you to perform data transformation tasks to ensure data accuracy and integrity, to work closely with product owners, designers, and other engineers to create high-quality, scalable software solutions, and to automate operations, monitor system health, and respond to incidents to minimize downtime.

- Design, develop, and implement Extract, Transform, Load (ETL) processes to move and transform data from various sources to cloud systems, data warehouses, or data lakes.
- Integrate data from multiple sources (e.g., databases, flat files, cloud services, APIs) into target systems.
- Develop complex transformations to cleanse, enrich, filter, and aggregate data during the ETL process to meet business requirements.
- Tune and optimize ETL jobs for better performance and efficient resource usage, minimizing execution time and errors.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team, and other stakeholders.

What we expect of you
Bachelor's degree and 0 to 3 years of Computer Science, IT, or related field experience OR Diploma and 4 to 7 years of Computer Science, IT, or related field experience.

Basic Qualifications:
- Expertise in ETL development, data integration, and managing complex ETL workflows, performance tuning, and debugging.
- Proficiency in SQL for querying databases, writing scripts, and troubleshooting ETL processes.
- Understanding of data modeling concepts, various schemas, and normalization.
- Strong understanding of software development methodologies, including Agile and Scrum.
- Experience working in a DevOps environment, designing, developing, and maintaining software applications and solutions that meet business needs.

Preferred Qualifications:
- Expertise in Informatica PowerCenter or Informatica Cloud for data integration and ETL development.

Professional Certifications:
- SAFe® for Teams certification (preferred)

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals

Shift Information:
This position requires you to work a later shift and will be assigned to the second shift. Candidates must be willing and able to work evening shifts as required by business needs.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
Posted 2 months ago
3.0 - 7.0 years
4 - 7 Lacs
Hyderabad
Work from Office
What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. The role involves working with large datasets, developing reports, supporting and driving data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Be a key team member assisting in the design and development of the data pipeline.
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks.
- Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions.
- Identify and resolve complex data-related challenges.
- Adhere to standard methodologies for coding, testing, and designing reusable code and components.
- Explore new tools and technologies that will help improve ETL platform performance.
- Participate in sprint planning meetings and provide estimates for technical implementation.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Doctorate degree OR Master's degree and 4 to 6 years of Computer Science, IT, or related field experience OR Bachelor's degree and 6 to 8 years of Computer Science, IT, or related field experience OR Diploma and 10 to 12 years of Computer Science, IT, or related field experience.

Preferred Qualifications:
Must-Have Skills:
- Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience using Databricks to build ETL pipelines and handle big data processing.
- Experience with data warehousing platforms such as Amazon Redshift or Snowflake.
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
- Familiarity with big data frameworks such as Apache Hadoop, Spark, and Kafka for handling large datasets.
- Experience with software engineering best practices, including version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps.
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA).

Good-to-Have Skills:
- Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena).
- Strong understanding of data modeling, data warehousing, and data integration concepts.
- Understanding of machine learning pipelines and frameworks for ML/AI models.

Professional Certifications:
- AWS Certified Data Engineer (preferred)
- Databricks Certified (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

Equal opportunity statement
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
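The "cleanse, enrich, filter, and aggregate" transformation work described above would in practice run as a PySpark or Databricks job; as a rough illustration of the same pattern, here is a plain-Python stand-in that streams records, drops ones failing a quality check, and aggregates. The CSV columns and values are invented for the example:

```python
import csv
import io
from collections import defaultdict

# Cleanse-filter-aggregate sketch: stream rows, validate, then roll up.
raw = io.StringIO(
    "region,amount\n"
    "east,100\n"
    "west,not_a_number\n"   # bad record: dropped by the quality check
    "east,50\n"
    "west,70\n"
)

def records(fh):
    # Stream rows one at a time so arbitrarily large inputs fit in memory,
    # mirroring how a distributed engine processes partitions lazily.
    yield from csv.DictReader(fh)

def clean(rows):
    for row in rows:
        try:
            yield row["region"], float(row["amount"])
        except ValueError:
            continue  # filter out rows that fail the numeric check

def aggregate(pairs):
    totals = defaultdict(float)
    for region, amount in pairs:
        totals[region] += amount
    return dict(totals)

totals = aggregate(clean(records(raw)))
print(totals)  # {'east': 150.0, 'west': 70.0}
```

In Spark the same three stages would be a read, a filter/withColumn, and a groupBy aggregation; the generator chaining here is only meant to make the data flow explicit.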
Posted 2 months ago
2.0 - 5.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Overview
The primary focus of this role is to lead development work within the Azure Data Lake environment and other related ETL technologies, with responsibility for on-time, on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards. The role will lead key data lake projects and resources, including innovation-related initiatives (e.g., adoption of technologies such as Databricks, Presto, Denodo, Python, and Azure Data Factory; database encryption; enabling rapid experimentation). This role also carries L3 support and release management responsibilities for ETL processes.

Responsibilities
- Lead delivery of key Enterprise Data Warehouse and Azure Data Lake projects within time and budget.
- Drive solution design and build to ensure scalability, performance, and reuse of data and other components.
- Ensure on-time, on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards.
- Manage work intake, prioritization, and release timing, balancing demand and available resources; ensure tactical initiatives are aligned with the strategic vision and business needs.
- Oversee coordination and partnerships with Business Relationship Managers, Architecture, and IT services teams to develop and maintain EDW and data lake best practices and standards, along with appropriate quality assurance policies and procedures.
- May lead a team of employee and contract resources to meet build requirements; set priorities for the team to ensure task completion; coordinate work activities with other IT services and business teams.
- Hold the team accountable for milestone deliverables.
- Provide L3 support for existing applications.
- Release management.

Qualifications
Experience
- Bachelor's degree in Computer Science, MIS, Business Management, or a related field
- 9+ years of experience in Information Technology or Business Relationship Management
- 5+ years of experience in Data Warehouse/Azure Data Lake
- 3 years of experience in Azure Data Lake
- 2 years of experience in project management

Technical Skills
- Thorough knowledge of data warehousing and data lake concepts
- Hands-on experience with tools such as Azure Data Factory, Databricks, PySpark, and other data management tools on Azure
- Proven experience in managing Data, BI, or Analytics projects
- Solutions delivery experience: expertise in the system development lifecycle, integration, and sustainability
- Experience in data modeling or database work

Non-Technical Skills
- Excellent remote collaboration skills
- Experience working in a matrix organization with diverse priorities
- Experience dealing with and managing multiple vendors
- Exceptional written and verbal communication, collaboration, and listening skills
- Ability to work with agile delivery methodologies
- Ability to ideate requirements and design iteratively with business partners without formal requirements documentation
- Ability to budget resources and funding to meet project deliverables
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Job Summary:
We are seeking an experienced Data Engineer with expertise in Snowflake and PL/SQL to design, develop, and optimize scalable data solutions. The ideal candidate will be responsible for building robust data pipelines, managing integrations, and ensuring efficient data processing within the Snowflake environment. This role requires a strong background in SQL, data modeling, and ETL processes, along with the ability to troubleshoot performance issues and collaborate with cross-functional teams.

Responsibilities:
- Design, develop, and maintain data pipelines in Snowflake to support business analytics and reporting.
- Write optimized PL/SQL queries, stored procedures, and scripts for efficient data processing and transformation.
- Integrate and manage data from various structured and unstructured sources into the Snowflake data platform.
- Optimize Snowflake performance by tuning queries, managing workloads, and implementing best practices.
- Collaborate with data architects, analysts, and business teams to develop scalable, high-performing data solutions.
- Ensure data security, integrity, and governance while handling large-scale datasets.
- Automate and streamline ETL/ELT workflows for improved efficiency and data consistency.
- Monitor, troubleshoot, and resolve data quality issues, performance bottlenecks, and system failures.
- Stay updated on Snowflake advancements, best practices, and industry trends to enhance data engineering capabilities.

Required Skills:
- Bachelor's degree in Engineering, Computer Science, Information Technology, or a related field.
- Strong experience in Snowflake, including designing, implementing, and optimizing Snowflake-based solutions.
- Hands-on expertise in PL/SQL, including writing and optimizing complex queries, stored procedures, and functions.
- Proven ability to work with large datasets, data warehousing concepts, and cloud-based data management.
- Proficiency in SQL, data modeling, and database performance tuning.
- Experience with ETL/ELT processes and integrating data from multiple sources.
- Familiarity with cloud platforms such as AWS, Azure, or GCP is an added advantage.
- Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced) are a plus.
- Strong analytical skills, problem-solving abilities, and attention to detail.
- Excellent communication skills and ability to work effectively in a collaborative environment.
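One transformation that comes up constantly in the ELT work this posting describes is deduplicating a raw landing table down to the latest record per key. The ROW_NUMBER() window-function pattern below is portable SQL (run here against SQLite, which supports window functions from version 3.25; the same query shape works in Snowflake); the table and column names are invented for the example:

```python
import sqlite3

# Keep only the most recently loaded row per customer id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_customers (id INTEGER, email TEXT, loaded_at TEXT)")
conn.executemany(
    "INSERT INTO raw_customers VALUES (?, ?, ?)",
    [
        (1, "a@old.com", "2024-01-01"),
        (1, "a@new.com", "2024-03-01"),  # later load wins
        (2, "b@x.com", "2024-02-01"),
    ],
)
latest = conn.execute("""
    SELECT id, email FROM (
        SELECT id, email,
               ROW_NUMBER() OVER (PARTITION BY id ORDER BY loaded_at DESC) AS rn
        FROM raw_customers
    ) WHERE rn = 1
    ORDER BY id
""").fetchall()
print(latest)  # [(1, 'a@new.com'), (2, 'b@x.com')]
```

In Snowflake the inner subquery can often be replaced with a `QUALIFY rn = 1` clause, which expresses the same filter more compactly.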
Posted 2 months ago
3.0 - 7.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Overview:
We are seeking an experienced and highly skilled Senior Data Engineer to join our team. This role requires a combination of software development and data engineering expertise. The ideal candidate will have advanced knowledge of Python and SQL, a solid understanding of API creation (specifically REST APIs and FastAPI), and experience in building reusable and configurable frameworks.

Key Responsibilities:
- Develop APIs & Microservices: Design, build, and maintain scalable, high-performance REST APIs using FastAPI and other frameworks.
- Data Engineering: Work on data pipelines, ETL processes, and data processing for robust data solutions.
- System Architecture: Collaborate on the design and implementation of configurable and reusable frameworks to streamline processes.
- Collaborate with Cross-Functional Teams: Work closely with software engineers, data scientists, and DevOps teams to build end-to-end solutions that cater to both application and data needs.
- Slack App Development: Design and implement Slack integrations and custom apps as required for team productivity and automation.
- Code Quality: Ensure high-quality coding standards through rigorous testing, code reviews, and writing maintainable code.
- SQL Expertise: Write efficient and optimized SQL queries for data storage, retrieval, and analysis.
- Microservices Architecture: Build and manage microservices that are modular, scalable, and decoupled.

Required Skills & Experience:
- Programming Languages: Expert in Python, with solid experience building APIs and microservices.
- Web Frameworks & APIs: Strong hands-on experience with FastAPI (and optionally Flask), designing RESTful APIs.
- Data Engineering Expertise: Strong knowledge of SQL, relational databases, and ETL processes. Experience with cloud-based data solutions is a plus.
- API & Microservices Architecture: Proven ability to design, develop, and deploy APIs and microservices architectures.
- Slack App Development: Experience integrating Slack apps or creating custom Slack workflows.
- Reusable Framework Development: Ability to design modular, configurable frameworks that can be reused across teams and systems.
- Excellent Problem-Solving Skills: Ability to break down complex problems and deliver practical solutions.
- Software Development Experience: Strong software engineering fundamentals, including version control, debugging, and deployment best practices.

Why Join Us
- Growth Opportunities: You'll work with cutting-edge technologies and continuously improve your technical skills.
- Collaborative Culture: A dynamic and inclusive team where your ideas and contributions are valued.
- Competitive Compensation: We offer a competitive salary, comprehensive benefits, and a flexible work environment.
- Innovative Projects: Be part of projects that have real-world impact and help shape the future of data and software development.

If you're passionate about working on both data and software engineering, and enjoy building scalable and efficient systems, apply today and help us innovate!
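The "reusable and configurable framework" skill this posting asks for usually means something like the pattern below: processing steps register themselves by name, and a plain config structure (which could be loaded from YAML or JSON) wires them into a pipeline. Every name here is invented purely for illustration:

```python
from typing import Any, Callable

# Registry of named pipeline steps; new steps plug in via the decorator.
STEPS: dict[str, Callable[..., Any]] = {}

def step(name: str):
    def register(fn):
        STEPS[name] = fn
        return fn
    return register

@step("uppercase")
def uppercase(value: str) -> str:
    return value.upper()

@step("strip_prefix")
def strip_prefix(value: str, prefix: str = "") -> str:
    return value.removeprefix(prefix)  # Python 3.9+

def run_pipeline(value: Any, config: list[dict]) -> Any:
    # Each config entry names a registered step plus its keyword arguments,
    # so behavior is changed by editing data, not code.
    for entry in config:
        fn = STEPS[entry["step"]]
        value = fn(value, **entry.get("args", {}))
    return value

config = [
    {"step": "strip_prefix", "args": {"prefix": "raw_"}},
    {"step": "uppercase"},
]
result = run_pipeline("raw_orders", config)
print(result)  # ORDERS
```

The same registry-plus-config shape scales up to real ETL frameworks: teams contribute steps, and each project supplies only a config file.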
Posted 2 months ago
3.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Overview:
We are seeking an experienced and highly skilled Senior Data Engineer to join our team. This role requires a combination of software development and data engineering expertise. The ideal candidate will have advanced knowledge of Python and SQL, a solid understanding of API creation (specifically REST APIs and FastAPI), and experience in building reusable and configurable frameworks.

Key Responsibilities:
- Develop APIs & Microservices: Design, build, and maintain scalable, high-performance REST APIs using FastAPI and other frameworks.
- Data Engineering: Work on data pipelines, ETL processes, and data processing for robust data solutions.
- System Architecture: Collaborate on the design and implementation of configurable and reusable frameworks to streamline processes.
- Collaborate with Cross-Functional Teams: Work closely with software engineers, data scientists, and DevOps teams to build end-to-end solutions that cater to both application and data needs.
- Slack App Development: Design and implement Slack integrations and custom apps as required for team productivity and automation.
- Code Quality: Ensure high-quality coding standards through rigorous testing, code reviews, and writing maintainable code.
- SQL Expertise: Write efficient and optimized SQL queries for data storage, retrieval, and analysis.
- Microservices Architecture: Build and manage microservices that are modular, scalable, and decoupled.

Required Skills & Experience:
- Programming Languages: Expert in Python, with solid experience building APIs and microservices.
- Web Frameworks & APIs: Strong hands-on experience with FastAPI (and optionally Flask), designing RESTful APIs.
- Data Engineering Expertise: Strong knowledge of SQL, relational databases, and ETL processes. Experience with cloud-based data solutions is a plus.
- API & Microservices Architecture: Proven ability to design, develop, and deploy APIs and microservices architectures.
- Slack App Development: Experience integrating Slack apps or creating custom Slack workflows.
- Reusable Framework Development: Ability to design modular, configurable frameworks that can be reused across teams and systems.
- Excellent Problem-Solving Skills: Ability to break down complex problems and deliver practical solutions.
- Software Development Experience: Strong software engineering fundamentals, including version control, debugging, and deployment best practices.

Why Join Us
- Growth Opportunities: You'll work with cutting-edge technologies and continuously improve your technical skills.
- Collaborative Culture: A dynamic and inclusive team where your ideas and contributions are valued.
- Competitive Compensation: We offer a competitive salary, comprehensive benefits, and a flexible work environment.
- Innovative Projects: Be part of projects that have real-world impact and help shape the future of data and software development.

If you're passionate about working on both data and software engineering, and enjoy building scalable and efficient systems, apply today and help us innovate!
Posted 2 months ago
2.0 - 5.0 years
2 - 6 Lacs
Bengaluru
Work from Office
Job description
Job Title: ETL Tester

Job Responsibilities:
- Design and execute test cases for ETL processes to validate data accuracy and integrity.
- Collaborate with data engineers and developers to understand ETL workflows and data transformations.
- Use Tableau to create visualizations and dashboards that help in data analysis and reporting.
- Work with Snowflake to test and validate data stored in the cloud data warehouse.
- Identify, document, and track defects and issues in the ETL process.
- Perform data profiling and data quality assessments.
- Create and maintain test documentation, including test plans, test scripts, and test results.
- Exposure to Salesforce and proficiency in developing SQL queries.

The ideal candidate will have a strong background in ETL processes, data validation, and experience with Tableau and Snowflake. You will be responsible for ensuring the quality and accuracy of data as it moves through the ETL pipeline.
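A staple of the ETL test cases this role designs is source-to-target reconciliation: after a load, compare row counts and a column checksum between the staging and warehouse tables. A minimal sketch (run against SQLite here; the table and column names are invented for illustration):

```python
import sqlite3

# Set up matching source and target tables to reconcile.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_sales (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE dw_sales  (id INTEGER, amount REAL)")
rows = [(1, 10.0), (2, 20.5), (3, 30.0)]
conn.executemany("INSERT INTO stg_sales VALUES (?, ?)", rows)
conn.executemany("INSERT INTO dw_sales VALUES (?, ?)", rows)

def reconcile(conn, source, target, column):
    # Compare row counts and the sum of a numeric column between two tables.
    src = conn.execute(f"SELECT COUNT(*), TOTAL({column}) FROM {source}").fetchone()
    tgt = conn.execute(f"SELECT COUNT(*), TOTAL({column}) FROM {target}").fetchone()
    return {"rowcount_match": src[0] == tgt[0], "sum_match": src[1] == tgt[1]}

report = reconcile(conn, "stg_sales", "dw_sales", "amount")
print(report)  # {'rowcount_match': True, 'sum_match': True}
```

In practice the same check would run against Snowflake via its Python connector, and a failed match would be logged as a defect with the offending keys attached.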
Posted 2 months ago
6.0 - 11.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Analytics Technical Specialist Date 19 May 2025 Location: Bangalore, IN Company Alstom Req ID:486332 STRUCTURE, REPORTING, NETWORKS & LINKS: Organization Structure CITO |-- Data & AI Governance Vice President |-- Enterprise Data Domain Director |-- Head of Analytics Platform |-- Analytics Delivery Architect |-- Analytics Technical Specialist Organizational Reporting Reports to Delivery Manager Networks & Links Internally Transversal Digital Platforms Team. Innovation Team, Application Platform Owners, Business process owners, Infrastructure team Externally Third-party technology providers, Strategic Partners Location :Position will be based in Bangalore Willing to travel occasionally for onsite meetings and team workshops as required RESPONSIBILITIES - Design, develop, and deploy interactive dashboards and reports using MS Fabric & Qlik Cloud, ensuring alignment with business requirements and goals. Implement and manage data integration workflows utilizing MS Fabric to ensure efficient data processing and accessibility. Translate business needs to technical specifications and Design, build and deploy solutions. Understand and integrate Power BI reports into other applications using embedded analytics like Power BI service (SaaS), Teams, SharePoint or by API automation. Will be responsible for access management of app workspaces and content. Integration of PowerBi servers with different data sources and timely upgradation/services of PowerBi Able to schedule and refresh jobs on Power BI On-premise data gateway. Configure standard system reports, as well as customized reports as required. Responsible in helping various kind of database connections (SQL, Oracle, Excel etc.) with Power BI Services Investigate and troubleshoot reporting issues and problems Maintain reporting schedule and document reporting procedures Monitor and troubleshoot data flow issues, optimizing the performance of MS Fabric applications as needed. 
Optimize application performance and data models in Qlik Cloud while ensuring data accuracy and integrity. Ensure collaboration with Functional & Technical Architectsasbusiness cases are setup for each initiative,collaborate with other analytics team to drive and operationalize analytical deployment.Maintain clear and coherent communication, both verbal and written, to understand data needs and report results. Ensure compliance with internal policies and regulations Strong ability to take the lead and be autonomous Proven planning, prioritization, and organizational skills.Ability to drive change through innovation & process improvement. Be able to report to management and stakeholders in a clear and concise manner. Good to havecontribition to the integration and utilization of Denodo for data virtualization, enhancing data access across multiple sources. Document Denodo processes, including data sources and transformations, to support knowledge sharing within the team. Facilitate effective communication with stakeholders regarding project updates, risks, and resolutions to ensure transparency and alignment. Participate in team meetings and contribute innovative ideas to improve reporting and analytics solutions. EDUCATION Bachelor s/Master s degree in Computer Science Engineering /Technology or related field Experience Minimum 3 and maximum 6 years of total experience Mandatory 2+ years of experience in Power BI End-to-End Development using Power BI Desktop connecting multiple data sources (SAP, SQL, Azure, REST APIs, etc.) Experience in MS Fabric Components along with Denodo. Technical competencies Proficient in using MS Fabric for data integration and automation of ETL processes. Understanding of data governance principles for quality and security. Strong expertise in creating dashboards and reports using Power BI and Qlik. Knowledge of data modeling concepts in Qlik and Power BI. Proficient in writing complex SQL queries for data extraction and analysis. 
- Skilled in utilizing analytical functions in Power BI and Qlik.
- Experience troubleshooting performance issues in MS Fabric and Denodo.
- Experience developing visual reports, dashboards, and KPI scorecards using Power BI Desktop & Qlik.
- Understand the Power BI application security layer model.
- Hands-on with PowerPivot, role-based data security, Power Query, DAX queries, Excel pivots/charts/grids, and Power View.
- Good to have: Power BI Services and administration knowledge.
- Experience developing data models using Denodo to support business intelligence and analytics needs.
- Proficient in creating base views and derived views for effective data representation.
- Ability to implement data transformations and enrichment within Denodo.
- Skilled in using Denodo's SQL capabilities to write complex queries for data retrieval.
- Familiarity with integrating Denodo with various data sources, such as databases, web services, and big data platforms.

BEHAVIORAL COMPETENCIES
The candidate should demonstrate:
- A strong sense for collaboration and being a team player
- The ability to articulate issues and propose solutions
- A structured thought process and articulation
- Critical thinking and problem-solving skills
- An analytical bent of mind and willingness to question the status quo
- Excellent soft skills
- The ability to work as an individual contributor, be proactive, and show leadership skills
- The ability to guide and drive a team from a technical standpoint
- Excellent written, verbal, and interpersonal skills
- Self-motivation; being a quick learner is a must
- Fluency in English
- The ability to influence and deliver

You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you!

Important to note
Posted 2 months ago
7.0 - 12.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Lead Python Developer
Experience: 7+ years
Location: Bangalore/Hyderabad

Job Overview
We are seeking an experienced and highly skilled Senior Data Engineer to join our team. This role requires a combination of software development and data engineering expertise. The ideal candidate will have advanced knowledge of Python and SQL, a solid understanding of API creation (specifically REST APIs and FastAPI), and experience in building reusable and configurable frameworks.

Key Responsibilities:
- Develop APIs & Microservices: Design, build, and maintain scalable, high-performance REST APIs using FastAPI and other frameworks.
- Data Engineering: Work on data pipelines, ETL processes, and data processing for robust data solutions.
- System Architecture: Collaborate on the design and implementation of configurable and reusable frameworks to streamline processes.
- Collaborate with Cross-Functional Teams: Work closely with software engineers, data scientists, and DevOps teams to build end-to-end solutions that cater to both application and data needs.
- Slack App Development: Design and implement Slack integrations and custom apps as required for team productivity and automation.
- Code Quality: Ensure high-quality coding standards through rigorous testing, code reviews, and writing maintainable code.
- SQL Expertise: Write efficient and optimized SQL queries for data storage, retrieval, and analysis.
- Microservices Architecture: Build and manage microservices that are modular, scalable, and decoupled.

Required Skills & Experience:
- Programming Languages: Expert in Python, with solid experience building APIs and microservices.
- Web Frameworks & APIs: Strong hands-on experience with FastAPI and Flask (optional), designing RESTful APIs.
- Data Engineering Expertise: Strong knowledge of SQL, relational databases, and ETL processes. Experience with cloud-based data solutions is a plus.
- API & Microservices Architecture: Proven ability to design, develop, and deploy APIs and microservices architectures.
- Slack App Development: Experience with integrating Slack apps or creating custom Slack workflows.
- Reusable Framework Development: Ability to design modular and configurable frameworks that can be reused across various teams and systems.
- Excellent Problem-Solving Skills: Ability to break down complex problems and deliver practical solutions.
- Software Development Experience: Strong software engineering fundamentals, including version control, debugging, and deployment best practices.

Why Join Us?
- Growth Opportunities: You'll work with cutting-edge technologies and continuously improve your technical skills.
- Collaborative Culture: A dynamic and inclusive team where your ideas and contributions are valued.
- Competitive Compensation: We offer a competitive salary, comprehensive benefits, and a flexible work environment.
- Innovative Projects: Be a part of projects that have real-world impact and help shape the future of data and software development.

If you're passionate about working on both data and software engineering, and enjoy building scalable and efficient systems, apply today and help us innovate!
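The "reusable framework development" requirement above is worth making concrete. As a minimal sketch (the step names, registry design, and data shapes below are our own illustration, not anything specified in the posting), a config-driven pipeline framework in plain Python might look like:

```python
# Hypothetical sketch of a reusable, configurable pipeline framework.
# All names here are illustrative, not taken from the job posting.
from typing import Callable, Dict, List

# Registry mapping config-friendly step names to transformation functions.
STEPS: Dict[str, Callable[[list], list]] = {}

def step(name: str):
    """Decorator that registers a transformation under a given name."""
    def register(fn: Callable[[list], list]):
        STEPS[name] = fn
        return fn
    return register

@step("drop_nulls")
def drop_nulls(rows: list) -> list:
    # Keep only rows where every field has a value.
    return [r for r in rows if all(v is not None for v in r.values())]

@step("uppercase_names")
def uppercase_names(rows: list) -> list:
    return [{**r, "name": r["name"].upper()} for r in rows]

def run_pipeline(rows: list, config: List[str]) -> list:
    """Apply the steps named in `config`, in order."""
    for name in config:
        rows = STEPS[name](rows)
    return rows

if __name__ == "__main__":
    data = [{"name": "ada", "id": 1}, {"name": None, "id": 2}]
    print(run_pipeline(data, ["drop_nulls", "uppercase_names"]))
    # [{'name': 'ADA', 'id': 1}]
```

Because each step is registered by name, teams can reorder or reuse steps purely by editing a config list, which is the kind of modularity the posting asks about.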
Posted 2 months ago
3.0 - 6.0 years
5 - 8 Lacs
Bengaluru
Work from Office
The ideal candidate will have experience in data privacy and protection, with a strong understanding of regulations such as GDPR, CCPA, and ISO 27001. They will be responsible for ensuring that the organization's data handling practices are compliant with these regulations, and will work to implement data governance policies and procedures to protect sensitive information.

The successful candidate will have strong data analysis skills, with the ability to collect and analyze data to inform data governance and consent management decisions. They will be responsible for developing and implementing consent management processes, and will work to ensure that the organization is transparent and compliant in its data handling practices, with a focus on data privacy and protection.

About the Role:
In this opportunity as Data Analyst, Privacy Operations, you will:
We are seeking a detail-oriented and experienced Data Analyst to join our Data Integrity & Identity team, with a focus on Privacy Operations. This team is an enabling function within our Responsible AI pillar, responsible for ensuring the organization's compliance with data protection regulations such as GDPR, CCPA, and ISO 27001. The team is focused on helping business functions implement technology and automated solutions to enhance compliance with personal data legislation and adhere to data management best practices.

The Data Analyst will support the data integrity and identity program by applying privacy-by-design principles on a global scale, with a focus on data privacy, data protection, and consent management. This individual will assist with a variety of projects, including consent management, cookie compliance, data residency, and enterprise identity, ensuring that each aligns with our privacy objectives.
This role is ideal for someone with a keen interest in how technology and automation can be leveraged to improve the handling of personal data on a global scale, with a focus on data privacy, data protection, and consent management. The successful candidate will demonstrate a strong understanding of data analysis, data governance, and data protection principles.

About You:
You're a fit for the role of Data Analyst if you meet all or most of these criteria:
- A minimum of 3-6 years in a data analyst role, preferably within the data governance or privacy space.
- Ability to prepare engaging socialization materials to communicate the program's value and achievements to stakeholders, leveraging proficiency in MS Office (Excel and PowerPoint), Azure DevOps, Power BI, and Lucidchart.
- A strong technical understanding that enables execution of tasks such as cookie scanning and categorization, and the manipulation of extensive data sets and CSV files, to support our reporting requirements.
- Proven experience in data migration projects, ensuring smooth data transfer between systems while upholding data integrity and adhering to privacy regulations.
- Strong skills in data mapping to effectively align data sources with target systems, guaranteeing accurate and consistent data representation across different platforms.
- Proficiency in managing and analyzing large datasets, utilizing tools such as Power BI and advanced Excel techniques to facilitate data-driven insights and support decision-making.
- Experience with, or a keen interest in, consent management solutions, focusing on enhancing practices and ensuring compliance with data handling regulations.
- A degree in Information Technology, Data Science, Computer Science, or a related field.
- Proficiency in data management tools and ETL processes, with experience using Power BI and data visualisation platforms.
- Advanced expertise in MS Office, particularly MS Excel, to effectively manage and analyze large CSV files and extensive datasets, ensuring accurate data manipulation and reporting.
- Experience with large-scale data handling, including working with extensive datasets and CSV files, to support efficient data manipulation and reporting.
- Strong analytical and problem-solving skills to support data integration, mapping, migration, and data residency projects.
- Keen attention to detail to ensure data accuracy and integrity throughout the handling and reporting processes.

This position plays a vital role in ensuring our data integrity and identity practices are strong, compliant, and aligned with our strategic objectives. If you are a creative thinker passionate about enhancing personal data controls, we encourage you to apply. #LI-VGA1

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments.
At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 2 months ago
8.0 - 13.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Location: Bengaluru

We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in data engineering, with a strong focus on Databricks, Python, and SQL. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining our data infrastructure to support various business needs.

Key Responsibilities
- Develop and implement efficient data pipelines and ETL processes to migrate and manage client, investment, and accounting data in Databricks.
- Work closely with the investment management team to understand data structures and business requirements, ensuring data accuracy and quality.
- Monitor and troubleshoot data pipelines, ensuring high availability and reliability of data systems.
- Optimize database performance by designing scalable and cost-effective solutions.

What's on offer
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative and inclusive work environment.
- The chance to work on impactful projects with a talented team.

Candidate Profile
- Experience: 8+ years of experience in data engineering or a similar role.
- Proficiency in Apache Spark.
- Databricks Data Cloud, including schema design, data partitioning, and query optimization.
- Exposure to Azure.
- Exposure to streaming technologies (e.g., Autoloader, DLT streaming).
- Advanced SQL and data modeling skills, and data warehousing concepts tailored to investment management data (e.g., transaction, accounting, portfolio, and reference data).
- Experience with ETL/ELT tools such as SnapLogic and programming languages (e.g., Python, Scala, R).
- Familiarity with workload automation and job scheduling tools such as Control-M.
- Familiarity with data governance frameworks and security protocols.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Education
Bachelor's degree in computer science, IT, or a related discipline.
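The "data partitioning" skill listed above usually means laying data out so queries can prune files. As a small illustration (the lake path, table name, and Hive-style layout below are our own assumptions, not from the posting), deriving date-based partition paths for a daily batch load can be sketched in plain Python:

```python
# Illustrative sketch: building Hive-style date partition paths, a layout
# commonly used with Databricks/Delta tables. Paths and names are invented.
from datetime import date

def partition_path(base: str, table: str, d: date) -> str:
    """Build a year/month/day partition path for a daily batch load."""
    # Zero-padded month/day keep partitions lexically sortable.
    return f"{base}/{table}/year={d.year}/month={d.month:02d}/day={d.day:02d}"

print(partition_path("/mnt/lake", "transactions", date(2024, 3, 7)))
# /mnt/lake/transactions/year=2024/month=03/day=07
```

A query filtered on `year`/`month`/`day` can then skip every partition directory outside the requested range, which is the core of partition pruning.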
Posted 2 months ago
3.0 - 5.0 years
32 - 37 Lacs
Mumbai
Work from Office
Job Title: Lead Business Analyst, AVP
Location: Mumbai, India

Role Description
As a BA you are expected to design and deliver critical senior-management dashboards and analytics using tools such as Tableau, Power BI, etc. These management packs should enable management to make timely decisions for their respective businesses and create a sound foundation for analytics. You will need to collaborate closely with senior business managers, data engineers, and stakeholders from other teams to understand requirements and translate them into visually pleasing dashboards and reports. You will play a crucial role in analyzing business data and generating valuable insights for other strategic ad hoc exercises.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities
- Collaborate with business users and managers to gather requirements and understand business needs to design optimal solutions.
- Perform ad hoc data analysis per business needs to generate reports, visualizations, and presentations that help strategic decision-making.
- Source information from multiple sources and build a robust data pipeline model.
- Work on large and complex data sets to produce useful insights.
- Perform audit checks ensuring integrity and accuracy across all spectrums before implementing findings.
- Ensure timely refreshes to provide the most up-to-date information in dashboards/reports.
- Identify opportunities for process improvements and optimization based on data insights.
- Communicate project status updates and recommendations.

Your skills and experience
- Bachelor's degree in computer science, IT, business administration, or a related field
- Minimum of 5 years of experience in visual reporting development, including hands-on development of analytics dashboards and working with complex data sets
- Minimum of 3 years with Tableau, Power BI, or any other BI tool
- Excellent Microsoft Office skills, including advanced Excel skills
- Comprehensive understanding of data visualization best practices
- Experience with data analysis, modeling, and ETL processes is advantageous
- Excellent knowledge of database concepts and extensive hands-on experience working with SQL
- Strong analytical, quantitative, problem-solving, and organizational skills
- Attention to detail and ability to coordinate multiple tasks, set priorities, and meet deadlines
- Excellent communication and writing skills

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
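The SQL expectation in roles like this typically centers on aggregation queries that feed dashboard visuals. As a minimal sketch (sqlite3 stands in for the actual warehouse, and the table and column names are invented for the example):

```python
# Illustrative dashboard-feed aggregation; sqlite3 is a stand-in for the
# real database, and the table/column names are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("APAC", 120.0), ("APAC", 80.0), ("EMEA", 250.0)],
)

# Aggregate per region: the typical shape behind a bar-chart visual.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total, COUNT(*) AS n "
    "FROM trades GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)
# [('EMEA', 250.0, 1), ('APAC', 200.0, 2)]
```

In practice the same `GROUP BY` result would be materialized or refreshed on a schedule so the dashboard always reads pre-aggregated data.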
Posted 2 months ago
6.0 - 10.0 years
2 - 5 Lacs
Chennai
Work from Office
Job Information
- Job Opening ID: ZR_1999_JOB
- Date Opened: 17/06/2023
- Industry: Technology
- Work Experience: 6-10 years
- Job Title: ETL Tester
- City: Chennai
- Province: Tamil Nadu
- Country: India
- Postal Code: 600001
- Number of Positions: 1

- Create test case documents/plans for testing the data pipelines.
- Check the mapping for the fields that support data staging and the data marts, and the data type constraints of the fields present in Snowflake.
- Verify non-null fields are populated.
- Verify business requirements and confirm the correct logic is implemented in the transformation layer of the ETL process.
- Verify stored procedure calculations and data mappings.
- Verify data transformations are correct based on the business rules.
- Verify successful execution of data loading workflows.
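The verification steps above (non-null checks, transformation-rule checks) are routinely automated in ETL test suites. A minimal stdlib sketch, with field names and the sample business rule entirely of our own invention:

```python
# Hypothetical ETL validation checks; the field names and the
# amount * rate business rule below are illustrative, not from the posting.

def check_non_null(rows, fields):
    """Return the rows where any required field is missing."""
    return [r for r in rows if any(r.get(f) is None for f in fields)]

def check_transformation(source_rows, target_rows):
    """Verify a sample rule: target amount = source amount * rate."""
    failures = []
    for s, t in zip(source_rows, target_rows):
        if round(s["amount"] * s["rate"], 2) != t["amount"]:
            failures.append(t["id"])
    return failures

source = [{"id": 1, "amount": 100.0, "rate": 1.1},
          {"id": 2, "amount": 50.0, "rate": 2.0}]
target = [{"id": 1, "amount": 110.0},
          {"id": 2, "amount": 100.0}]

# Both checks return empty lists when the load is clean.
assert check_non_null(target, ["id", "amount"]) == []
assert check_transformation(source, target) == []
```

Each bullet in the test plan maps to one such check, so a failing load surfaces as a named list of offending row IDs rather than a manual spot-check.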
Posted 2 months ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
We are looking for a Data Engineer with experience in data warehouse projects, strong expertise in Snowflake, and hands-on knowledge of Azure Data Factory (ADF) and dbt (Data Build Tool). Proficiency in Python scripting is an added advantage.

Key Responsibilities:
- Design, develop, and optimize data pipelines and ETL processes for data warehousing projects.
- Work extensively with Snowflake, ensuring efficient data modeling and query optimization.
- Develop and manage data workflows using Azure Data Factory (ADF) for seamless data integration.
- Implement data transformations, testing, and documentation using dbt.
- Collaborate with cross-functional teams to ensure data accuracy, consistency, and security.
- Troubleshoot data-related issues.
- (Optional) Utilize Python for scripting, automation, and data processing tasks.

Required Skills & Qualifications:
- Experience in data warehousing with a strong understanding of best practices.
- Hands-on experience with Snowflake (data modeling, query optimization).
- Proficiency in Azure Data Factory (ADF) for data pipeline development.
- Strong working knowledge of dbt (Data Build Tool) for data transformations.
- (Optional) Experience in Python scripting for automation and data manipulation.
- Good understanding of SQL and query optimization techniques.
- Experience in cloud-based data solutions (Azure).
- Strong problem-solving skills and ability to work in a fast-paced environment.
- Experience with CI/CD pipelines for data engineering.

Why Join Us?
- Opportunity to work on cutting-edge data engineering projects.
- Work with a highly skilled and collaborative team.
- Exposure to modern cloud-based data solutions.
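A pattern that comes up constantly in Snowflake pipeline work like this is the idempotent MERGE/upsert into a warehouse table. As a minimal sketch (sqlite3 stands in for Snowflake, and the `dim_customer` table and its columns are invented for the example):

```python
# Illustrative MERGE-style upsert; sqlite3 is a stand-in for Snowflake,
# and the table/column names are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")

def upsert(rows):
    """Insert new keys, update existing ones; safe to re-run (idempotent)."""
    conn.executemany(
        "INSERT INTO dim_customer (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        rows,
    )
    conn.commit()

upsert([(1, "Asha"), (2, "Ravi")])
upsert([(2, "Ravi K"), (3, "Meena")])  # re-run: one update, one insert

print(conn.execute("SELECT id, name FROM dim_customer ORDER BY id").fetchall())
# [(1, 'Asha'), (2, 'Ravi K'), (3, 'Meena')]
```

Snowflake expresses the same idea with a `MERGE INTO ... WHEN MATCHED / WHEN NOT MATCHED` statement; the key property in both cases is that replaying a batch never duplicates rows.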
Posted 2 months ago
4.0 - 6.0 years
2 - 5 Lacs
Pune
Work from Office
Job Information
- Job Opening ID: ZR_1764_JOB
- Date Opened: 23/03/2023
- Industry: Technology
- Work Experience: 4-6 years
- Job Title: PAPM
- City: Pune
- Province: Maharashtra
- Country: India
- Postal Code: 411013
- Number of Positions: 1

- BPC Embedded 10.1 on HANA.
- Creation/designing of BW objects such as InfoObjects, InfoCubes, Providers, and DataStore Objects (ADSO).
- Knowledge of the BW ETL process and maintaining master data and hierarchies for BW objects.
- Deep knowledge of Query Designer/Eclipse.
- Knowledge of BW process chain run, schedule, monitoring, and troubleshooting.
- Understanding of AfO template creation/update, including macro programming for AfO.
- Knowledge of the planning area/model, such as aggregation level, planning filter, planning function, and planning sequence as part of RSPLAN.
- Knowledge of characteristic relationships and data slices.
- Knowledge of FOX programming and ABAP.

Location: Pan India
Posted 2 months ago
0.0 - 1.0 years
3 - 6 Lacs
Pune
Work from Office
Role & responsibilities

Key Responsibilities:
- Provide production support for SQL Server databases, ensuring high availability and performance.
- Develop and implement change requests for SQL Server databases based on business requirements.
- Develop and maintain stored procedures and user-defined functions.
- Collaborate with cross-functional teams to understand and address data-related issues.
- Optimize and tune SQL queries to improve performance.
- Maintain and update database documentation.
- Utilize Informatica for ETL processes and data integration tasks.
- Troubleshoot and resolve database-related issues promptly.
- Ensure data integrity and security across all database systems.

Qualifications we seek in you!
Minimum Qualifications:
- Bachelor's degree in computer science, Information Technology, or a related field.
- Rich experience in SQL Server development.
- Good experience with Informatica.
- Strong knowledge of SQL programming and database management.
- Experience with performance tuning and query optimization.
- Familiarity with ETL processes and data integration using Informatica.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Preferred Skills:
- Experience with Python is a plus.
- Knowledge of data warehousing concepts.
- Familiarity with Agile methodologies.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 2 months ago
4.0 - 7.0 years
6 - 10 Lacs
Gurugram
Work from Office
Public Services Industry Strategist

Join our team in Strategy for an exciting career opportunity to work on the Industry Strategy agenda of our most strategic clients across the globe!

Practice: Industry Strategy, Global Network (GN)
Areas of Work: Strategy experience in the Public Services industry: operating model and organization design, strategic roadmap design, citizen experience, business case development (incl. financial modelling), transformation office, sustainability, digital strategy, data strategy, Gen AI, cloud strategy, cost optimization strategy
Domain: Public Services: Social Services, Education, Global Critical Infrastructure Services, Revenue, Post & Parcel
Level: Consultant
Location: Gurgaon, Mumbai, Bengaluru, Chennai, Kolkata, Hyderabad & Pune
Years of Exp: 4-7 years of strategy experience post-MBA from a Tier 1 institute

Explore an Exciting Career at Accenture
Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse, and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Strategy.

The Practice - A Brief Sketch:
The GN Strategy Industry Group is a part of Accenture Strategy and focuses on the CXO's most strategic priorities. We help clients with strategies that are at the intersection of business and technology, drive value and impact, shape new businesses, and design operating models for the future.
As a part of this high-performing team, you will:
- Apply advanced corporate finance to drive value using financial levers, value case shaping, and feasibility studies to evaluate new business opportunities.
- Analyze competitive benchmarking to advise the C-suite on 360° value opportunities, use scenario planning to solve complex C-suite questions, and lead & enable strategic conversations.
- Identify strategic cost take-out opportunities, drive business transformation, and suggest value-based decisions based on insights from data.
- Apply advanced data analyses to unlock client value aligned with the client's business strategy.
- Build future-focused points of view and develop strategic ecosystem partners.
- Build client strategy definitions leveraging disruptive technology solutions, like Data & AI, including Gen AI, and Cloud.
- Build relationships with C-suite executives and be a trusted advisor enabling clients to realize the value of human-centered change.
- Create thought leadership in industry/functional areas, reinvention agendas, solution tablets, and assets for value definition, and use it, along with your understanding of the industry value chain and macroeconomic analyses, to inform clients' strategy.
- Partner with CXOs to architect future-proof operating models embracing the future of work, workforce, and workplace, powered by transformational technology, ecosystems, and analytics.
- Work with our ecosystem partners to help clients reach their sustainability goals through digital transformation.
- Prepare and deliver presentations to clients to communicate strategic plans and recommendations on PS domains such as Digital Citizen, Public Infrastructure, Smart Buildings, and Net Zero.
- Monitor industry trends and keep clients informed of potential opportunities and threats.

The candidate will be required to have exposure to core strategy projects in the Public Services domain with a focus on one of the sub-industries within Public Service (mentioned below), specifically:

Public Service Experience: The candidate must have strategy experience in at least one of the below Public Service sub-industries:
- Social Services (Employment, Pensions, Education, Child Welfare, Government as a Platform, Digital Citizen Services)
- Education
- Global Critical Infrastructure Services (urban & city planning, smart cities, high-performing city operating model)
- Admin (citizen experience, federal funds strategy, workforce strategy, intelligent back office, revenue industry strategy, Post & Parcel)

Strategy Skills and Mindsets Expected:
- A strategic mindset to shape innovative, fact-based strategies and operating models
- Communication and presentation skills to hold influential C-suite dialogues, narratives, and conversations, and to share ideas
- Ability to solve problems in unstructured scenarios, and to decode and solve complex and unstructured business questions
- An analytical and outcome-driven approach to perform data analyses & generate insights, and to apply these insights for strategic outcomes

Qualifications
- Value-driven business acumen to drive actionable outcomes for clients with the latest industry trends, innovations and disruptions, metrics, and value drivers
- Financial acumen and value creation to develop relevant financial models to back up a business case
- Articulation of a strategic and future vision
- Ability to identify technology disruptions in the Public Services industry

What's in it for you?
- An opportunity to work on transformative projects with key G2000 clients and CxOs.
- Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
- Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
- Personalized training modules to develop your strategy & consulting acumen and grow your skills, industry knowledge, and capabilities.
- Opportunity to thrive in a culture committed to accelerating equality for all.
- Engagement in boundaryless collaboration across the entire organization.

About Accenture:
Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology, and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With more than 732,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at

About Accenture Strategy & Consulting:
Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, and operating and business models, as well as the workforce of the future, helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers, and communities. This is our unique differentiator.
To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Global Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Global Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit en /careers/local/capability-network- careers Accenture Global Network | Accenture in One Word. At the heart of every great change is a great human. If you have ideas, ingenuity and a passion for making a difference, come and be a part of our team.
Posted 2 months ago
5.0 - 10.0 years
7 - 12 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements in a dynamic work environment.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Stay updated with the latest technologies and trends.
- Conduct regular knowledge-sharing sessions.

Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of ETL processes.
- Experience with data integration and data quality management.
- Hands-on experience in developing and optimizing data workflows.
- Knowledge of data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 months ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions to enhance business operations and efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to understand project requirements.
- Develop and implement software solutions to meet business needs.
- Conduct code reviews and provide feedback to team members.
- Troubleshoot and debug applications to ensure optimal performance.
- Stay updated on industry trends and technologies to suggest improvements.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes and data integration.
- Experience with data warehousing concepts and methodologies.
- Hands-on experience in developing and optimizing ETL workflows.
- Knowledge of SQL and database management systems.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 months ago
5.0 - 10.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Informatica Data Quality
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems, and be involved in the end-to-end data management process.

Roles & Responsibilities:
- Gather, process, and analyze data to generate meaningful insights that drive strategic decision-making.
- Design and implement efficient data collection systems to enhance accuracy and optimize workflow.
- Utilize statistical techniques to interpret data and create detailed reports that support business goals.
- Work closely with cross-functional teams to solve business challenges through data-driven solutions.
- Demonstrate strong proficiency in SQL/SOQL for data querying and analysis.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Informatica Data Quality.
- Strong understanding of data modeling and data architecture.
- Experience with SQL and database management systems.
- Hands-on experience with data integration tools.
- Knowledge of data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica Data Quality.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
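The pipeline work described above (extract, transform with a data-quality rule, load) can be sketched in miniature. This is an illustrative example only, not Informatica-specific; the table names (`raw_orders`, `orders_clean`) and the validation rule are hypothetical, and SQLite stands in for whatever source and target systems a real pipeline would use.

```python
import sqlite3

def run_etl(src_conn, dst_conn):
    """Extract rows from a source table, apply a simple data-quality
    rule, and load the surviving rows into a target table."""
    # Extract: pull raw records from the (hypothetical) source table
    rows = src_conn.execute("SELECT id, name, amount FROM raw_orders").fetchall()
    # Transform: normalize names and drop rows with missing/negative amounts
    clean = [(i, n.strip().upper(), a) for (i, n, a) in rows
             if a is not None and a >= 0]
    # Load: write the validated rows into the target table
    dst_conn.executemany("INSERT INTO orders_clean VALUES (?, ?, ?)", clean)
    dst_conn.commit()
    return len(clean)

# Demo with in-memory databases
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_orders (id INTEGER, name TEXT, amount REAL)")
src.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                [(1, " alice ", 10.0), (2, "bob", -5.0), (3, "carol", 7.5)])
dst.execute("CREATE TABLE orders_clean (id INTEGER, name TEXT, amount REAL)")
loaded = run_etl(src, dst)
print(loaded)  # 2 rows pass the quality rule
```

Production ETL tools express the same extract/transform/load stages as visual mappings and workflows rather than hand-written code, but the underlying shape is the one shown here.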
Posted 2 months ago
5.0 - 10.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Talend ETL
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems, and be involved in the end-to-end data management process.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the design and implementation of data solutions.
- Optimize and troubleshoot ETL processes.
- Conduct data analysis and provide insights for decision-making.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Talend ETL.
- Strong understanding of data modeling and database design.
- Experience with data integration and data warehousing concepts.
- Hands-on experience with SQL and scripting languages.
- Knowledge of cloud platforms and big data technologies.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend ETL.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 months ago
5.0 - 10.0 years
13 - 18 Lacs
Gurugram
Work from Office
Project Role: Quality Engineering Lead (Test Lead)
Project Role Description: Leads a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Applies business and functional knowledge to develop end-to-end testing strategies through the use of quality processes and methodologies. Applies testing methodologies, principles and processes to define and implement key metrics to manage and assess the testing process, including test execution and defect resolution.
Must have skills: Data Warehouse ETL Testing
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Quality Engineering Lead (Test Lead), you will lead a team of quality engineers through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. You will apply business and functional knowledge to develop end-to-end testing strategies using quality processes and methodologies.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead team planning and ecosystem integration.
- Develop end-to-end testing strategies.
- Define and implement key metrics to manage and assess the testing process.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Data Warehouse ETL Testing.
- Strong understanding of ETL processes.
- Experience in data validation and reconciliation.
- Knowledge of SQL for data querying and validation.
- Must-have additional skill: Python.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Warehouse ETL Testing.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
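The data validation and reconciliation work this role calls for often starts with a count-and-sum check between a staging table and the warehouse target. Below is a minimal sketch in Python (the role's listed scripting skill) using SQLite; the table names (`stg_sales`, `dw_sales`) and column choices are hypothetical, and a real test harness would compare across two databases rather than one connection.

```python
import sqlite3

def reconcile(conn, source_table, target_table, key, measure):
    """Compare row counts and a summed measure between a source and a
    target table; a common smoke test after an ETL load."""
    src_cnt, src_sum = conn.execute(
        f"SELECT COUNT({key}), COALESCE(SUM({measure}), 0) FROM {source_table}"
    ).fetchone()
    tgt_cnt, tgt_sum = conn.execute(
        f"SELECT COUNT({key}), COALESCE(SUM({measure}), 0) FROM {target_table}"
    ).fetchone()
    return {"count_match": src_cnt == tgt_cnt, "sum_match": src_sum == tgt_sum}

# Demo: staging and warehouse tables that agree on both metrics
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_sales (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE dw_sales (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO stg_sales VALUES (?, ?)", [(1, 100.0), (2, 250.0)])
conn.executemany("INSERT INTO dw_sales VALUES (?, ?)", [(1, 100.0), (2, 250.0)])
result = reconcile(conn, "stg_sales", "dw_sales", "id", "amount")
print(result)  # {'count_match': True, 'sum_match': True}
```

Count and sum checks catch dropped or duplicated rows cheaply; row-by-row key comparisons are the usual follow-up when either metric diverges.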
Posted 2 months ago
12.0 - 17.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Talend ETL
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer Lead, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. A typical day involves working on data solutions and ETL processes.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead data architecture design.
- Implement data integration solutions.
- Optimize ETL processes.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Talend ETL.
- Strong understanding of data modeling.
- Experience with SQL and database management.
- Knowledge of cloud platforms like AWS or Azure.
- Hands-on experience with data warehousing.
- Good To Have Skills: Experience with data visualization tools.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Talend ETL.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
Posted 2 months ago