5.0 - 9.0 years
0 Lacs
Haryana
On-site
As a Global Data Steward at Axalta, you will be responsible for managing master data objects (creation, update, obsolescence, and reactivation) to ensure the smooth operation of business processes. You will interact with various business units to clarify requests and ensure accurate data maintenance within defined timelines. Your role will also involve testing master data across tools and interfaces, engaging in additional projects, and mentoring team members as needed. Flexibility to work in shifts is required to meet deliverables effectively.

Key requirements for this role include hands-on experience in master data creation and maintenance, particularly for Material, Vendor, Pricing, Customer, PIRs, Source List, and BOM data. Proficiency in SAP toolsets for data management, data extraction programs, ETL processes, load programs, data quality maintenance, and cleansing is essential. Knowledge of request management tools, database concepts, data models, and the relationships between different data types is crucial. Familiarity with SAP, S/4HANA, SAP MDG, Ariba, and other ERP platforms is desirable. Additionally, experience in data management processes, functional knowledge of SAP MM/PP or OTC modules, and around 5-6 years of professional experience are preferred. Good communication skills, stakeholder alignment, and the ability to work with international colleagues are important, along with a strong sense of ownership, a drive to excel and deliver results, the ability to resolve conflicts, and effective collaboration as a team player.

Axalta, a leading player in the coatings industry, operates in two segments, Performance Coatings and Mobility Coatings, serving markets such as Refinish, Industrial, Light Vehicle, and Commercial Vehicle across various regions globally. The company has set ambitious sustainability goals for the future, including carbon neutrality by 2040, and is dedicated to collaborating with customers to optimize their businesses and achieve mutual goals. With a rich portfolio of brands and a global presence in over 140 countries, Axalta continues to innovate solutions that protect and beautify products while contributing to a sustainable future.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
As a Senior Sales Analytics Specialist at NTT DATA, you will play a crucial role in driving the success of sales operations through comprehensive data analysis, valuable insights, and strategic decision support. Working within a multifaceted environment, you will collaborate with cross-functional teams to provide data-driven support for business planning and strategic decision-making, leveraging a deep understanding of the business context.

Your key responsibilities will include driving tactical and strategic projects with virtual teams, analyzing complex business problems using internal and external data, defining and tracking metrics and dashboard requirements, and providing strategic decision support to help the team answer critical questions and design new initiatives. You will also be responsible for data validation, creating reports, and presenting actionable insights to stakeholders.

To excel in this role, you will need an advanced understanding of data analysis techniques, business sales objectives, market dynamics, and industry trends. Strong collaboration skills, the ability to translate complex data insights into actionable strategies, and excellent communication and presentation skills are essential. Proficiency in data analysis tools such as Excel and Power BI, coding languages, SQL, data security best practices, and ETL processes will be key to your success. You should hold a Bachelor's degree or equivalent in Data Science or a related field; relevant sales analytics certifications are desirable. You should also have demonstrated experience as a data analyst in a sales or marketing function, proficiency in Power BI and statistical analysis techniques, and a proven track record of creating and optimizing reports that contribute to strategic decision support.

This is a remote working position at NTT DATA, a trusted global innovator of business and technology services committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team of experts in over 50 countries, NTT DATA invests significantly in R&D to support organizations in moving confidently into the digital future. As a Global Top Employer, NTT DATA offers a wide range of services, including business and technology consulting, data and artificial intelligence solutions, and the development and management of applications and infrastructure. Join us at NTT DATA and be part of a leading provider of digital and AI infrastructure in the world.
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As a Data Engineer at Birlasoft, a global leader in Cloud, AI, and Digital technologies, you will play a crucial role in designing and developing data transformations and data models. Your primary responsibility will be to ensure reliable and efficient data processing and analysis to support data-driven decision-making. Working closely with cross-functional teams, you will contribute to the overall success of our insights teams.

Your key proficiencies should include expertise in DBT (Data Build Tool) for data transformation and modeling, and proficiency in Snowflake, including experience with Snowflake SQL and data warehousing concepts. A strong understanding of data architecture, data modeling, and data warehousing best practices is essential for this role.

In this position, you will design, develop, and maintain robust data pipelines using DBT and Snowflake, and implement and optimize data ingestion processes to ensure efficient and accurate data flow from various sources. Collaboration with data scientists, analysts, and stakeholders is crucial to understand data requirements and ensure data integrity and quality.

You should have proven experience in data ingestion and ETL processes; experience with other ETL tools and technologies such as Apache Airflow, Talend, or Informatica is a plus. Proficiency in SQL and experience with programming languages such as Python or Java are required, as is familiarity with cloud platforms and services, especially AWS, including AWS Lambda.

You are expected to adhere to and promote development best practices, including version control using Git and branching models, and to review code to ensure consistent coding standards and practices. Participation in scrum methodology, including daily stand-ups, sprint planning, and retrospectives, is essential, as is effective communication with team members and stakeholders to understand requirements and provide updates. Ownership of assigned tasks and the ability to work independently will contribute to your success in this role. Staying up to date with the latest trends and technologies in data engineering, DBT, and Snowflake is important to ensure continuous improvement and innovation in your work.
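For illustration only (not part of the posting): an idempotent Snowflake load step of the kind this role describes might look like the following Python sketch, using the snowflake-connector-python package. All table, column, and connection names here are hypothetical.

```python
# Hedged sketch: an idempotent MERGE-based load into Snowflake.
# Every identifier below (tables, columns, account) is a made-up example.
import snowflake.connector

MERGE_SQL = """
MERGE INTO analytics.dim_customer AS tgt
USING staging.customers_raw AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET
    tgt.email = src.email,
    tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
    VALUES (src.customer_id, src.email, src.updated_at);
"""

def run_merge() -> int:
    # Credentials would normally come from a secrets manager, not literals.
    conn = snowflake.connector.connect(
        account="my_account",   # hypothetical
        user="etl_user",        # hypothetical
        password="...",         # placeholder
        warehouse="TRANSFORM_WH",
        database="ANALYTICS_DB",
        schema="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        cur.execute(MERGE_SQL)
        return cur.rowcount     # rows inserted or updated by the MERGE
    finally:
        conn.close()
```

A MERGE keyed on a stable identifier makes re-runs safe, which is what makes a pipeline step like this robust to retries.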
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
The purpose of this role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. You should have strong proficiency in SQL, a solid understanding of ETL processes and data warehousing concepts, hands-on experience with Informatica and Teradata, exposure to a reporting tool, and strong analytical and problem-solving skills. Knowledge of Python and familiarity with Google Cloud Platform (GCP) would be a plus. The ideal candidate has 5-8 years of experience in a similar role. The location for this position is Bangalore.

Join us at Wipro, where we are building a modern Wipro as an end-to-end digital transformation partner with bold ambitions. We are looking for individuals inspired by reinvention, eager to evolve themselves, their careers, and their skills. We believe in constant evolution as the world changes around us. Be part of a purpose-driven business that empowers you to design your reinvention and realize your ambitions at Wipro. We welcome applications from people with disabilities.
Posted 5 days ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
You're ready to gain the skills and experience needed to grow within your role and advance your career, and we have the perfect software engineering opportunity for you. As a Software Engineer II at JPMorgan Chase within the Corporate and Investment Bank, you are part of an agile team that works to enhance, design, and deliver the software components of the firm's state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics.
- Frequently utilize SQL, and understand NoSQL databases and their niche in the marketplace.
- Collaborate closely with cross-functional teams to develop efficient data pipelines that support various data-driven initiatives.
- Implement best practices for data engineering, ensuring data quality, reliability, and performance.
- Contribute to data modernization efforts by leveraging cloud solutions and optimizing data processing workflows.
- Perform data extraction and implement complex data transformation logic to meet business requirements.
- Leverage advanced analytical skills to improve data pipelines and ensure data delivery is consistent across projects.
- Monitor and execute data quality checks to proactively identify and address anomalies; ensure data availability and accuracy for analytical purposes.
- Identify opportunities for process automation within data engineering workflows.
- Communicate technical concepts to both technical and non-technical stakeholders.
- Deploy and manage containerized applications using Kubernetes (EKS) and Amazon ECS.
- Implement data orchestration and workflow automation using AWS Step Functions and Amazon EventBridge.
- Use Terraform for infrastructure provisioning and management, ensuring a robust and scalable data infrastructure.

Required qualifications, capabilities, and skills:
- Formal training or certification in data engineering concepts and 3+ years of applied experience.
- Experience across the data lifecycle.
- Advanced SQL skills (e.g., joins and aggregations) and advanced knowledge of RDBMS such as Aurora.
- Experience building microservice-based components using ECS or EKS.
- Working understanding of NoSQL databases.
- 4+ years of data engineering experience building and optimizing data pipelines, architectures, and data sets (Glue or Databricks ETL).
- Proficiency in object-oriented and functional scripting languages such as Python.
- Experience developing ETL processes and workflows for streaming data from heterogeneous data sources.
- Willingness and ability to learn and pick up new skill sets.
- Experience working with modern data lakes such as Databricks.
- Experience building pipelines on AWS using Terraform and CI/CD pipelines.

Preferred qualifications, capabilities, and skills:
- Experience with data pipeline and workflow management tools (Airflow, etc.).
- Strong analytical and problem-solving skills, with attention to detail.
- Ability to work independently and collaboratively in a team environment.
- Good communication skills, with the ability to convey technical concepts to non-technical stakeholders.
- A proactive approach to learning and adapting to new technologies and methodologies.
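For illustration only (not part of the posting): the orchestration duty above, kicking off a pipeline run on AWS Step Functions, could be sketched with boto3 as below. The state machine ARN and input payload are hypothetical.

```python
# Hedged sketch: start a Step Functions execution for a daily ETL run.
# The ARN and payload fields are invented for the example.
import json
import boto3

def trigger_pipeline_run(business_date: str) -> str:
    sfn = boto3.client("stepfunctions")
    response = sfn.start_execution(
        stateMachineArn=(
            "arn:aws:states:us-east-1:123456789012:"
            "stateMachine:daily-etl"           # hypothetical state machine
        ),
        name=f"daily-etl-{business_date}",     # unique execution name
        input=json.dumps({"business_date": business_date}),
    )
    return response["executionArn"]
```

In a setup like the one described, the state machine itself and the EventBridge schedule that invokes it would be provisioned through Terraform rather than by hand.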
Posted 5 days ago
1.0 - 3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - AWS Data & Analytics Architect.

Role Overview: We are seeking an experienced AWS Data & Analytics Architect with a strong delivery background and excellent communication skills. The ideal candidate will have a proven track record in managing teams and client relationships. You will be responsible for leading data modernization and transformation projects using AWS services.

Key Responsibilities:
- Lead and architect data modernization/transformation projects using AWS services.
- Manage and mentor a team of data engineers and analysts.
- Build and maintain strong client relationships, ensuring successful project delivery.
- Design and implement scalable data architectures and solutions.
- Oversee the migration of large datasets to AWS, ensuring data integrity and security.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Ensure best practices in data management and governance are followed.

Qualifications we seek in you! Minimum Qualifications:
- Experience in data architecture and analytics.
- Hands-on experience with AWS services such as Redshift, S3, Glue, Lambda, RDS, and others.
- Proven experience delivering 1-2 large data migration/modernization projects using AWS.
- Strong leadership and team management skills.
- Excellent communication and interpersonal skills.
- Deep understanding of data modeling, ETL processes, and data warehousing.
- Experience with data governance and security best practices.
- Ability to work in a fast-paced, dynamic environment.

Preferred Qualifications:
- AWS Certified Solutions Architect - Professional or AWS Certified Big Data - Specialty.
- Experience with other cloud platforms (e.g., Azure, GCP) is a plus.
- Familiarity with machine learning and AI technologies.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 6 days ago
0.0 - 2.0 years
1 - 2 Lacs
Bhubaneswar, Odisha, India
On-site
Description: We are seeking a motivated SQL Developer with 0-2 years of experience to join our growing team in India. The ideal candidate will have a strong foundation in SQL and database management, with a passion for data and a desire to learn and grow within the role.

Responsibilities:
- Design, develop, and maintain SQL databases to support business applications.
- Write efficient SQL queries for data retrieval, manipulation, and analysis.
- Collaborate with other team members to gather requirements and provide technical solutions.
- Ensure data integrity and security within SQL databases.
- Optimize database performance and troubleshoot issues as they arise.
- Assist in the development and execution of data migration strategies.

Skills and Qualifications:
- Proficient in SQL and database management systems such as MySQL, PostgreSQL, or Microsoft SQL Server.
- Understanding of database design principles and normalization.
- Familiarity with data modeling tools and techniques.
- Basic knowledge of programming languages such as Python or Java is a plus.
- Ability to troubleshoot and optimize SQL queries for performance.
- Strong analytical and problem-solving skills.
- Good communication skills and ability to work in a team.
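For illustration only (not part of the posting): "optimize SQL queries for performance" often comes down to indexing the columns a query filters on. This standard-library Python sketch shows the effect; the table and data are invented.

```python
# Hedged sketch: timing the same aggregate query before and after an index.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 1000, i * 0.5) for i in range(200_000)],
)

def timed_lookup() -> float:
    start = time.perf_counter()
    conn.execute(
        "SELECT SUM(total) FROM orders WHERE customer_id = ?", (42,)
    ).fetchone()
    return time.perf_counter() - start

before = timed_lookup()   # full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = timed_lookup()    # index seek
print(f"scan: {before:.4f}s, indexed: {after:.4f}s")
```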
Posted 6 days ago
15.0 - 18.0 years
3 - 6 Lacs
Bengaluru, Karnataka, India
On-site
Qualifications:
- 15+ years of experience with deep expertise in data architecture principles, data modeling, data integration, data governance, and data management technologies.
- Proven experience in formulating data strategies and developing logical and physical data models on RDBMS, NoSQL, and cloud-native databases.
- Expertise in one or more RDBMS platforms (Oracle, DB2, SQL Server, etc.) and experience implementing multiple data models with data security and access control mechanisms.
- Demonstrated ability to lead large, complex teams and manage cross-functional collaboration.

Key Responsibilities:
- Analyze and understand business requirements, translating them into conceptual, logical, and physical data models.
- Serve as a principal advisor on data architecture across various domains, including data requirements, aggregation, data lakes, data models, and data warehouses.
- Lead cross-functional teams in defining data strategies and adopting the latest technologies for effective data management.
- Define and enforce data architecture principles, standards, and best practices to ensure consistency, scalability, and security across all data assets.
- Recommend and tailor the best data modeling approaches based on client requirements and target architectures.
Posted 6 days ago
4.0 - 7.0 years
4 - 7 Lacs
Gurgaon, Haryana, India
On-site
Responsibilities:
- Design, develop, and maintain data integration processes using Oracle Data Integrator (ODI).
- Create and manage ODI mappings, packages, and scenarios to meet business requirements.
- Perform data extraction, transformation, and loading (ETL) processes to ensure data accuracy and consistency.
- Collaborate with cross-functional teams to gather and understand business requirements and translate them into technical specifications.
- Optimize and tune ODI processes for performance and scalability.
- Implement data quality checks and error handling mechanisms in ODI.
- Develop and maintain data warehouse (DWH) solutions, including data modeling, schema design, and ETL processes.
- Implement and manage multiple data load strategies (e.g., incremental loads, full loads) based on business requirements and data volumes.
- Write and optimize complex SQL queries for data extraction, transformation, and reporting.
- Provide technical support and troubleshooting for ODI processes and data warehouse solutions.
- Stay updated with the latest industry trends and best practices in data integration, SQL, and data warehousing.

Qualifications:
- A/SA level, with 4 to 8 years of experience and at least 2 project lifecycles (BE/BTech/MTech).
- Proven experience in designing and developing data integration processes using Oracle Data Integrator (ODI).
- Strong knowledge of SQL and experience in writing and optimizing complex SQL queries.
- Experience with data modeling, schema design, and data warehousing concepts.
- Experience with multiple data load strategies (e.g., incremental loads, full loads).
- Experience with scheduling tools and processes for data integration workflows.
- Familiarity with data warehousing best practices and ETL processes.
- Excellent problem-solving and analytical skills.
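For illustration only (not part of the posting): the incremental-load strategy mentioned above typically rests on a watermark pattern, shown here in plain Python rather than in ODI. The table, columns, and DB-API style connection (whose execute returns a cursor, as sqlite3's does) are assumptions.

```python
# Hedged sketch: watermark-driven incremental extract.
# src_invoices and its columns are hypothetical.
from datetime import datetime

def incremental_extract(conn, last_watermark: datetime):
    """Pull only rows changed since the previous run."""
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM src_invoices WHERE updated_at > ?",
        (last_watermark,),
    ).fetchall()
    # The new watermark is the latest change timestamp seen in this batch;
    # persisting it lets the next run resume exactly where this one stopped.
    new_watermark = max((r[2] for r in rows), default=last_watermark)
    return rows, new_watermark
```

A full load would simply skip the WHERE clause and truncate-and-reload the target, which is why the choice between the two strategies hinges on data volumes.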
Posted 6 days ago
3.0 - 6.0 years
3 - 6 Lacs
Hyderabad, Telangana, India
On-site
Key Responsibilities:
- Analyze large datasets to extract actionable and meaningful insights.
- Develop and maintain dashboards and reports to track key metrics.
- Collaborate with cross-functional teams to support data-driven decision-making processes.
- Apply advanced analytical techniques to solve complex business challenges.
- Present findings and strategic recommendations clearly to stakeholders.

Requirements:
- Minimum 2 years of experience in data analysis.
- Degree from a top-tier educational institution.
- Proven track record working with leading analytics firms or high-growth startups.
- Strong proficiency in analytical tools and programming languages such as SQL, Python, and R.
- Excellent problem-solving skills with great attention to detail.
- Strong communication skills to effectively present data insights.
Posted 6 days ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
The Data Analyst role at our organization is a pivotal position that involves transforming intricate challenges into actionable insights by harnessing the power of data. As a Data Analyst, you will work collaboratively with diverse teams to analyze extensive datasets, construct predictive models, and devise data-centric solutions that elevate strategic decision-making and propel organizational advancement. We seek an inquisitive, imaginative individual who is enthusiastic about using statistical and machine learning methodologies to address real-world issues. This role demands a proactive approach, strong implementation capabilities, and the capacity to operate autonomously. In addition to executing the existing strategy, you will be expected to contribute to its evolution. We are in search of candidates who aspire to contribute to the broader data architecture community and emerge as thought leaders in this field. Embracing a diversity of perspectives is a core value for us, and we are dedicated to assembling a team that mirrors the diversity of our global community.

Your responsibilities will include collecting, processing, and analyzing substantial volumes of structured and unstructured data from various sources. Your insights will play a key role in guiding strategic decisions and enhancing technology operations and metrics, including improving resource utilization, enhancing system reliability, and reducing technical debt. Close collaboration with business stakeholders will be essential to grasp their objectives, formulate analytical questions, and translate requirements into data-driven solutions. Exploratory data analysis (EDA) to reveal patterns, trends, and actionable insights will be a crucial aspect of your work. Furthermore, you will visualize and communicate findings through engaging reports, dashboards, presentations, and data storytelling techniques tailored to both technical and non-technical audiences. Working closely with engineering teams, you will deploy predictive models into production environments, ensuring their reliability, scalability, and performance. Continuous monitoring, evaluation, and enhancement of model performance will be part of your ongoing responsibilities. Staying abreast of the latest research, tools, and best practices in data science, machine learning, and artificial intelligence is integral to this role, as is fostering a culture of experimentation, continuous learning, and innovation within the organization.

To be successful in this role, you should have at least 10 years of experience as a practitioner in data engineering or a related field, along with a Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, Data Science, Engineering, or a related quantitative field. Strong programming skills in languages like Python or R, familiarity with SQL, and knowledge of distributed computing frameworks such as Spark and Hadoop are advantageous. Proficiency in data visualization tools like Matplotlib, Power BI, and Tableau is required, as is a solid grasp of probability, statistics, hypothesis testing, and data modeling concepts. Experience with cloud platforms (e.g., AWS, GCP, Azure) for data storage, processing, and model deployment is beneficial. Excellent communication and collaboration skills are a must, including the ability to explain complex technical concepts to diverse audiences. Strong attention to detail, analytical thinking, and problem-solving abilities are also key attributes for this role.

Preferred qualifications include experience in large-scale data science projects or in industry domains like finance, healthcare, retail, or technology. Familiarity with MLOps practices, model versioning, monitoring tools, natural language processing (NLP), computer vision, or time-series analysis is advantageous, as are contributions to open-source projects or publications in relevant conferences and journals. Developing and maintaining data pipelines and ETL processes, and implementing machine learning algorithms and statistical models to analyze technology metrics, are additional preferred areas of experience. Hands-on experience with machine learning libraries and frameworks such as scikit-learn, TensorFlow, Keras, PyTorch, or XGBoost is desirable. Key competencies for this role include analytical expertise; technical proficiency in machine learning, statistical modeling, and data engineering principles; effective communication and collaboration; and a dedication to innovative problem-solving and experimentation with novel technologies.
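For illustration only (not part of the posting): one way to identify anomalies in technology metrics, as described above, is scikit-learn's IsolationForest. The synthetic metrics and the contamination rate below are assumptions.

```python
# Hedged sketch: unsupervised anomaly flagging on synthetic metrics.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Columns: cpu_utilization, error_rate (synthetic stand-ins for real metrics).
metrics = rng.normal(loc=[0.55, 0.02], scale=[0.10, 0.005], size=(1_000, 2))
metrics[::200] += [0.4, 0.05]          # inject a few obvious outliers

model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(metrics)    # -1 marks suspected anomalies
anomalies = metrics[labels == -1]
print(f"flagged {len(anomalies)} of {len(metrics)} observations")
```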
Posted 6 days ago
5.0 - 10.0 years
3 - 6 Lacs
Chennai, Tamil Nadu, India
On-site
RESPONSIBILITIES:
- Works on problems of moderate to large scope where analysis of situations or data requires a review of a variety of factors. Exercises judgment within defined procedures and practices to determine and lead appropriate action. Builds productive internal/external working relationships.
- Assists with the analysis of medium to large-size client requirements and the design of comprehensive technical solutions. Participates in the design and implementation of solutions.
- Gathers and analyzes requirements with business owners and organizes discussions and infrastructure requirement analysis to meet the defined business criteria.
- Assists with the design and development of the Zilliant Price Manager application and assists with necessary integrations.
- Assists in discovery meetings to learn what data will flow between systems and the best way to organize and store that data in the project applications.
- Participates in meetings with Project Managers to determine client needs; develops customized solutions within the technology platform; creates estimates, timelines, and development goals; and designs, codes, and implements the Zilliant Price Manager application.
- Works on a project team to build solutions on the Zilliant platform. Contributes to code design for the project and ensures that best practices are followed.
- Maintains continuous engagement throughout the project lifecycle, from initiation to closure, with a focus on follow-through and timely delivery of objectives.
- Proactively ensures the quality of the data being entered into the system and builds tests to ensure quality throughout the process.
- Documents the project data model, assumptions, and business rules.
- Contributes to Agile ceremonies (e.g., sprint planning, user story refinement, and retrospectives), attends daily standups, and writes test cases, as needed.
- Assists in Admin and train-the-trainer sessions, as needed.
- Assists the project team in developing the Zilliant Price Manager application according to scope, including necessary integrations.

MINIMUM AND/OR PREFERRED QUALIFICATIONS:

EDUCATION: Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field. Relevant coursework or certifications in programming or software development.

EXPERIENCE:
- 4+ years of Technical Consultant experience or experience in a related field.
- Previous experience working in a Data Engineering role or completed studies in Data Engineering.
- Experience implementing pricing management software and integrations, acting as administrator of a pricing management site, or equivalent experience with a different pricing or SaaS solution.
- Experience developing technical solutions leveraging version control and adherence to lifecycle management best practices for code promotion.
- Previous work experience with programming, coding, and/or software development, particularly with Python, preferred.
- 2 to 6 years of database development experience, preferably on SaaS implementation projects for external customers.
- Strong SQL development skills.
- Fluent in English.
- Understanding of pricing strategy and logic, pricing KPIs, product hierarchies, and pricing rules.
- Solid understanding of data integration methodologies, including ETL, ELT, and APIs, from a variety of sources.
- Solid understanding of data extraction, transformation, and loading.
- Strong understanding of data schemas and relational data modeling.
- Hands-on experience with REST APIs and web services technologies.
- AWS experience in a multi-tenant environment, including Redshift, Aurora (Postgres), S3, etc., preferred.
- Experience designing high-performance, high-availability, high-volume data pipelines in enterprise or SaaS environments, preferred.
- Experience with SAP, Dynamics, and/or Oracle CRM, preferred.
- Experience with MuleSoft, Fivetran, or related platforms, preferred.
- Experience designing and implementing pricing strategies, preferred.

SKILL REQUIREMENTS:
- Requirements Analysis: capability to gather, understand, and implement project requirements effectively.
- Teamwork: communication and collaboration skills for working with cross-functional teams and stakeholders.
- Adaptability: willingness to learn and adapt to new technologies and tools.
- Programming Languages: experience manipulating data in Python, preferred; experience writing SQL Server objects and stored procedures.
- Software Development: understanding of software development and coding principles.
- Version Control: familiarity with version control systems, such as Git, and collaborative coding workflows.
- Documentation: proficiency in documenting code and system architecture to facilitate maintenance and collaboration.
- Pricing Management: experience implementing pricing management software on small to large-size projects or acting as administrator for a pricing management site; understanding of pricing strategy logic; ability to define and design pricing KPIs.
- Integrations: working knowledge of systems integration in an enterprise; experience with REST APIs and other web service technologies to integrate with on-prem or SaaS solutions.
- Database Knowledge: experience with data extraction, transformation, and loading; understanding of data schemas and relational data modeling.
- Communication: clear and effective communication, both written and verbal, is essential for collaborating with team members and conveying findings to clients. Proficient in English.
Posted 6 days ago
5.0 - 10.0 years
3 - 6 Lacs
Delhi, India
On-site
RESPONSIBILITIES:
- Works on problems of moderate to large scope where analysis of situations or data requires a review of a variety of factors. Exercises judgment within defined procedures and practices to determine and lead appropriate action. Builds productive internal/external working relationships.
- Assists with the analysis of medium to large-size client requirements and the design of comprehensive technical solutions. Participates in the design and implementation of solutions.
- Gathers and analyzes requirements with business owners and organizes discussions and infrastructure requirement analysis to meet the defined business criteria.
- Assists with the design and development of the Zilliant Price Manager application and assists with necessary integrations.
- Assists in discovery meetings to learn what data will flow between systems and the best way to organize and store that data in the project applications.
- Participates in meetings with Project Managers to determine client needs; develops customized solutions within the technology platform; creates estimates, timelines, and development goals; and designs, codes, and implements the Zilliant Price Manager application.
- Works on a project team to build solutions on the Zilliant platform. Contributes to code design for the project and ensures that best practices are followed.
- Maintains continuous engagement throughout the project lifecycle, from initiation to closure, with a focus on follow-through and timely delivery of objectives.
- Proactively ensures the quality of the data being entered into the system and builds tests to ensure quality throughout the process.
- Documents the project data model, assumptions, and business rules.
- Contributes to Agile ceremonies (e.g., sprint planning, user story refinement, and retrospectives), attends daily standups, and writes test cases, as needed.
- Assists in Admin and train-the-trainer sessions, as needed.
- Assists the project team in developing the Zilliant Price Manager application according to scope, including necessary integrations.

MINIMUM AND/OR PREFERRED QUALIFICATIONS:

EDUCATION: Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field. Relevant coursework or certifications in programming or software development.

EXPERIENCE:
- 4+ years of Technical Consultant experience or experience in a related field.
- Previous experience working in a Data Engineering role or completed studies in Data Engineering.
- Experience implementing pricing management software and integrations, acting as administrator of a pricing management site, or equivalent experience with a different pricing or SaaS solution.
- Experience developing technical solutions leveraging version control and adherence to lifecycle management best practices for code promotion.
- Previous work experience with programming, coding, and/or software development, particularly with Python, preferred.
- 2 to 6 years of database development experience, preferably on SaaS implementation projects for external customers.
- Strong SQL development skills.
- Fluent in English.
- Understanding of pricing strategy and logic, pricing KPIs, product hierarchies, and pricing rules.
- Solid understanding of data integration methodologies, including ETL, ELT, and APIs, from a variety of sources.
- Solid understanding of data extraction, transformation, and loading.
- Strong understanding of data schemas and relational data modeling.
- Hands-on experience with REST APIs and web services technologies.
- AWS experience in a multi-tenant environment, including Redshift, Aurora (Postgres), S3, etc., preferred.
- Experience designing high-performance, high-availability, high-volume data pipelines in enterprise or SaaS environments, preferred.
- Experience with SAP, Dynamics, and/or Oracle CRM, preferred.
- Experience with MuleSoft, Fivetran, or related platforms, preferred.
- Experience designing and implementing pricing strategies, preferred.

SKILL REQUIREMENTS:
- Requirements Analysis: capability to gather, understand, and implement project requirements effectively.
- Teamwork: communication and collaboration skills for working with cross-functional teams and stakeholders.
- Adaptability: willingness to learn and adapt to new technologies and tools.
- Programming Languages: experience manipulating data in Python, preferred; experience writing SQL Server objects and stored procedures.
- Software Development: understanding of software development and coding principles.
- Version Control: familiarity with version control systems, such as Git, and collaborative coding workflows.
- Documentation: proficiency in documenting code and system architecture to facilitate maintenance and collaboration.
- Pricing Management: experience implementing pricing management software on small to large-size projects or acting as administrator for a pricing management site; understanding of pricing strategy logic; ability to define and design pricing KPIs.
- Integrations: working knowledge of systems integration in an enterprise; experience with REST APIs and other web service technologies to integrate with on-prem or SaaS solutions.
- Database Knowledge: experience with data extraction, transformation, and loading; understanding of data schemas and relational data modeling.
- Communication: clear and effective communication, both written and verbal, is essential for collaborating with team members and conveying findings to clients. Proficient in English.
Posted 6 days ago
5.0 - 10.0 years
3 - 6 Lacs
Kolkata, West Bengal, India
On-site
RESPONSIBILITIES:
- Works on problems of moderate to large scope where analysis of situations or data requires a review of a variety of factors. Exercises judgment within defined procedures and practices to determine and lead appropriate action. Builds productive internal/external working relationships.
- Assists with the analysis of medium to large-size client requirements and the design of comprehensive technical solutions. Participates in the design and implementation of solutions.
- Gathers and analyzes requirements with business owners and organizes discussions and infrastructure requirement analysis to meet the defined business criteria.
- Assists with the design and development of the Zilliant Price Manager application and assists with necessary integrations.
- Assists in discovery meetings to learn what data will flow between systems and the best way to organize and store that data in the project applications.
- Participates in meetings with Project Managers to determine client needs; develops customized solutions within the technology platform; creates estimates, timelines, and development goals; and designs, codes, and implements the Zilliant Price Manager application.
- Works on a project team to build solutions on the Zilliant platform. Contributes to code design for the project and ensures that best practices are followed.
- Maintains continuous engagement throughout the project lifecycle, from initiation to closure, with a focus on follow-through and timely delivery of objectives.
- Proactively ensures the quality of the data being entered into the system and builds tests to ensure quality throughout the process.
- Documents the project data model, assumptions, and business rules.
- Contributes to Agile ceremonies (e.g., sprint planning, user story refinement, and retrospectives), attends daily standups, and writes test cases, as needed.
- Assists in Admin and train-the-trainer sessions, as needed.
- Assists the project team in developing the Zilliant Price Manager application according to scope, including necessary integrations.

MINIMUM AND/OR PREFERRED QUALIFICATIONS:

EDUCATION: Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field. Relevant coursework or certifications in programming or software development.

EXPERIENCE:
- 4+ years of Technical Consultant experience or experience in a related field.
- Previous experience working in a Data Engineering role or completed studies in Data Engineering.
- Experience implementing pricing management software and integrations, acting as administrator of a pricing management site, or equivalent experience with a different pricing or SaaS solution.
- Experience developing technical solutions leveraging version control and adherence to lifecycle management best practices for code promotion.
- Previous work experience with programming, coding, and/or software development, particularly with Python, preferred.
- 2 to 6 years of database development experience, preferably on SaaS implementation projects for external customers.
- Strong SQL development skills.
- Fluent in English.
- Understanding of pricing strategy and logic, pricing KPIs, product hierarchies, and pricing rules.
- Solid understanding of data integration methodologies, including ETL, ELT, and APIs, from a variety of sources.
- Solid understanding of data extraction, transformation, and loading.
- Strong understanding of data schemas and relational data modeling.
- Hands-on experience with REST APIs and web services technologies.
- AWS experience in a multi-tenant environment, including Redshift, Aurora (Postgres), S3, etc., preferred.
- Experience designing high-performance, high-availability, high-volume data pipelines in enterprise or SaaS environments, preferred.
- Experience with SAP, Dynamics, and/or Oracle CRM, preferred.
- Experience with MuleSoft, Fivetran, or related platforms, preferred.
- Experience designing and implementing pricing strategies, preferred.

SKILL REQUIREMENTS:
- Requirements Analysis: capability to gather, understand, and implement project requirements effectively.
- Teamwork: communication and collaboration skills for working with cross-functional teams and stakeholders.
- Adaptability: willingness to learn and adapt to new technologies and tools.
- Programming Languages: experience manipulating data in Python, preferred; experience writing SQL Server objects and stored procedures.
- Software Development: understanding of software development and coding principles.
- Version Control: familiarity with version control systems, such as Git, and collaborative coding workflows.
- Documentation: proficiency in documenting code and system architecture to facilitate maintenance and collaboration.
- Pricing Management: experience implementing pricing management software on small to large-size projects or acting as administrator for a pricing management site; understanding of pricing strategy logic; ability to define and design pricing KPIs.
- Integrations: working knowledge of systems integration in an enterprise; experience with REST APIs and other web service technologies to integrate with on-prem or SaaS solutions.
- Database Knowledge: experience with data extraction, transformation, and loading; understanding of data schemas and relational data modeling.
- Communication: clear and effective communication, both written and verbal, is essential for collaborating with team members and conveying findings to clients. Proficient in English.
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Data Engineer at Barclays, you will play a crucial role in driving innovation and excellence by spearheading the evolution of the digital landscape. Your primary responsibility will be to harness cutting-edge technology to revolutionize digital offerings and ensure unparalleled customer experiences. You will be evaluated on key critical skills such as experience with modern data engineering frameworks, developing and maintaining cloud-native applications, and job-specific skill sets.

To excel in this role, you should possess the following qualifications and experience:
- A Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proven experience in data engineering, with a focus on ETL processes and data pipeline development.
- Strong expertise in Ab Initio and experience with modern data engineering tools and frameworks (e.g., Apache Spark, Kafka, AWS/GCP/Azure).
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with cloud-based data solutions and big data technologies.
- Knowledge of data protection and privacy-preserving technologies.
- Ability to ensure data quality, integrity, and security across all data pipelines.
- Experience in developing and validating a wide range of data pipelines, integrating CI/CD pipelines, and promoting best practices in code versioning, testing, and environment management.
- Experience defining technology strategy, mentoring teams, overseeing vendor collaborations, and managing cybersecurity and data governance risks.
- Strong problem-solving, communication, and leadership skills.
- Ability to collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver solutions.
- Ability to mentor and guide junior data engineers to foster a culture of continuous learning and improvement.
- Commitment to staying up to date with the latest industry trends and technologies to drive innovation within the team.

Desirable skill sets (good to have):
- Knowledge of data warehousing concepts and technologies.
- Prior experience in large-scale transformation programs.

This role will be based in Pune. In this role, your purpose will be to design, develop, and execute testing strategies to validate software functionality, performance, and user experience. You will collaborate with cross-functional teams to identify and resolve defects, continuously improve testing processes and methodologies, and ensure software quality and reliability.

Your key accountabilities will include:
- Developing and implementing comprehensive test plans and strategies.
- Creating and executing automated test scripts.
- Analysing requirements, participating in design discussions, and contributing to the development of acceptance criteria.
- Conducting root cause analysis for identified defects and working closely with developers to support defect resolution.
- Promoting a culture of code quality and knowledge sharing through code reviews and collaboration.
- Staying informed of industry technology trends and actively contributing to the organization's technology communities.

As a Vice President, you are expected to contribute to or set strategy, drive requirements, and make recommendations for change. You will manage resources, budgets, and policies, deliver continuous improvements, and demonstrate leadership and accountability in managing risks and strengthening controls related to your team's work. All colleagues at Barclays are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset of Empower, Challenge, and Drive.
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
As a Snowflake Developer at our organization, you will be a valuable member of our data engineering team. Your primary responsibility will be to design, develop, and maintain data pipelines and ETL workflows on Snowflake. You will leverage your expertise in SQL to write complex queries for data transformation and optimization, and you will play a key role in building and optimizing data models and data warehousing solutions to support business intelligence and analytics. Proficiency in Python is essential, as you will be developing scripts for data ingestion, transformation, and automation. Collaboration with business stakeholders, data analysts, and fellow engineers is crucial to understanding data requirements effectively. You will also implement best practices for data security, governance, and compliance, while troubleshooting and resolving performance bottlenecks in ETL processes and Snowflake queries. Working with cloud technologies such as AWS and GCP for storage, compute, and integration will be part of your daily tasks.

The ideal candidate has at least 6 years of IT experience with a strong background in SQL, ETL, and data warehousing. Hands-on experience with Snowflake architecture, query optimization, and performance tuning is a must, as is proficiency in Python for scripting and automation and experience with cloud platforms like AWS or GCP. A solid understanding of data modeling, star/snowflake schemas, and best practices is essential, along with excellent analytical, problem-solving, and communication skills.

Nice-to-have skills include experience in the US healthcare domain and familiarity with tools like Airflow, DBT, or Informatica for data orchestration; cloud certifications (AWS/GCP or Snowflake) will be considered an added advantage. In return, we offer the opportunity to work on cutting-edge data engineering projects in a collaborative and innovative work culture, with competitive compensation and growth opportunities.
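For illustration only (not part of the posting): troubleshooting performance bottlenecks in Snowflake queries usually starts with finding the slow ones. This hedged Python sketch queries Snowflake's INFORMATION_SCHEMA.QUERY_HISTORY table function; the connection is assumed to be created elsewhere.

```python
# Hedged sketch: list the ten slowest recent successful queries.
import snowflake.connector

SLOW_QUERIES_SQL = """
SELECT query_id, total_elapsed_time / 1000 AS seconds, query_text
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 500))
WHERE execution_status = 'SUCCESS'
ORDER BY total_elapsed_time DESC
LIMIT 10;
"""

def slowest_queries(conn: "snowflake.connector.SnowflakeConnection"):
    cur = conn.cursor()
    cur.execute(SLOW_QUERIES_SQL)
    return cur.fetchall()
```

The query IDs returned can then be inspected in Snowflake's query profile to choose between fixes such as clustering, warehouse sizing, or rewriting the SQL.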
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
We are looking for a skilled Consultant/Developer specializing in Informatica Cloud solutions to join our team. The ideal candidate has a minimum of 5 years' experience with IDMC, particularly in migrating from Informatica PowerCenter to cloud solutions. Your role will be crucial in our migration project, requiring hands-on expertise in ETL processes and BI-oriented database modeling.

Your responsibilities will include overseeing critical aspects of the ETL process and ensuring the smooth operation of data integration and application integration modules. You will monitor, investigate, and resolve ETL incidents during Paris night hours; fix and enhance evolutions following the automatic migration from PowerCenter to IDMC; participate in Scrum ceremonies; and analyze new requirements. Additionally, you will design and implement new ETL solutions using Informatica Cloud Data Integration, coordinate with the operations team to establish data transfer flows with partner organizations, and create and maintain technical documentation in Confluence.

Key Requirements:
- 5 to 7 years of experience in ETL development using Informatica IDMC and PowerCenter.
- Proven experience in migration projects from Informatica PowerCenter to IDMC.
- Strong problem-solving skills and the ability to work independently.
- Excellent communication and collaboration skills.
- Ability to actively participate in decision-making processes.

Nice to Have:
- Familiarity with Microsoft SQL Server and Sybase ASE.
- Experience with Jira and Unix.

The position is based onsite in Bengaluru and requires availability for early shifts (6:30 AM to 3:30 PM IST). The contract duration is 12 months, and immediate joiners are preferred.
Posted 6 days ago
10.0 - 14.0 years
0 Lacs
Maharashtra
On-site
The Firmwide Resiliency Office (FRO), part of the Office of the Chief Finance Officer, is responsible for designing the Firm's Resilience strategy. This includes resiliency planning, testing, exercising, reporting, and product and concern management. The team comprises technical product, data, and analytic roles that support business resiliency. FRO collaborates closely with senior leadership, Lines of Business, Functional Resiliency teams, and key functions such as Control Management, Risk Management & Compliance, and Audit to ensure the resiliency program aligns with the firm's risk-taking activities. Additionally, the team provides corporate governance, awareness, and training.

As a Data Management Vice President within FRO, you will play a crucial role in supporting data strategies for the Firm's Resilience Program. You will work closely with all areas of FRO and key stakeholders across Lines of Business (LOBs) and Corporate Functions (CFs). This role requires an execution-oriented approach, exceptional data analytical skills, and full engagement with the overall program. A key responsibility will be implementing initiatives to enhance and automate resiliency data management frameworks. You will design and implement data strategies, facilitate data sharing, and establish data governance and controls using advanced data wrangling and business intelligence tools. Your expertise in SQL, Python, and data transformation tools, along with your experience with AI/ML technologies, will be essential in driving these initiatives forward.

Your responsibilities will include:
- Design, develop, and maintain scalable data pipelines and ETL processes using Databricks and Python, and write complex SQL queries for data extraction, transformation, and loading.
- Develop and optimize data models to support analytical needs and improve query performance, and collaborate with data scientists and analysts to support advanced analytics and AI/ML initiatives.
- Automate data processes and reporting using scripting utilities like Python, R, etc.
- Perform in-depth data analysis to identify trends and anomalies, translating findings into reports and visualizations.
- Collaborate with stakeholders to understand data requirements and deliver data services promptly, and partner with data providers to design data-sharing solutions within a data mesh concept.
- Oversee data ingestion, storage, and analysis; create rules for data sharing; and maintain data quality and integrity through automated checks and testing.
- Monitor and analyze data systems to enhance performance, evaluate new technologies, and partner with technology teams and data providers to address data-related issues and maintain projects.
- Contribute to the design and implementation of data governance frameworks, and manage firm-wide resiliency data management frameworks, procedures, and training.
- Stay updated with emerging data technologies and best practices, and develop and maintain documentation for data engineering processes.
- Lead and manage a team of data professionals, providing guidance, mentorship, and performance evaluations to ensure successful project delivery and professional growth.

Required qualifications, skills, and capabilities:
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field, or equivalent experience.
- Expert in SQL for data manipulation, querying, and optimization, with advanced database experience.
- Proficient in scripting utilities like Python, R, etc., including data analysis libraries such as Pandas and NumPy.
- Proficient in data transformation tools like Alteryx and Tableau, and experienced in working with APIs.
- Direct experience with Databricks, Spark, and Delta Lake for data processing and analytics.
- Experience in data reconciliation and data lineage, and familiarity with data management and reference data concepts.
- Excellent analytical, problem-solving, and communication skills, with a collaborative and team-oriented mindset.
- Solid knowledge of software architecture principles, cloud-native design (e.g., AWS, Azure, GCP), containerization (Docker, Kubernetes), and CI/CD best practices.
- Self-starter with strong verbal, written, and listening skills, and excellent presentation abilities.
- Proven influencing skills and the ability to be effective in a matrix environment.
- Understanding of operational resilience or business continuity frameworks in regulated industries.

Preferred qualifications, skills, and capabilities:
- 10+ years of experience in data management roles such as Data Analyst or Data Engineer.
- Strong understanding of data warehousing concepts and principles.
- Skilled in handling large, complex datasets for advanced data analysis, data mining, and anomaly detection.
- Experience with AI/ML technologies and frameworks.
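For illustration only (not part of the posting): maintaining data quality through automated checks, as described above, can be as simple as a pandas gate that fails the pipeline before bad data flows downstream. Column names and tolerances here are hypothetical.

```python
# Hedged sketch: automated data-quality checks with pandas.
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    age_days = (pd.Timestamp.today() - pd.to_datetime(df["as_of_date"])).dt.days
    return {
        "row_count": len(df),
        "duplicate_keys": int(df["record_id"].duplicated().sum()),
        "null_critical_fields": int(
            df[["record_id", "as_of_date"]].isna().sum().sum()
        ),
        "stale_rows": int(age_days.gt(30).sum()),  # older than 30 days
    }

def assert_quality(df: pd.DataFrame) -> None:
    report = quality_report(df)
    # Fail loudly rather than letting bad records reach downstream reports.
    assert report["duplicate_keys"] == 0, report
    assert report["null_critical_fields"] == 0, report
```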
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
You are a highly skilled and motivated Data Engineer responsible for designing and developing data transformations and data models to ensure reliable and efficient data processing and analysis. You will work closely with cross-functional teams to support data-driven decision-making and contribute to the overall success of insights teams.

Your key proficiencies and responsibilities include expertise in DBT (Data Build Tool) for data transformation and modeling; proficiency in Snowflake; a strong understanding of data architecture, modeling, and data warehousing best practices; designing, developing, and maintaining robust data pipelines using DBT and Snowflake; implementing and optimizing data ingestion processes; collaborating with stakeholders to ensure data integrity and quality; performing data analysis and profiling; documenting data workflows, models, and ETL processes; and staying updated with the latest trends in data engineering, DBT, and Snowflake.

You should have proven experience as a Data Engineer with a focus on data ingestion and ETL processes, experience with ETL tools and technologies like Apache Airflow, Talend, or Informatica, proficiency in SQL and programming languages such as Python or Java, and familiarity with cloud platforms and services (e.g., AWS), including experience with AWS Lambda. You are expected to adhere to development best practices, conduct code reviews, participate in scrum methodology, and communicate effectively with team members and stakeholders. You should take ownership of assigned tasks and work independently to complete them.
Posted 6 days ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
As a Snowflake ETL Expert at Fiserv, you will play a crucial role in designing, developing, and maintaining ETL processes that support data integration and analytics on the Snowflake platform. You will collaborate closely with data architects, analysts, and other stakeholders to ensure data is processed and stored accurately and efficiently. You will design and develop ETL processes covering the extraction, transformation, and loading of data from diverse sources into Snowflake. Data integration will be a key aspect of the role, ensuring the quality and consistency of data integrated from multiple sources, and you will optimize ETL processes for efficiency and scalability. Beyond ETL, you will build and maintain data models that support analytical insights and reporting, work with data architects, analysts, and stakeholders to understand data requirements and deliver effective solutions, and document ETL processes, data models, and other pertinent information. To excel in this role, you should have at least 7 years of overall IT experience, including a minimum of 3 years of Snowflake data engineering experience. Proficiency in writing complex SQL queries, stored procedures, and analytical functions is required, along with Python scripting experience. A strong understanding of data warehousing, ETL concepts, and best practices is essential, and familiarity with ETL tools such as Informatica or Talend is advantageous. A solid grasp of database concepts, including entity-relationship modeling, data modeling, and DDL and DML statements, is expected. A Bachelor's degree is required, though equivalent relevant work experience may be considered. Excellent analytical and problem-solving skills are crucial, along with the communication skills to collaborate with stakeholders and document processes, and you should be able to work both independently and as part of a team. Additional qualifications such as Snowflake SnowPro certification, experience with code versioning tools like GitHub, proficiency in AWS/Azure cloud services, exposure to an Agile development environment, and domain knowledge in the cards, banking, or financial services industry would be highly beneficial.
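To illustrate the load-and-transform pattern this role describes, here is a minimal sketch using the Snowflake Python connector; the connection details, stage, and table names are hypothetical placeholders:

```python
import snowflake.connector

# A minimal ETL sketch against Snowflake; credentials and object names
# below are illustrative only.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load: bulk-copy staged files into a raw table.
    cur.execute("""
        COPY INTO staging.raw_transactions
        FROM @etl_stage/transactions/
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """)
    # Transform: upsert into the curated table with a MERGE.
    cur.execute("""
        MERGE INTO curated.transactions t
        USING staging.raw_transactions s
          ON t.txn_id = s.txn_id
        WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.status = s.status
        WHEN NOT MATCHED THEN INSERT (txn_id, amount, status)
          VALUES (s.txn_id, s.amount, s.status)
    """)
finally:
    conn.close()
```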
Posted 6 days ago
0.0 - 3.0 years
0 Lacs
karnataka
On-site
As a Business Intelligence Analyst and Data Engineer at Hitachi Industrial Equipment Systems India, you will play a crucial role in developing and maintaining dashboards, reports, and data pipelines that support business operations and decision-making. Your responsibilities will include collaborating with stakeholders, analyzing datasets, and ensuring data accuracy across various tools and platforms. On the business intelligence side, you will use tools such as Power BI, Tableau, or Looker to create impactful visualizations that translate complex data into actionable insights. Working closely with business stakeholders, you will gather requirements and improve reporting efficiency to drive informed decisions within the organization. On the data engineering side, you will build and maintain ETL/ELT pipelines using SQL, Python, ADF, and Snowflake. Your role will encompass cleaning, transforming, and loading data from multiple sources into platforms such as ADLS and Snowflake. Additionally, you will contribute to data modeling efforts and document data sources and definitions to ensure data integrity and reporting accuracy. To excel in this role, you should possess strong SQL skills, familiarity with BI tools, and scripting languages such as Python or R. A solid understanding of data warehousing concepts, ETL processes, and cloud data platforms like AWS, Azure, or GCP will be beneficial. Furthermore, knowledge of Agile methodologies, data privacy best practices, and a Bachelor's degree in a relevant field are essential qualifications for this position. Your success will be driven by strong communication skills, attention to detail, and a proactive attitude toward learning and taking initiative. As part of the team, you will demonstrate effective interpersonal skills, meet deadlines, and adapt to changing business conditions while upholding Hitachi's core values of Harmony, Trust, Respect, Sincerity, Fairness, Honesty, Integrity, and Pioneering Spirit.
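As a small illustration of the clean-transform-load step described above, here is a minimal pandas sketch; the file paths and column names are hypothetical:

```python
import pandas as pd

# A minimal sketch of a clean-transform-load step; paths and columns
# below are illustrative only.
raw = pd.read_csv("sales_extract.csv", parse_dates=["order_date"])

# Clean: drop exact duplicates and rows missing the business key.
clean = raw.drop_duplicates().dropna(subset=["order_id"])

# Transform: derive reporting fields used by downstream dashboards.
clean["order_month"] = clean["order_date"].dt.to_period("M").astype(str)
clean["net_revenue"] = clean["gross_revenue"] - clean["discount"]

# Load: persist to a columnar format suitable for staging in ADLS or
# ingestion into Snowflake (requires pyarrow or fastparquet).
clean.to_parquet("curated/sales_curated.parquet", index=False)
```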
Posted 6 days ago
7.0 - 11.0 years
0 Lacs
chennai, tamil nadu
On-site
As a part of the Comcast team, you will be responsible for analyzing and evaluating operational performance metrics and resources to ensure alignment with the company's operational plans and strategic goals. Your role will involve creating and facilitating reporting and analysis to evaluate operational initiatives and drive efficiencies. You will develop standard reporting measures, provide quantitative explanations of relevant data, and continuously analyze alternatives and solutions. Your recommendations will be based on operational reporting and analysis to enhance customer experience and drive operational efficiencies. Additionally, you will provide financial reporting, revenue analysis support, discount monitoring, promotional modeling, and subscriber reporting issue resolution to various departments within the company. Drawing upon your in-depth experience and knowledge in this discipline, you will determine work priorities, prioritize tasks for others, act as a resource for colleagues at all levels, and make directional decisions accurately. Your expertise will be crucial in optimizing business processes and selecting the right tools for delivering business-level projects.

Core Responsibilities:
- Building and managing automated procurement analytics solutions.
- Writing robust SQL queries and Python scripts for data extraction, transformation, and automation.
- Consolidating, analyzing, and interpreting procurement-related data such as financial systems, purchase orders, invoices, and supplier data.
- Supporting procurement strategy through market intelligence and benchmarking activities.
- Collaborating closely with operational and engineering teams to deliver technological advancements.
- Providing strategic support for quarterly analytics and procurement insight presentations.
- Contributing to ad-hoc analytics projects and supporting global cross-functional initiatives.

Requirements:
- Strong expertise in the Microsoft Tech stack, especially Azure, Copilot Studio, and Power Automate.
- Bachelor's or Master's degree in Business, Information Systems, Technology, Engineering, or equivalent.
- 8 to 11 years of relevant experience, preferably within telecom, technology, or media industries, including leading smaller projects.
- Proficiency in SQL, Python, and ETL processes.
- Previous experience working with procurement-related data like contracts, purchase orders, invoices, and supplier data.
- Exceptional quantitative and analytical problem-solving skills.
- Ability to work effectively within global and matrix organizational structures.
- Excellent verbal and written communication skills in English.
- Availability to participate in meetings aligned with Eastern Standard Time (USA).

Please note that the provided information details the general nature of the work expected from employees in this role and is not an exhaustive list of all duties, responsibilities, and qualifications.

Education: Bachelor's Degree. While a Bachelor's Degree is preferred, Comcast may also consider applicants with a combination of coursework and experience or extensive related professional experience.

Relevant Work Experience: 7-10 Years
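As one plausible shape for the SQL-plus-Python automation named in the responsibilities, here is a minimal sketch; the connection string, tables, columns, and the 5% variance threshold are hypothetical assumptions:

```python
import pandas as pd
from sqlalchemy import create_engine

# A minimal procurement-analytics sketch; the DSN and schema below are
# illustrative placeholders, not details from the posting.
engine = create_engine("mssql+pyodbc://user:pass@procurement-dsn")

# Extract: pull purchase orders joined to their invoices.
query = """
    SELECT po.po_id, po.supplier_id, po.amount AS po_amount,
           inv.amount AS invoice_amount
    FROM purchase_orders po
    LEFT JOIN invoices inv ON inv.po_id = po.po_id
    WHERE po.created_at >= '2024-01-01'
"""
df = pd.read_sql(query, engine)

# Transform: flag invoice/PO mismatches beyond an assumed 5% tolerance.
df["variance"] = df["invoice_amount"] - df["po_amount"]
exceptions = df[df["variance"].abs() > 0.05 * df["po_amount"]]

# Report: consolidated supplier-level view for quarterly insights.
summary = df.groupby("supplier_id")[["po_amount", "invoice_amount"]].sum()
summary.to_csv("procurement_quarterly_summary.csv")
print(f"{len(exceptions)} PO/invoice mismatches flagged for review")
```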
Posted 6 days ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
You will draw on your 2-4 years of experience in Azure Data Factory, Power BI, and related technologies. Your expertise in SQL and data modeling will be crucial for this role, and your experience with ETL processes, data warehousing, and business intelligence will enable you to contribute effectively. Strong communication skills are essential as you collaborate with team members and stakeholders, and familiarity with cloud platforms such as Azure, along with knowledge of data integration best practices, will help you excel in this position.
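As a rough illustration of programmatic work with Azure Data Factory, here is a minimal sketch using the Azure Python SDK; the subscription, resource group, factory, pipeline name, and parameter are all hypothetical, and exact SDK signatures may vary by version:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# A minimal sketch of triggering and monitoring an ADF pipeline run;
# every name below is an illustrative placeholder.
credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

run = client.pipelines.create_run(
    resource_group_name="analytics-rg",
    factory_name="adf-reporting",
    pipeline_name="load_sales_mart",
    parameters={"load_date": "2024-06-01"},
)

# Check the run status before refreshing downstream Power BI datasets.
status = client.pipeline_runs.get(
    "analytics-rg", "adf-reporting", run.run_id
).status
print(f"Pipeline run {run.run_id}: {status}")
```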
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
maharashtra
On-site
You are a skilled SQL Developer with working knowledge of database administration, responsible for developing and optimizing SQL-based solutions while ensuring the stability, performance, and security of enterprise database systems. Your role involves writing complex queries and supporting essential database operations in both on-premises and cloud environments. The CRM India team offers a fast-paced and dynamic work environment where you will collaborate with engineers to provide software services to patients with cardiac conditions and the physicians treating them. Working closely with teams in Architecture, Development, Product, Infrastructure, and DevOps, you will support ongoing and prospective product features. The team is dedicated to building products that enhance the quality of life for patients, fostering a culture of learning, collaboration, and growth. Your key responsibilities include developing, optimizing, and maintaining complex SQL queries, stored procedures, views, and functions. You will design and maintain secure, scalable, and efficient data structures; analyze and validate database designs; support ETL processes and data integration; monitor and tune database performance; and assist in backup and recovery strategies. Additionally, you will participate in database deployments and upgrades, troubleshoot and resolve database-related issues, enforce data access controls, and collaborate with various teams to ensure efficient data access and system stability. You will also contribute to the development of database standards, documentation, and best practices, working in Oracle, PostgreSQL, and SQL Server environments. Preferred qualifications for this role include a Bachelor's degree in Computer Engineering, Computer Science, or a related field, along with 5-7 years of experience in SQL development with exposure to database operations. Proficiency in Oracle, PostgreSQL, and SQL Server database services is required, as is a strong command of SQL, indexing, query optimization, and performance tuning. You should also bring experience with database monitoring, diagnostics, and capacity planning; knowledge of database design, maintenance, and storage in Azure; an understanding of the SDLC and DevOps practices; strong analytical and communication skills; and the ability to manage multiple priorities in a cross-functional team environment.
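As a concrete taste of the tuning loop described above, here is a minimal sketch using PostgreSQL (one of the engines named in the posting) via psycopg2; the DSN, table, and index names are hypothetical:

```python
import psycopg2

# A minimal query-tuning sketch; connection details and schema below
# are illustrative placeholders.
conn = psycopg2.connect("dbname=crm user=dev password=dev host=localhost")
conn.autocommit = True
cur = conn.cursor()

# Inspect the plan for a slow query before changing anything.
# Note: EXPLAIN ANALYZE actually executes the statement.
cur.execute("""
    EXPLAIN ANALYZE
    SELECT patient_id, reading_ts, value
    FROM telemetry_readings
    WHERE patient_id = %s AND reading_ts >= now() - interval '30 days'
""", (12345,))
for row in cur.fetchall():
    print(row[0])

# If the plan shows a sequential scan, a composite index on the filter
# columns is a common remedy.
cur.execute("""
    CREATE INDEX IF NOT EXISTS idx_readings_patient_ts
    ON telemetry_readings (patient_id, reading_ts)
""")
conn.close()
```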
Posted 6 days ago
4.0 - 8.0 years
0 Lacs
uttar pradesh
On-site
We are looking for an experienced Power BI Engineer to join our team and contribute to designing, developing, and maintaining business intelligence solutions. In this role, you will be responsible for gathering requirements, transforming raw data into meaningful insights, building interactive dashboards, and ensuring data accuracy and performance optimization.

Your responsibilities will include working closely with business stakeholders to gather reporting requirements and key performance indicators (KPIs). You will design, develop, and deploy interactive Power BI dashboards and reports, as well as build data models using Power Query, DAX, and relational databases. It will be essential to optimize the performance of reports and datasets for large-scale usage and integrate Power BI with various data sources such as SQL Server, Excel, APIs, and cloud services. Additionally, you will implement and manage row-level security (RLS) and data governance policies, collaborate with data engineers and business teams to ensure data accuracy, quality, and security, provide end-user training and support for Power BI adoption, and stay updated with the latest Power BI and Microsoft Data Platform features.

To succeed in this role, you should have a Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field, along with at least 4 years of experience working with Power BI. You must possess strong expertise in DAX, Power Query (M language), and data modeling, as well as hands-on experience with SQL and relational databases. Proficiency in creating measures, KPIs, and drill-through dashboards, strong knowledge of ETL processes and data warehouse concepts, and the ability to analyze and interpret large datasets to provide business insights are crucial. Excellent communication and stakeholder management skills are also required.

Additionally, the following qualifications would be preferred or considered advantageous:
- Experience with Azure Data services (Azure SQL, Synapse, Data Factory, Dataverse).
- Knowledge of Power Platform (Power Apps, Power Automate) for extended use cases.
- Experience in row-level security, data security, and governance frameworks.
- Microsoft Certification: PL-300 (Power BI Data Analyst Associate).
- Familiarity with Python or R for advanced analytics.

If you meet the requirements and are passionate about leveraging Power BI to drive business intelligence initiatives, we encourage you to apply and be a part of our dynamic team.
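To illustrate where Python fits into such a workflow, here is a minimal sketch of a data-prep step as it might appear in Power BI's Power Query "Run Python script" connector, which injects the current table as a pandas DataFrame named `dataset`; the columns and the standalone stand-in frame below are hypothetical:

```python
import pandas as pd

# Standalone stand-in for the `dataset` frame Power Query would inject;
# the columns are illustrative only.
dataset = pd.DataFrame({
    "region": ["North", "North", "South"],
    "revenue": [120.0, 80.0, 150.0],
    "target": [100.0, 100.0, 140.0],
})

# Derive a KPI column before the data reaches the DAX model.
dataset["attainment_pct"] = 100 * dataset["revenue"] / dataset["target"]

# Power Query surfaces any DataFrame left in scope as an output table.
result = dataset.groupby("region", as_index=False)[["revenue", "target"]].sum()
print(result)
```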
Posted 6 days ago