7.0 - 12.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: SAP Materials Management
Position: SSE
Experience: 7+ years
Category: Software Development/Engineering
Main location: Hyderabad
Position ID: J0425-1808
Employment Type: Full Time
Job Description: SAP MM (SAP Materials Management). Lead and manage a team of SAP specialists focused on material, structure, and data operations. Oversee the creation and maintenance of material master data and structures in SAP. Ensure data integrity and accuracy across all SAP material and structural configurations. Monitor and analyze data quality metrics, identifying and addressing any issues. Provide technical guidance and support to the team on SAP material management and data operations. Develop and maintain documentation for material, structure, and data operations processes and procedures. Behavioural Competencies: Proven experience of delivering process efficiencies and improvements. Clear and fluent English (both verbal and written). Ability to build and maintain efficient working relationships with remote teams. Demonstrated ability to take ownership of and accountability for relevant products and services. Ability to plan, prioritise and complete your own work, whilst remaining a team player. Willingness to engage with and work in other technologies. Note: This job description is a general outline of the responsibilities and qualifications typically associated with the SAP Materials Management role. Actual duties and qualifications may vary based on the specific needs of the organization. Skills: Analytical Thinking.
Posted 3 days ago
5.0 - 10.0 years
25 - 30 Lacs
Hyderabad
Work from Office
5+ years of professional experience in Data Quality. 5+ years of experience in Trillium. Must have professional experience in Python coding. Must be proficient in SQL. Trillium migration experience is desired. Skills: Data Engineering Data Quality Management Python SQL
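For illustration, the kind of data-quality profiling this role centres on can be sketched in plain Python plus SQL; the table, fields and thresholds below are hypothetical, not tied to Trillium or any specific employer stack:

```python
import sqlite3

def profile_rows(rows, key_field, required_fields):
    """Return simple data-quality metrics: completeness per field and key duplicates."""
    total = len(rows)
    completeness = {
        f: sum(1 for r in rows if r.get(f) not in (None, "")) / total
        for f in required_fields
    }
    keys = [r.get(key_field) for r in rows]
    duplicate_keys = total - len(set(keys))
    return {"completeness": completeness, "duplicate_keys": duplicate_keys}

rows = [
    {"id": 1, "name": "Acme", "country": "IN"},
    {"id": 2, "name": "", "country": "US"},
    {"id": 2, "name": "Globex", "country": None},
]
metrics = profile_rows(rows, "id", ["name", "country"])
print(metrics["duplicate_keys"])  # id 2 appears twice -> 1

# The same uniqueness rule expressed as a SQL validation query
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE vendors (id INTEGER, name TEXT)")
con.executemany("INSERT INTO vendors VALUES (?, ?)",
                [(1, "Acme"), (2, ""), (2, "Globex")])
dupes = con.execute(
    "SELECT id, COUNT(*) FROM vendors GROUP BY id HAVING COUNT(*) > 1"
).fetchall()
print(dupes)  # [(2, 2)]
```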
Posted 3 days ago
2.0 - 4.0 years
8 - 12 Lacs
Bengaluru
Work from Office
OPENTEXT - THE INFORMATION COMPANY OpenText is a global leader in information management, where innovation, creativity, and collaboration are the key components of our corporate culture. As a member of our team, you will have the opportunity to partner with the most highly regarded companies in the world, tackle complex issues, and contribute to projects that shape the future of digital transformation. AI-First. Future-Driven. Human-Centered. At OpenText, AI is at the heart of everything we do: powering innovation, transforming work, and empowering digital knowledge workers. We're hiring talent that AI can't replace to help us shape the future of information management. Join us. Your Impact As a Python Developer in the Debricked data science team, you will work on enhancing data intake processes and optimizing data pipelines. You will apply many different approaches, depending on the needs of the product and the challenges you encounter. In some cases, we use AI/LLM techniques, and we expect the number of such cases to increase. Your contributions will directly impact Debricked's scope and quality and will help ensure future commercial growth of the product. What the role offers As a Python Developer, you will: Innovative Data Solutions: Develop and optimize data pipelines that improve the efficiency, accuracy, and automation of the Debricked SCA tool's data intake processes. Collaborative Environment: Work closely with engineers and product managers from Sweden and India to create impactful, data-driven solutions. Continuous Improvement: Play an essential role in maintaining and improving the data quality that powers Debricked's analysis, improving the product's competitiveness. Skill Development: Collaborate across teams and leverage OpenText's resources (including an educational budget) to develop your expertise in software engineering, data science and AI, expanding your skill set in both traditional and cutting-edge technologies.
What you need to Succeed: 2-4 years of experience in Python development, with a focus on optimizing data processes and improving data quality. Proficiency in Python and related tools and libraries like Jupyter, Pandas and NumPy. A degree in Computer Science or a related discipline. An interest in application security. Skills in Go, Java, LLMs (specifically Gemini), GCP, Kubernetes, MySQL, Elastic and Neo4j are an asset. A strong understanding of how to manage and improve data quality in automated systems and pipelines. Ability to address complex data challenges and develop solutions to optimize systems. Comfortable working in a distributed team, collaborating across different time zones. One last thing: OpenText is more than just a corporation, it's a global community where trust is foundational, the bar is raised, and outcomes are owned. Join us on our mission to drive positive change through privacy, technology, and collaboration. At OpenText, we don't just have a culture; we have character. Choose us because you want to be part of a company that embraces innovation and empowers its employees to make a difference. OpenText's efforts to build an inclusive work environment go beyond simply complying with applicable laws. Our Employment Equity and Diversity Policy provides direction on maintaining a working environment that is inclusive of everyone, regardless of culture, national origin, race, color, gender, gender identification, sexual orientation, family status, age, veteran status, disability, religion, or other basis protected by applicable laws. Our proactive approach fosters collaboration, innovation, and personal growth, enriching OpenText's vibrant workplace.
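As a rough illustration of the data-intake cleanup described above, here is a minimal stdlib-only sketch (in practice Pandas/NumPy would do the heavy lifting; the field names and normalization rules are invented for the example, not Debricked's actual schema):

```python
def normalize_record(raw):
    """Normalize one intake record into a canonical (name, version) form.
    Field names here are illustrative, not any real product's schema."""
    name = raw.get("name", "").strip().lower()
    version = raw.get("version", "").strip().lstrip("v")  # "v1.2.0" -> "1.2.0"
    if not name or not version:
        return None  # drop incomplete records rather than poison the pipeline
    return {"name": name, "version": version}

def run_intake(records):
    """Run the normalization step and report how many records were dropped."""
    cleaned = [r for r in (normalize_record(x) for x in records) if r is not None]
    return cleaned, len(records) - len(cleaned)

raw = [
    {"name": " Requests ", "version": "v2.31.0"},
    {"name": "flask", "version": ""},        # incomplete: dropped
    {"name": "numpy", "version": "1.26.4"},
]
cleaned, dropped = run_intake(raw)
print(dropped)  # 1
```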
Posted 3 days ago
0.0 - 1.0 years
0 Lacs
Tiruchirapalli
Work from Office
About the Company: Who we are: We are the manufacturing brains behind successful companies. What we do: Frigate is an on-demand cloud manufacturing startup that helps OEMs/ODMs and product/device companies identify the right manufacturing vendors and leverage their existing capacities to get their products manufactured. Responsibilities: Collect data from primary and secondary sources. Maintain and update databases and data systems. Clean, validate, and correct data to ensure accuracy and completeness. Analyze large data sets to identify trends, patterns, and insights. Use statistical techniques to interpret data and generate reports. Create data visualizations, dashboards, and presentations for stakeholders. Collaborate with business teams to understand data needs and requirements. Provide actionable insights to support decision-making processes. Automate data collection and reporting processes where possible. Use tools such as SQL, Excel, Python, R, Tableau, Power BI, etc., to improve data quality and support decision-making. Perform additional tasks as required.
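The trend-analysis duties above can be illustrated with a small SQLite example; the table and columns are made up for the sketch:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_date TEXT, vendor TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        ("2024-01-05", "A", 120.0),
        ("2024-01-20", "B", 80.0),
        ("2024-02-02", "A", 200.0),
        ("2024-02-15", "A", 50.0),
    ],
)
# Monthly totals: strftime('%Y-%m', ...) buckets rows by month
trend = con.execute(
    """SELECT strftime('%Y-%m', order_date) AS month, SUM(amount)
       FROM orders GROUP BY month ORDER BY month"""
).fetchall()
print(trend)  # [('2024-01', 200.0), ('2024-02', 250.0)]
```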
Posted 3 days ago
4.0 - 8.0 years
25 - 30 Lacs
Kolkata
Work from Office
Join our Team About this opportunity We are now looking for a Senior Data Scientist to be responsible for developing AI/ML methods, processes, and systems to extract knowledge or insights to drive the future of artificial intelligence. Perform data science tasks and advanced statistical analysis, and create insights from data that give the business actionable insights, identify trends, and measure performance against business problems. Collaborate with business and process owners to understand business issues, and with engineers to implement and deploy scalable solutions, where applicable. What you will do: Apply and/or develop statistical modeling techniques (such as deep neural networks, Bayesian models, Generative AI, Forecasting), optimization methods and other ML techniques. Synthesize problems into data questions. Convert data into practical insights. Analyze and investigate data quality for identified data and communicate it to the Product Owner, Business Analyst, and other relevant stakeholders. Collect data, explore it, and perform analysis to extract information suitable to the business need. Identify gaps in the data and aggregate data as per business need. Design and perform Data Analysis, Data Validation, Data Transformation and Feature Extraction. Decide the approach for addressing business needs with data and analytics. Understand end-user needs and work accordingly, identifying new features in the data. Develop Data Science and Engineering infrastructure and tools. Derive key metrics suitable for the use case and present the analysis to key stakeholders. You will bring: 4-8 years of relevant industry experience. A Bachelor's or higher degree in Computer Science, Statistics, Mathematics, or related disciplines. Ability to analyse data and communicate outcomes to key stakeholders while exploring new data sources. Excellent coding skills in Python, R, SQL, etc. Understanding of cloud services. Evidence of academic training in Statistics.
Deep/broad knowledge of machine learning, statistics, optimization, or a related field. A genuine curiosity about new and applied technology and software engineering, coupled with a high degree of business understanding. Experience in large-scale product development is a plus. Experience in Generative AI and Large Language Models. Primary country and city: India (IN) || Kolkata Req ID: 768998
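One tiny illustration of the kind of data-quality investigation listed above: a z-score pass over a metric series to surface outliers (the threshold and data are arbitrary, chosen only for the sketch):

```python
import statistics

def flag_anomalies(values, z_threshold=2.0):
    """Flag points whose z-score exceeds the threshold - a first-pass
    data-quality probe, not a production anomaly detector."""
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / sd > z_threshold]

daily_kpi = [10.1, 9.8, 10.3, 9.9, 10.0, 25.0]  # one obvious outlier
print(flag_anomalies(daily_kpi))  # [25.0]
```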
Posted 3 days ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Join our Team About this opportunity We are now looking for a Senior Data Scientist to be responsible for developing AI/ML methods, processes, and systems to extract knowledge or insights to drive the future of artificial intelligence. Perform data science tasks and advanced statistical analysis, and create insights from data that give the business actionable insights, identify trends, and measure performance against business problems. Collaborate with business and process owners to understand business issues, and with engineers to implement and deploy scalable solutions, where applicable. What you will do: Apply and/or develop statistical modeling techniques (such as deep neural networks, Bayesian models, Generative AI, Forecasting), optimization methods and other ML techniques. Synthesize problems into data questions. Convert data into practical insights. Analyze and investigate data quality for identified data and communicate it to the Product Owner, Business Analyst, and other relevant stakeholders. Collect data, explore it, and perform analysis to extract information suitable to the business need. Identify gaps in the data and aggregate data as per business need. Design and perform Data Analysis, Data Validation, Data Transformation and Feature Extraction. Decide the approach for addressing business needs with data and analytics. Understand end-user needs and work accordingly, identifying new features in the data. Develop Data Science and Engineering infrastructure and tools. Derive key metrics suitable for the use case and present the analysis to key stakeholders. You will bring: 5-10 years of relevant industry experience. A Bachelor's or higher degree in Computer Science, Statistics, Mathematics, or related disciplines. Ability to analyse data and communicate outcomes to key stakeholders while exploring new data sources. Excellent coding skills in Python, R, SQL, etc. Understanding of cloud services. Evidence of academic training in Statistics.
Deep/broad knowledge of machine learning, statistics, optimization, or a related field. A genuine curiosity about new and applied technology and software engineering, coupled with a high degree of business understanding. Experience in large-scale product development is a plus. Experience in Generative AI and Large Language Models.
Posted 3 days ago
5.0 - 10.0 years
3 - 4 Lacs
Bengaluru
Work from Office
About Us Capco, a Wipro company, is a global technology and management consulting firm. Named Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence in 32 cities across the globe, we support 100+ clients across the banking, financial services and energy sectors. We are recognized for our deep transformation execution and delivery. WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry, on projects that will transform the financial services industry. MAKE AN IMPACT: Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services. #BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity. CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands. DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage. Job Title: AWS QA Engineer. Location: Bangalore/Pune. Experience: 5-10 years. We are seeking a Senior QA Data Engineer with expertise in Python, PySpark, SQL and data testing automation. Experience with AWS Cloud, Jenkins, AWS Glue, Lambda and S3 is good to have. The ideal candidate will ensure the quality and reliability of data pipelines through robust testing and collaboration with DevOps teams on CI/CD pipelines. Key Responsibilities: Develop and maintain automated test cases for ETL pipelines and data workflows using Python/PySpark. Proficient in SQL and NoSQL databases for data validation and analysis. Validate Jenkins CI/CD pipelines for automated testing and deployment.
Willingness to scale up on testing AWS services, including Glue, Lambda, RDS, S3, and Iceberg, ensuring seamless integration and scalability. Collaborate closely with DevOps to enhance deployment processes and pipeline efficiency. Design data validation frameworks and monitor pipeline performance for data quality assurance. Validate NoSQL and relational database integrations in data workflows. Document test strategies, results, and best practices for cross-team alignment. Preferred: AWS certifications (e.g., AWS Developer Associate). Experience with tools like Airflow or Spark. If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
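A minimal sketch of the automated ETL test cases the posting describes, using plain asserts in the style pytest would collect; the transformation and its rules are invented purely for illustration:

```python
def transform(rows):
    """Toy ETL step: drop rows with a null amount, uppercase the currency code."""
    return [
        {**r, "currency": r["currency"].upper()}
        for r in rows
        if r.get("amount") is not None
    ]

def test_drops_null_amounts():
    out = transform([{"amount": None, "currency": "usd"},
                     {"amount": 5, "currency": "eur"}])
    assert len(out) == 1

def test_normalizes_currency():
    out = transform([{"amount": 5, "currency": "eur"}])
    assert out[0]["currency"] == "EUR"

# pytest would collect these automatically; run directly here for illustration
test_drops_null_amounts()
test_normalizes_currency()
print("all checks passed")
```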
Posted 3 days ago
4.0 - 7.0 years
8 - 12 Lacs
Kolkata
Work from Office
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Summary & Responsibilities: Development experience on OAS, OAC (DVCS and BICS); OBIA or FAW knowledge will be an added advantage. Experience in lift & shift of OBIEE to OAC. Should have excellent debugging and troubleshooting skills. Should have experience in Metadata management (RPD) and Analytics. Should have good knowledge of OAC/OBIEE security. Experience in customization and configuration of OBIA (preferably with Fusion SaaS Cloud), OBIEE, Dashboards, Administration. Experience in interacting with business users to analyze business processes and gather requirements. Experience in sourcing data from Oracle EBS. Experience in basic admin activities of OAC and OAS in Unix and Windows environments, such as server restarts. Experience in configuration, troubleshooting and tuning of OAC reports. Mandatory skill sets: Metadata management (RPD), design and OBIEE Admin experience including deployment of RPD, Catalog Manager & Security. Experience as an OBIEE Dashboard and Reports designer and developer. Experience in basic admin activities in Unix and Windows environments, such as server restarts.
Experience in configuration, troubleshooting and tuning of OBIEE 12c. Preferred skill sets: OAC + OBIEE. Education qualification: B.Tech/MBA/MCA. Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology. Required Skills: Oracle Database. Additional skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis (+16 more)
Posted 3 days ago
4.0 - 7.0 years
6 - 10 Lacs
Kolkata
Work from Office
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Experience of 4 to 7 years with adequate knowledge of Scala's object-oriented programming. Scala code written in the backend is the basis of the finance module reports, which are accessed via QuickSight. The role involves assessing the Scala code written for Finance module reports, identifying issues and fixing them. Mandatory skill sets: Scala and OOP. Preferred skill sets: Scala and OOP. Education qualification: B.Tech/MBA/MCA. Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration. Required Skills: Scala (Programming Language). Additional skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis (+16 more)
Posted 3 days ago
3.0 - 6.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us. Responsibilities will include, but are not limited to: Study Startup: Draft the EDC build timeline in collaboration with the Data Management Lead. Perform DB build tasks by creating specifications for the database and edit checks. Create test scripts and complete test data entry/UAT for Coding, Site Payment and Safety Gateway. Collaborate with the Data Management Lead and facilitate startup meetings, including but not limited to EDC build kick-off, Interactive eCRF Build and IRMs (Interactive Review Meetings) for the database and edit checks. Create and finalize study documents such as the Data Quality Management Plan, eCRF Completion Instructions and Protocol Data Review Plan (PDRP) after study team review. Ensure all startup documents are completed per SOP and filed in eTMF per the eTMF master plan. Study Conduct: Plan/execute Post Production/Migration for the study (if any). Coordinate with Clinical Data Managers for the execution of data review tasks. Coordinate with external data vendors for any escalations related to vendor data.
Support Clean Patient Group delivery along with Clinical Data Management staff. Update study documents as needed during the conduct of the study. Support the DML to conduct Data Quality Review meetings. Provide Data Health Metrics to the Data Management Lead as requested. Study Closeout: Support the Data Management Lead in planning and execution of database lock activities. Perform post-lock activities, as needed. Project Management: Support the DML in project management tasks to ensure that the study is delivered successfully, with quality, per the study timelines. Documentation: File appropriate documents in eTMF per the eTMF master plan. Training and Mentorship: Provide training and mentoring to junior CDM staff. Bachelor's Degree required; Life Sciences, Pharmacy or relevant fields preferred. 6 years of experience in managing end-to-end Clinical Data Management tasks. Able to work on end-to-end Clinical Data Management tasks. Able to work collaboratively on multi-disciplinary project teams. Strong knowledge of the Clinical Drug Development Process, FDA/ICH guidelines and industry standard practices regarding data management. Strong knowledge and experience of EDC systems (Medidata RAVE preferred); demonstrated knowledge of Microsoft Office. Strong oral and written communication skills. Strong project management skills. Travel: Yes, 5-10% (Industry Conferences, Investigator Meetings, Regulatory Inspections, as needed). If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career. With a single vision as inspiring as "Transforming patients' lives through science", every BMS employee plays an integral role in work that goes far beyond ordinary.
Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues. BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role: Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility. For these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function. BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement. BMS cares about your well-being and the well-being of our staff, customers, patients, and communities.
As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters. BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/ Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.
Posted 3 days ago
5.0 - 8.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Do you have a passion for assessing software compliance and guiding stakeholders to make informed sourcing decisions? Interested in implementing a Software Asset Management tool to optimize software spend and manage risks with a comprehensive view of entitlements and usage? Are you driven to work in a complex, global environment where ideas are valued, and efforts are appreciated? We're looking for a Software License Manager to: establish and maintain Effective License Positions (ELP) for strategic vendor software products; process (read and interpret) software contracts and other commercial documents (purchase orders, invoices, quotes) to validate and ensure accurate ELPs and correct interpretation of license terms; review major software vendor product ELPs with key business partners to ensure license compliance and optimal use of software; report and escalate identified risk or potential underutilization; support the software contract renewal process or software audits with complete and accurate information and commentary; support, maintain, and improve UBS/Credit Suisse Software License Management tools, driving automation, validating and improving data quality of source inventory systems, and adopting new technologies (e.g. Cloud, Containerization, new license models). The Global Software License Management team currently consists of 21 team members: 6 located in Poland, 3 in Switzerland and 12 in India. The team is a combination of licensing experts with multiple years of experience and individuals who started their software licensing career a few months ago. You'll be part of the GCTO GSAM team at our office in Hyderabad. Our team is responsible for reviewing and assessing the Bank's software assets, maintaining compliance with software license and maintenance contracts, and onboarding commercial documents in our SAM tool to maintain the Bank's software inventory.
We also support sourcing teams with input for contract negotiations by providing current license positions and input on license-specific terms and conditions. You bring: in-depth knowledge of the SAM market, SAM operations, and competencies, with the ability to advise on software licensing topics and audits and produce Effective License Positions (ELPs) for software publishers; minimum 10+ years of experience in Software Asset Management or License Management in a global organization (CSAM or similar certification is a plus); practical knowledge and Software License Management experience of the product portfolio and licensing of at least one of the key vendors, i.e. Microsoft, IBM (PVU/RVU metrics, including ILMT bundling), Oracle database licensing, Broadcom, Cloudera, Red Hat, BMC, CA Technologies, SAP; good knowledge of Flexera FNMS and/or ServiceNow SAM Pro will be an added advantage; general understanding of IT software systems, client and server virtualization technologies, Cloud/SaaS/PaaS solutions, infrastructure, and software procurement processes. A results-oriented individual with a high work ethic, accountability, and excellent problem-solving skills, who also possesses strong organizational and communication abilities to interact with managers, staff, and senior stakeholders. Dedicated to fostering an inclusive culture and valuing diverse perspectives. Bachelor's degree in Computer Science, Information Systems, Business Administration or other related field, or equivalent.
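The core arithmetic of an Effective License Position can be sketched in a few lines; note this deliberately ignores the real-world complications (PVU/RVU weighting, bundling, virtualization rights) that dominate actual ELP work, and the product names are made up:

```python
def effective_license_position(entitlements, deployments):
    """Compare owned license counts against installed counts per product.
    A negative gap means a compliance shortfall; positive means shelfware."""
    products = set(entitlements) | set(deployments)
    return {
        p: entitlements.get(p, 0) - deployments.get(p, 0)
        for p in sorted(products)
    }

owned = {"DatabaseX": 100, "MiddlewareY": 40}
installed = {"DatabaseX": 120, "MiddlewareY": 25, "ToolZ": 3}
elp = effective_license_position(owned, installed)
print(elp)  # {'DatabaseX': -20, 'MiddlewareY': 15, 'ToolZ': -3}
```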
Posted 3 days ago
4.0 - 7.0 years
15 - 17 Lacs
Pune
Work from Office
Do you want to build your software engineering skills whilst creating really impactful applications? Are you interested in being part of an externally recognized engineering community with personal development at its core? Would you like to join a team that gives back to the community and engages with a diverse group of people? We're looking for software engineers to: Design and maintain data pipelines using Starburst and related technologies. Optimize query performance and resolve data processing bottlenecks. Manage databases to ensure high availability, reliability and security. Integrate Starburst with various data sources including cloud services and APIs. Monitor data pipelines and troubleshoot issues proactively. Collaborate with data scientists, analysts and stakeholders on data requirements. Maintain comprehensive and up-to-date documentation for data processes. Stay current with data engineering advancements and propose innovative solutions. Implement best practices for data quality assurance and testing. Demonstrate experience with Python and data access (NumPy, SciPy, pandas, etc.), machine learning (TensorFlow, etc.), and AI tools (ChatGPT, etc.). Have experience working with distributed systems, clustering, and replication technologies. Once you've met your team and joined our certified engineers development program, you'll be able to engage in sharing and discussing knowledge with your peers through our engineering guilds, and take part in a wide range of volunteering and cultural events and topics through our internal networks.
Software engineer/developer focused on cloud-based data virtualization and data delivery technologies. Minimum 2 years of experience working with Starburst. Develop, optimize, and maintain Starburst Enterprise queries for data processing and analytics. Integrate Starburst with various data sources (e.g., cloud storage, relational databases, data lakes). Solid understanding of Spark, Trino (Presto) or related technologies. Able to code in Python within distributed computing systems; knowledge of Databricks (PySpark) is an added advantage. Capable of working in a collaborative, multi-site environment to support rapid development and delivery of results and capabilities (i.e. Agile SDLC). 3+ years of hands-on experience in developing large-scale applications using data virtualization and/or data streaming technologies. UBS is the world's largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.
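The data-virtualization idea behind Starburst/Trino (one SQL query federating several sources) can be mimicked at toy scale with SQLite's ATTACH; this is an analogy for the concept, not Starburst's API, and the tables are invented:

```python
import sqlite3

# Two independent "sources" joined in one query - the idea behind engines
# like Starburst/Trino, with SQLite's ATTACH standing in for federated catalogs.
con = sqlite3.connect(":memory:")
con.execute("ATTACH ':memory:' AS crm")
con.execute("CREATE TABLE main.trades (account_id INTEGER, notional REAL)")
con.execute("CREATE TABLE crm.accounts (account_id INTEGER, region TEXT)")
con.executemany("INSERT INTO main.trades VALUES (?, ?)",
                [(1, 500.0), (2, 250.0), (1, 100.0)])
con.executemany("INSERT INTO crm.accounts VALUES (?, ?)",
                [(1, "EMEA"), (2, "APAC")])
result = con.execute(
    """SELECT a.region, SUM(t.notional)
       FROM main.trades t JOIN crm.accounts a USING (account_id)
       GROUP BY a.region ORDER BY a.region"""
).fetchall()
print(result)  # [('APAC', 250.0), ('EMEA', 600.0)]
```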
Posted 3 days ago
1.0 - 8.0 years
6 - 7 Lacs
Chennai
Work from Office
About the Role: Seeking a highly skilled Data Analyst with 6 years of experience to join our dynamic team. The ideal candidate will have a sound understanding of insurance products and processes such as underwriting, claims and policy administration, and be skilled at translating complex data into technical transformations. Requirements: Expertise in writing complex SQL queries to extract and manipulate data from relational databases (e.g., Oracle, SQL Server, Snowflake). Should analyse policy, claims, billing and underwriting data to identify trends, anomalies and opportunities. Proficient in translating raw data into business and technical transformations. Familiarity with documenting business rules, data mapping and transformation logic to support data pipeline development and data quality initiatives. Sound knowledge of P&C insurance domains such as policy administration, claims processing, underwriting, etc. is a plus. Experience in SQL and relational database systems. #LI-MP1 #Hybrid
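By way of illustration, the kind of complex SQL-based anomaly analysis this role describes can be sketched as follows. This is a minimal, self-contained example: table and column names are hypothetical, the data is invented, and SQLite stands in for Oracle/SQL Server/Snowflake.

```python
import sqlite3

# In-memory database standing in for an insurance data warehouse.
# All table/column names and figures here are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (
        claim_id   INTEGER PRIMARY KEY,
        policy_id  TEXT,
        claim_amt  REAL,
        status     TEXT
    );
    INSERT INTO claims VALUES
        (1, 'P-100',  1200.0, 'PAID'),
        (2, 'P-100',  9500.0, 'PAID'),
        (3, 'P-200',   300.0, 'OPEN'),
        (4, 'P-200',   450.0, 'PAID'),
        (5, 'P-300', 50000.0, 'PAID');
""")

# Flag policies whose total paid claims exceed twice the average across
# policies -- a simple anomaly rule of the kind an analyst might codify.
rows = conn.execute("""
    WITH paid AS (
        SELECT policy_id, SUM(claim_amt) AS total_paid
        FROM claims
        WHERE status = 'PAID'
        GROUP BY policy_id
    )
    SELECT policy_id, total_paid
    FROM paid
    WHERE total_paid > 2 * (SELECT AVG(total_paid) FROM paid)
    ORDER BY total_paid DESC
""").fetchall()
```

The CTE-plus-subquery shape carries over directly to the relational databases named in the posting.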
Posted 3 days ago
7.0 - 8.0 years
15 - 16 Lacs
Hyderabad
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Marketing Title. In this role, you will: Design, develop, and maintain scalable ETL/ELT pipelines using PySpark and Python. Build and manage real-time data ingestion and streaming pipelines using Apache Kafka. Develop and optimize data workflows and batch processes on GCP using services like BigQuery, Dataflow, Pub/Sub, and Cloud Composer. Implement data quality checks, error handling, and monitoring across pipelines. Collaborate with data scientists, analysts, and business teams to translate requirements into technical solutions. Ensure best practices in code quality, pipeline reliability, and data governance. Maintain thorough documentation of processes, tools, and infrastructure. Requirements To be successful in this role, you should meet the following requirements: 6+ years of experience in data engineering roles. Strong programming skills in Python and PySpark. Solid experience in working with Kafka for real-time data processing. Proven hands-on experience with GCP data tools and architecture. Familiarity with CI/CD, version control (Git), and workflow orchestration tools (Airflow/Composer). Strong analytical and problem-solving skills with attention to detail. Excellent communication and team collaboration skills. You'll achieve more when you join HSBC.
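The data quality checks and error handling mentioned above can be sketched in a library-agnostic way. The following is a minimal illustration only: field names, the rule set, and the quarantine approach are all hypothetical, and in a real PySpark/Kafka pipeline the same logic would run inside the framework's own transformation steps.

```python
# Hypothetical row-level data-quality checks of the kind a pipeline embeds.
def check_record(record):
    """Return the list of rule violations for one ingested record."""
    errors = []
    if not record.get("account_id"):
        errors.append("missing account_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("invalid amount")
    if record.get("currency") not in {"USD", "EUR", "INR"}:
        errors.append("unknown currency")
    return errors

def partition_batch(batch):
    """Split a batch into clean records and quarantined (record, errors) pairs."""
    clean, quarantined = [], []
    for rec in batch:
        errs = check_record(rec)
        if errs:
            quarantined.append((rec, errs))  # route bad rows aside for review
        else:
            clean.append(rec)
    return clean, quarantined
```

Quarantining failed rows rather than dropping them silently is what makes downstream monitoring of pipeline health possible.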
Posted 3 days ago
4.0 - 5.0 years
9 - 10 Lacs
Bengaluru
Work from Office
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Snowflake professionals in the following areas: JD for Senior Snowflake Developer: Snowflake, SnowSQL, PL/SQL, any ETL tool. Job Description: 4-5 years of IT experience in analysis, design, development and unit testing of data warehousing applications using industry-accepted methodologies and procedures. Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for Business Intelligence reporting. Strong experience with Snowpipe execution and the Snowflake data warehouse, with a deep understanding of Snowflake architecture and processing. Creating and managing automated data pipelines for both batch and streaming data using DBT. Designing and implementing data models and schemas to support data warehousing and analytics within Snowflake. Writing and optimizing SQL queries for efficient data retrieval and analysis. Deliver robust solutions through query optimization, ensuring data quality. Should have experience in writing functions and stored procedures. Strong understanding of the principles of data warehousing using fact tables, dimension tables, and star and snowflake schema modelling. Analyse and translate functional specifications/user stories into technical specifications. Good to have experience in design/development in any ETL tool. Good interpersonal skills, experience in handling communication and interactions between different teams.
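The star-schema modelling this posting asks for can be illustrated with a tiny self-contained sketch. SQLite stands in for Snowflake here, and all table and column names are hypothetical.

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
# SQLite stands in for Snowflake; all names and figures are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_sales  (product_key INTEGER, date_key INTEGER, revenue REAL);

    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO dim_date    VALUES (10, 2023), (11, 2024);
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 150.0), (2, 11, 80.0);
""")

# Typical BI query: join the fact table to its dimensions and aggregate.
rows = conn.execute("""
    SELECT p.product_name, d.year, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date    d ON d.date_key    = f.date_key
    GROUP BY p.product_name, d.year
    ORDER BY p.product_name, d.year
""").fetchall()
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables.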
At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit and emotional positivity; agile self-determination, trust, transparency and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
Posted 3 days ago
8.0 - 12.0 years
20 - 25 Lacs
Pune
Work from Office
Embark on a transformative journey as Test Automation Engineering Lead at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. To be successful as a Test Automation Engineering Lead, you should have experience with: Bachelor's/Master's degree in Computer Science, Information Technology, or related field. Proven experience in test automation for data engineering platforms (ETL, data lakes, data warehouses, Big Data, etc.). Hands-on expertise with automation tools such as Selenium, PyTest, Robot Framework, Apache Airflow, dbt, or similar. Strong programming/scripting skills in Python, Java, or Scala. Deep understanding of data quality, data validation, and data governance principles in the banking sector. Experience with CI/CD pipelines, DevOps practices, and cloud platforms (AWS, Azure, or GCP). Excellent communication, stakeholder management, and team leadership skills. Knowledge of regulatory and security requirements in the banking domain. Some other highly valued skills include: Experience with AI/ML model testing and validation. Certifications in Test Automation, Data Engineering, or Cloud Technologies. Prior experience in large-scale test transformation programs. You may be assessed on the key critical skills relevant for success in the role, such as experience with Test Transformation Leadership, Automation Strategy and Frameworks, as well as job-specific skillsets. This role will be based out of the Pune office. Purpose of the role To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability.
Accountabilities: Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards. Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues. Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested. Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution. Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing. Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth. Vice President Expectations: To contribute or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, planning for the department's future needs and operations, counselling employees on performance and contributing to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard.
The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. For an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.
Posted 3 days ago
5.0 - 7.0 years
15 - 16 Lacs
Pune
Work from Office
Job Summary: Cummins is seeking a skilled Data Engineer to support the development, maintenance, and optimization of our enterprise data and analytics platform. This role involves hands-on experience in software development, ETL processes, and data warehousing, with strong exposure to tools like Snowflake, OBIEE, and Power BI. The engineer will collaborate with cross-functional teams, transforming data into actionable insights that enable business agility and scale. Please Note: While the role is categorized as remote, it will follow a hybrid work model based out of our Pune office. Key Responsibilities: Design, develop, and maintain ETL pipelines using Snowflake and related data transformation tools. Build and automate data integration workflows that extract, transform, and load data from various sources including Oracle EBS and other enterprise systems. Analyze, monitor, and troubleshoot data quality and integrity issues using standardized tools and methods. Develop and maintain dashboards and reports using OBIEE, Power BI, and other visualization tools for business stakeholders. Work with IT and Business teams to gather reporting requirements and translate them into scalable technical solutions. Participate in data modeling and storage architecture using star and snowflake schema designs. Contribute to the implementation of data governance, metadata management, and access control mechanisms. Maintain documentation for solutions and participate in testing and validation activities. Support migration and replication of data using tools such as Qlik Replicate and contribute to cloud-based data architecture. Apply agile and DevOps methodologies to continuously improve data delivery and quality assurance processes. Why Join Cummins? Opportunity to work with a global leader in power solutions and digital transformation. Be part of a collaborative and inclusive team culture. Access to cutting-edge data platforms and tools.
Exposure to enterprise-scale data challenges and finance domain expertise. Drive impact through data innovation and process improvement. Competencies: Data Extraction & Transformation - Ability to perform ETL activities from varied sources with high data accuracy. Programming - Capable of writing and testing efficient code using industry standards and version control systems. Data Quality Management - Detect and correct data issues for better decision-making. Solution Documentation - Clearly document processes, models, and code for reuse and collaboration. Solution Validation - Test and validate changes or solutions based on customer requirements. Problem Solving - Address technical challenges systematically to ensure effective resolution and prevention. Customer Focus - Understand business requirements and deliver user-centric data solutions. Communication & Collaboration - Work effectively across teams to meet shared goals. Values Differences - Promote inclusion by valuing diverse perspectives and backgrounds. Education, Licenses, Certifications: Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related technical discipline. Certifications in data engineering or relevant tools (Snowflake, Power BI, etc.) are a plus. Experience / Must-have skills: 5-7 years of experience in data engineering or software development, preferably within a finance or enterprise IT environment. Proficient in ETL tools, SQL, and data warehouse development. Proficient in Snowflake, Power BI, and OBIEE reporting platforms. Must have worked on implementations using these tools and technologies. Strong understanding of data warehousing principles, including schema design (star/snowflake), ER modeling, and relational databases. Working knowledge of Oracle databases and Oracle EBS structures. Preferred Skills: Experience with Qlik Replicate, data replication, or data migration tools.
Familiarity with data governance, data quality frameworks, and metadata management. Exposure to cloud-based architectures, Big Data platforms (e.g., Spark, Hive, Kafka), and distributed storage systems (e.g., HBase, MongoDB). Understanding of agile methodologies (Scrum, Kanban) and DevOps practices for continuous delivery and improvement.
Posted 3 days ago
5.0 - 8.0 years
17 - 25 Lacs
Bhubaneswar, Dubai, Coimbatore
Work from Office
Please note: only candidates willing to travel to the Middle East should apply. Role & responsibilities 1. Data Quality & Governance: Implement data quality frameworks, policies, and standards to ensure data accuracy, completeness, and consistency across enterprise systems. 2. Master Data Management (MDM) Implementation: Design and configure MDM solutions using Informatica MDM (On-Prem & Cloud) for key business domains (Customer, Product, Vendor, etc.). 3. Data Profiling & Cleansing: Leverage Informatica Data Quality for data profiling, cleansing, standardization, deduplication, and enrichment to improve data reliability. 4. Metadata Management & Data Lineage: Deploy and maintain Informatica Metadata Manager to enhance data discoverability, governance, and lineage tracking. 5. Integration & Interoperability: Ensure seamless integration of MDM and DQ solutions with core enterprise applications (ERP, CRM, BI tools), supporting ETL/ELT teams. 6. Stakeholder Collaboration: Act as a liaison between business and IT teams, translating business requirements into scalable MDM and DQ solutions. 7. Training & Support: Provide guidance, training, and best practices to data stewards and business users to drive a culture of data governance. Preferred candidate profile Education & Experience Bachelor's/Master's degree in Computer Science, Data Management, Information Systems, or a related field. 5+ years of consulting experience in Data Quality, MDM, and Metadata Management. Expertise in Informatica IDQ (or IDMC Cloud Data Quality) and Informatica MDM (On-Prem & Cloud). Technical Skills Strong experience in data profiling, cleansing, standardization, and data deduplication. Hands-on knowledge of data governance frameworks, data quality rules, and stewardship best practices. Expertise in SQL, data modeling, and data architecture principles. Experience integrating MDM and DQ solutions with enterprise applications (SAP, Salesforce, Microsoft Dynamics, etc.).
Familiarity with cloud platforms (MS Azure), with a focus on cloud-based data governance and integration. Experience in designing end-to-end DQ and MDM solutions. Preferred Industry Experience Prior experience in DQ/MDM implementation within at least one of the following sectors: Oil & Gas, Financial Services, Manufacturing, Healthcare, Real Estate, Tourism, Government/Citizen Services, Mobility, Energy & Utilities, Telecom. Consulting & Leadership Skills Strong stakeholder management and client engagement skills, with experience in working on DQ & MDM consulting projects. Pre-sales experience with the ability to build quick PoCs and client demos, and support business development efforts.
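The standardization and deduplication steps described above can be sketched in plain Python. This is only an illustration of the concept; the field names, match key, and cleansing rules are hypothetical, and an Informatica IDQ/MDM mapping would express the same logic through its own transformations.

```python
import re

# Hypothetical sketch of standardization + deduplication for customer records.
def standardize(record):
    """Normalize name and phone so equivalent records compare equal."""
    name = re.sub(r"\s+", " ", record["name"].strip()).title()
    phone = re.sub(r"\D", "", record.get("phone", ""))  # keep digits only
    return {"name": name, "phone": phone}

def deduplicate(records):
    """Keep the first occurrence of each (name, phone) match key."""
    seen, golden = set(), []
    for rec in map(standardize, records):
        key = (rec["name"], rec["phone"])
        if key not in seen:
            seen.add(key)
            golden.append(rec)
    return golden

records = [
    {"name": "  john  SMITH ", "phone": "+91 98765-43210"},
    {"name": "John Smith",     "phone": "+919876543210"},  # duplicate after cleansing
    {"name": "Jane Doe",       "phone": "555"},
]
golden = deduplicate(records)
```

Standardizing before matching is the essential ordering: records that differ only in casing, whitespace, or phone formatting collapse onto the same match key and survive as a single golden record.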
Posted 3 days ago
3.0 - 7.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Position Overview: As a Senior Analytics Engineer, you will be responsible for the architecture, development and implementation of robust data engineering solutions to support our organization's data-driven initiatives, ensure data accuracy, and enable data-driven decision-making across the organization. The ideal candidate will possess a minimum of 5 years of hands-on experience in data engineering on high-performing teams. Expertise in Databricks, dbt, SQL, and Python is a critical requirement for this role. Key Responsibilities: 1. Data Pipeline Development: Design, develop, and maintain ETL (Extract, Transform, Load) pipelines primarily using Databricks, dbt, and SQL to collect and process data from various sources into a centralized data warehouse. 2. Data Modeling: Create and maintain data models, data dictionaries, and documentation to support efficient data analysis and reporting. 3. Data Quality Assurance: Implement data validation and cleansing processes to ensure data accuracy, consistency, and reliability. 4. Data Governance: Ensure data security, compliance, and governance standards are met, and contribute to data governance initiatives. 5. Mentorship: Mentor and develop the analytics engineering team, and provide best-practice guidance to the broader analytics community. 6. Collaboration: Collaborate with cross-functional teams, including data scientists, business analysts, and stakeholders, to understand their data needs and deliver solutions. 7. Performance Optimization: Continuously monitor and optimize data pipelines and analytics processes for efficiency and scalability. 8. Ad hoc Data Analysis and Reporting/Dashboard Development: Perform exploratory data analysis, develop data visualizations, and generate actionable insights to support business decision-making. 9. Stay Current: Stay up-to-date with emerging trends and technologies in data engineering and analytics, and make recommendations for their adoption.
Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum 5+ years of hands-on experience in data engineering. Expertise in building data pipelines and ETL processes using Databricks, dbt and Python. Strong understanding of data warehousing concepts and methodologies. Experience with cloud platforms such as Azure, AWS or GCP. Excellent communication and interpersonal skills. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Knowledge of data security and compliance standards is a plus.
Posted 3 days ago
8.0 - 10.0 years
10 - 14 Lacs
Noida
Work from Office
8 to 10 years of relevant experience. Strong understanding of ETL concepts and database principles. Strong knowledge of the Azure Cloud platform. Strong knowledge of Unix. Proficiency in SQL for data querying and validation. Experience with ETL tools. Excellent analytical and problem-solving skills. Strong communication and collaboration skills. Experience with data validation and data quality checks. Design, develop, and execute detailed test cases, test plans, and test scripts based on requirements and specifications. Agile methodology along with JIRA experience. Very good communication and interaction skills. Strong collaboration skills / ability to work as a team player. Experience working with onshore / multi-geography teams. Must have: Understanding of ETL processes/concepts. Good knowledge of common ETL tools (e.g., Informatica, Azure Data Factory or any ETL tool). Database testing with SQL/Oracle queries. Azure basics. Agile. Good communication. Mandatory Competencies: ETL - Tester; ETL - Informatica; ETL - Azure Data Factory; Database - SQL; Database - Oracle; Agile - Agile; QA - Agile Methodology; Java - Unix; Cloud - Azure; QA - Testing Process; QA - Test Tool and Automation; QA Automation - QA Automation; Beh - Communication and collaboration. At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
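The database-testing and data-validation work described above often boils down to reconciling a source table against its loaded target. A minimal sketch, assuming hypothetical table names, with SQLite standing in for the Oracle source and the Azure-hosted target:

```python
import sqlite3

# ETL-testing sketch: reconcile a source table against its loaded target.
# SQLite stands in for both ends; table name and data are illustrative.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE orders (order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")
tgt.executescript("""
    CREATE TABLE orders (order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def reconcile(src_conn, tgt_conn, table):
    """Compare row counts and a simple amount checksum between source and target."""
    q = f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}"
    s_count, s_sum = src_conn.execute(q).fetchone()
    t_count, t_sum = tgt_conn.execute(q).fetchone()
    return {
        "rowcount_match": s_count == t_count,
        "checksum_match": abs(s_sum - t_sum) < 1e-9,
    }

result = reconcile(src, tgt, "orders")
```

Count and checksum comparisons are only the first line of defence; a full validation suite would add column-level profiling and row-by-row diffs for sampled keys.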
Posted 3 days ago
4.0 - 6.0 years
4 - 8 Lacs
Pune
Work from Office
Creating Passion: Your Responsibilities Roles & Responsibilities: Develop and execute the engineering data roadmap comprising CAD/PDM/PLM/ERP tools. Creation and maintenance of master data and parts lists. Monitoring of master data quality and product modification management. Implementation of changes from assembly groups in the ERP system and the feature database. Classification of different parts. Support for common rules and configuration in the division. Management of documents and the numbering system. Creation of evaluations in Access and Excel. Coordination with external partners driving technology solutions for engineering data management. Simplify, standardise and automate key engineering processes including ECN, drawing approvals and drawing templates, and integrate them within engineering systems. Participate in data quality audits and Product Carbon Footprint tasks. Responsible for PLM/PDM customization. Contributing Your Strengths: Your Qualifications Qualification and Education Requirements: Bachelor's Degree in Engineering (preferably Information Technology). Excellent verbal and written communication in English. Experience: Industry exposure of 4 to 6 years, preferably in manufacturing or a similar industry. Preferred Skills / Special Skills: Certifications in PTC Windchill and PDM Link are mandatory. Exposure to PTC Creo is preferred. Ability to coordinate within mid-sized teams. Communication with global stakeholders. Commitment, result orientation and interest in learning new technologies. Have we awoken your interest? Then we look forward to receiving your online application. If you have any questions, please contact Sonali Samal. One Passion. Many Opportunities. The company Liebherr CMCtec India Private Limited in Pune (India) was established in 2008 and started its manufacturing plant in its own facility on Pune Solapur Highway in 2012. The company is responsible for the production of tower cranes and drives.
Location Liebherr CMCtec India Private Limited, Sai Radhe, 6th Floor, Behind Hotel Sheraton Grand, Raja Bahadur Mill Road, Pune, 411001, India (IN) Contact Sonali Samal sonali.samal@liebherr.com
Posted 3 days ago
3.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring Python professionals in the following areas: Job Description: Our Digital Service Line is currently looking for industry-leading, seasoned Data Engineer professionals with hands-on experience. The shortlisted candidate should have the ability to analyse technical needs and work with customers to develop project scope-of-work documents and project plans. The responsibilities are primarily technical, although there is a strong element of functional understanding of the business process. Data Engineering (DataEng) Experience: 3 to 6 years. Degree in computer science, engineering, or similar fields. Mandatory Skill Set: Python, PySpark, SQL, AWS. Designing, developing, testing and supporting data pipelines and applications. Good to Have: Palantir Foundry. Primary Responsibilities: Responsible for designing, developing, testing and supporting data pipelines and applications. Industrialize data feeds. Creates data pipelines into existing systems. Improves data cleansing and facilitates connectivity of data and applied technologies between both external and internal data sources. Collaboration with Data Scientists. Establishes a continuous quality improvement process to systematically optimize data quality. Translates data requirements from data users into ingestion activities. B.Tech/B.Sc./M.Sc.
in Computer Science or related field and 3+ years of relevant industry experience. Agile mindset and a spirit of initiative. Interest in solving challenging technical problems. Experience in creating productive and robust ETL pipelines for batch as well as streaming ingestion. At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit and emotional positivity; agile self-determination, trust, transparency and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
Posted 3 days ago
3.0 - 8.0 years
5 - 9 Lacs
Coimbatore
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Google BigQuery Good to have skills : React.js, Cloud Network Operations. Minimum 3 year(s) of experience is required. Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute to providing solutions to work-related problems.- Develop and implement scalable applications using Google BigQuery.- Collaborate with cross-functional teams to ensure application functionality.- Conduct code reviews and provide technical guidance to junior developers.- Stay updated on industry trends and best practices in application development.- Troubleshoot and resolve application issues in a timely manner. Professional & Technical Skills (project specific):- BQ, BQ Geospatial, Python, Dataflow, Composer- Secondary skill: Geospatial domain knowledge- Must To Have Skills: Proficiency in Google BigQuery.- Strong understanding of statistical analysis and machine learning algorithms.- Experience with data visualization tools such as Tableau or Power BI.- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
Additional Information:- The candidate should have a minimum of 3 years of experience in Google BigQuery.- This position is based at our Bengaluru office.- A 15 years full-time education is required. Qualification 15 years full time education
Posted 3 days ago
15.0 - 20.0 years
5 - 9 Lacs
Ahmedabad
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : SAP Data Migration Good to have skills : NA. Minimum 7.5 year(s) of experience is required. Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and guidance to your team members while continuously seeking opportunities for improvement and innovation in application development. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities.- Monitor project progress and ensure alignment with business goals. Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP Data Migration.- Strong understanding of data integration techniques and methodologies.- Experience with data mapping and transformation processes.- Familiarity with SAP modules and their data structures.- Ability to troubleshoot and resolve data migration issues efficiently. Additional Information:- The candidate should have minimum 7.5 years of experience in SAP Data Migration.- This position is based at our Ahmedabad office.- A 15 years full time education is required. Qualification 15 years full time education
Posted 3 days ago
3.0 - 8.0 years
4 - 8 Lacs
Coimbatore
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary:
As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data pipelines.
- Ensure data quality throughout the data lifecycle.
- Implement ETL processes for data migration and deployment.
- Collaborate with cross-functional teams to understand data requirements.
- Optimize data storage and retrieval processes.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google BigQuery.
- Strong understanding of data engineering principles.
- Experience with cloud-based data services.
- Knowledge of SQL and database management systems.
- Hands-on experience with data modeling and schema design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- A 15 years full-time education is required.
Posted 3 days ago