5.0 - 6.0 years
6 - 7 Lacs
Chennai
Hybrid
Overview: TekWissen is a global workforce management provider with operations throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From its earliest days as a pioneer of modern transportation, it has sought to make the world a better place, one that benefits lives, communities, and the planet.
Job Title: Speciality Development Practitioner
Location: Chennai
Work Type: Hybrid
Position Description:
- 5+ years of Data Engineering experience
- 3+ years of Cloud experience (GCP preferred) with solutions designed and implemented at production scale
- Strong understanding of key GCP services, especially those related to batch and real-time data processing: BigQuery, Cloud Scheduler, Airflow, Postgres, Dataflow, Pub/Sub, Cloud Logging, and Cloud Monitoring
- Experience with infrastructure as code (Terraform, GitHub)
- Experience in the design, development, and implementation of data pipelines using data warehousing applications
- Experience integrating various data sources such as Oracle, Teradata, DB2, BigQuery, and flat files
- Hands-on experience in performance tuning and debugging ETL jobs
- Involvement in review meetings and coordination with the team on job design and fine-tuning job performance
Skills Required: BigQuery, Dataflow
Experience Required: Minimum 5+ years
Education Required: Bachelor's Degree
TekWissen Group is an equal opportunity employer supporting workforce diversity.
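As a rough illustration of the batch Dataflow-to-BigQuery work this role describes, a minimal Apache Beam sketch follows; the project, bucket, table, and schema names are hypothetical placeholders, not details from the posting.

```python
# Minimal batch pipeline sketch: GCS CSV -> BigQuery (all names hypothetical).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv(line):
    # Expect "order_id,amount" rows; convert to a BigQuery-ready dict.
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

options = PipelineOptions(
    runner="DataflowRunner",          # use "DirectRunner" to test locally
    project="my-project",             # hypothetical project id
    region="asia-south1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/orders/*.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_csv)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.orders",
            schema="order_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```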
Posted 2 months ago
4.0 - 7.0 years
7 - 11 Lacs
Noida
Work from Office
Key Responsibilities:
1. Support the design and implementation of data warehousing solutions using Oracle DWH.
2. Develop and maintain Python and Spark scripts to process and transform large datasets.
3. Collaborate with analysts and BI developers to ensure data availability for Tableau dashboards.
4. Perform data quality (DQ) checks and resolve data inconsistencies.
5. Document data flows, transformations, and job schedules.
Must Have:
1. Expertise in Oracle SQL
2. Hands-on experience with Python and Spark
3. Familiarity with ETL concepts and data modeling basics
4. Exposure to reporting tools like Tableau
Mandatory Competencies: Big Data (Spark), ETL (DataStage, Ab Initio), Communication and Collaboration, Programming (Python, Python Shell), BI and Reporting Tools (Tableau), QA Analytics (Data Analysis), Database Programming (SQL)
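A minimal PySpark sketch of the transform-and-check work described above; the file paths and column names are illustrative assumptions, not details from the posting.

```python
# Minimal PySpark transform + data-quality check sketch (paths/columns hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dwh_transform").getOrCreate()

orders = spark.read.csv("/data/raw/orders.csv", header=True, inferSchema=True)

# DQ check: fail fast if the key column contains nulls.
null_keys = orders.filter(F.col("order_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"{null_keys} rows have a null order_id")

# Transform: deduplicate and aggregate for the reporting layer.
daily_revenue = (
    orders.dropDuplicates(["order_id"])
          .groupBy("order_date")
          .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").parquet("/data/curated/daily_revenue")
```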
Posted 2 months ago
3.0 - 6.0 years
14 - 18 Lacs
Mumbai
Work from Office
Overview: The Data Technology group at MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers Reference, Market, and other critical data points to various products of the firm. The platform, hosted in the firm's data centers and on the Azure and GCP public clouds, processes 100 TB+ of data and is expected to run 24x7. With increased focus on automation around systems development and operations, data-science-based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation and is committed to providing self-serve tools to our internal customers. The position is based in the Mumbai, India office.
Responsibilities:
- Build and maintain ETL pipelines for Snowflake.
- Manage Snowflake objects and data models.
- Integrate data from various sources.
- Optimize performance and query efficiency.
- Automate and schedule data workflows.
- Ensure data quality and reliability.
- Collaborate with cross-functional teams.
- Document processes and data flows.
Qualifications:
- Self-motivated, collaborative individual with a passion for excellence
- B.E. in Computer Science or equivalent with 5+ years of total experience and at least 2 years of experience working with databases
- Good working knowledge of source control applications like Git, with prior experience building deployment workflows using this tool
- Good working knowledge of Snowflake, YAML, and Python
- Experience managing Snowflake databases, schemas, tables, and other objects
- Proficient in Snowflake SQL, including CTEs, window functions, and stored procedures
- Familiar with Snowflake performance tuning and cost optimization tools
- Skilled in building ETL/ELT pipelines using dbt, Airflow, or Python
- Able to work with various data sources including RDBMS, APIs, and cloud storage
- Understanding of incremental loads, error handling, and scheduling best practices
- Strong SQL skills and intermediate Python proficiency for data processing
- Familiar with Git for version control and collaboration
- Basic knowledge of Azure or GCP cloud platforms
- Capable of integrating Snowflake with APIs and cloud-native services
What we offer you:
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola!
MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
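A minimal sketch of the incremental-load pattern named in the qualifications, using Snowflake's Python connector; the account, credentials, and table names are hypothetical, and real credentials belong in a secrets manager rather than in code.

```python
# Incremental upsert sketch via the Snowflake Python connector (names hypothetical).
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",        # hypothetical account identifier
    user="etl_user",
    password="***",           # use a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="CORE",
)

MERGE_SQL = """
MERGE INTO core.customers AS tgt
USING staging.customers_delta AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET tgt.email = src.email, tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
    VALUES (src.customer_id, src.email, src.updated_at)
"""

try:
    cur = conn.cursor()
    cur.execute(MERGE_SQL)        # apply only the delta rows to the target
    print(f"rows affected: {cur.rowcount}")
finally:
    conn.close()
```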
Posted 2 months ago
1.0 - 5.0 years
5 - 8 Lacs
Mumbai
Work from Office
Sweet Dreams Retail Private Limited is looking for a Lead Graphics Designer - Fashion & Retail to join our dynamic team and embark on a rewarding career journey:
- Collaborating with clients or team members to determine design requirements and project goals
- Developing and creating visual content
- Selecting and manipulating appropriate images, fonts, and other design elements to enhance the visual impact of designs
- Using graphic design software, such as Adobe Photoshop, Illustrator, and InDesign, to produce final designs
- Presenting design concepts and revisions to clients or team members
- Managing multiple projects and meeting tight deadlines
- Ensuring designs meet brand guidelines and quality standards
Posted 2 months ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Role & responsibilities:
- 5+ years of hands-on experience with Apache NiFi, including developing, managing, and optimizing complex data flows in production environments.
- Proven experience with Cloudera NiFi (CDP Data Flow) in enterprise environments, including integration with Cloudera Manager.
- Experience migrating NiFi flows across major version upgrades, with a strong understanding of backward compatibility.
- Strong proficiency in Groovy scripting, used for the ExecuteScript and InvokeScriptedProcessor processors (see the scripted-processor sketch after this list).
- Solid understanding of SSH and SFTP protocols, including authentication schemes (key-based, password), session negotiation, and file permission handling in NiFi processors (e.g., ListSFTP, FetchSFTP, PutSFTP).
- Good grasp of data encryption mechanisms, key management, and secure flowfile handling using processors like EncryptContent.
- Experience integrating NiFi with MongoDB, including reading/writing documents via processors like GetMongo, PutMongo, and QueryMongo.
- Experience working with Apache Kafka, including producing to and consuming from Kafka topics using NiFi (PublishKafka, ConsumeKafka) and handling schema evolution with Confluent Schema Registry.
- Strong knowledge of Red Hat Enterprise Linux (RHEL) environments, including systemd services, filesystem permissions, log rotation, and resource tuning for JVM-based applications like NiFi.
NiFi-Specific Technical Requirements:
- In-depth knowledge of NiFi flow design principles, including proper use of queues, back pressure, prioritizers, and connection tuning.
- Mastery of controller services, including SSLContextService, DBCPConnectionPool, and RecordReader/RecordWriter services.
- Experience with record-based processing using Avro, JSON, and CSV schemas and Record processors like ConvertRecord, QueryRecord, and LookupRecord.
- Ability to debug and optimize NiFi flows using Data Provenance, bulletins, and log analysis.
- Familiarity with custom processor development in Java/Groovy (optional but preferred).
- Experience setting up secure NiFi clusters, configuring user authentication (LDAP, OIDC), TLS certificates, and access policies.
- Proficiency in parameter contexts, the variable registry, and flow versioning using NiFi Registry.
- Understanding of the zero-master clustering model, node coordination, and the site-to-site protocol.
- Experience deploying and monitoring NiFi in high-availability, production-grade environments, including using Prometheus/Grafana or Cloudera Manager for metrics and alerting.
Preferred Qualifications:
- Experience working in regulated or secure environments with strict data handling and audit requirements.
- Familiarity with DevOps workflows, including version-controlled flow templates (JSON/XML), CI/CD integration for NiFi Registry, and automated deployment strategies.
- Strong written and verbal communication skills, with the ability to document flows and onboard other engineers.
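Since the role centers on scripted processors, here is a minimal ExecuteScript sketch. The posting asks for Groovy, but ExecuteScript also accepts Jython, used here to keep all examples in one language; the JSON field name is hypothetical.

```python
# Jython body for a NiFi ExecuteScript processor (script engine: python).
# Normalizes a hypothetical "status" field in each JSON flowfile.
import json
from org.apache.commons.io import IOUtils
from java.nio.charset import StandardCharsets
from org.apache.nifi.processor.io import StreamCallback

class NormalizeStatus(StreamCallback):
    def process(self, inputStream, outputStream):
        text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
        record = json.loads(text)
        record["status"] = record.get("status", "").strip().upper()
        outputStream.write(bytearray(json.dumps(record).encode("utf-8")))

flowFile = session.get()
if flowFile is not None:
    flowFile = session.write(flowFile, NormalizeStatus())
    session.transfer(flowFile, REL_SUCCESS)
```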
Posted 2 months ago
4.0 - 8.0 years
0 - 0 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities:
- Develop and maintain responsive front-end applications using Vue.js, Vuex, and Vue Router.
- Collaborate with UI/UX designers to translate wireframes into functional and engaging web experiences.
- Integrate RESTful APIs and ensure seamless data flow between front-end and back-end systems.
- Optimize application performance and ensure cross-browser compatibility.
- Write clean, maintainable, and reusable code following best practices and coding standards.
- Participate in Agile ceremonies, code reviews, and team discussions to improve product quality.
Must-Have Skills:
- 4-8 years of hands-on experience in Vue.js (2.x or 3.x).
- Strong understanding of JavaScript (ES6+), HTML5, and CSS3.
- Experience in Vuex, Vue Router, and component-based architecture.
- Proficiency in RESTful API integration and JSON handling.
- Familiarity with build tools like Webpack, Vite, or npm scripts.
- Experience with version control (Git) and branching strategies.
Good to Have:
- Exposure to TypeScript with Vue.
- Familiarity with unit testing frameworks like Jest, Mocha, or Cypress.
- Knowledge of state management libraries (Pinia, Redux, etc.).
- Experience working in a microfrontend or modular UI architecture.
- Awareness of accessibility (WCAG) and performance optimization techniques.
Posted 2 months ago
5.0 - 10.0 years
17 - 20 Lacs
Mumbai
Work from Office
KEY RESPONSIBILITIES
- Multitask and manage various projects
- Evaluate business processes, anticipate requirements, uncover areas for improvement, and develop and implement solutions
- Communicate the documented business requirements to the technical team
- Review FSDs prepared by IT, ensuring all client requirements are met
- Prepare and review manual test scenarios and test cases
- Liaise with IT vendors on their project deliveries, bugs, etc.
- Lead the UAT team through project closure, ensuring complete functional and regression testing is done before go-live
- Ensure smooth deployment and monitoring in production, and ensure post-production sign-off
- Maintain the project tracker
INTERACTIONS
Internal Relations: New Business Team, Other Projects Team, IT Team
External Relations: IT vendors such as CTS, Hansa, Posidex
REQUIRED QUALIFICATION AND SKILLS
Educational Qualifications: Graduate in Science/IT; Postgraduate in Science/IT/MBA
Work Experience: At least 5+ years of work experience in project management, preferably in the BFSI domain (experience in the insurance domain would be an added advantage)
Certifications: Testing certificate
Other skills: Team management capability; should be aggressive and take responsibility for completing multiple projects within defined timelines; good oral and written communication skills; able to analyse all outcomes of a situation and take steps accordingly
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
Your role: As a passionate technical development lead specializing in the GRC space, you will play a crucial role in delivering efficient configuration and customization of large GRC installations. Your responsibilities will include providing technical leadership, working on strategic programs, and collaborating with a high-performance team. You should possess strong analytical and technical ability with a minimum of 5 years of experience in Cloud, DevOps, UI frameworks, Java, REST, and DB skills. The ideal candidate will be able to work independently, communicate effectively, and have a proven track record of working on large, complex development projects. You will leverage your skills and knowledge to develop high-quality solutions that meet business needs while driving continuous integration and improvement. Additionally, you will collaborate with the Senior Tech Lead & Solution Architect to provide valuable inputs on application design.
Your team: You will be part of the Compliance & Operational Risk IT team, a global team responsible for designing and implementing innovative IT solutions to track complex regulatory requirements in the financial services industry. The team is spread across various locations including the US, UK, Switzerland, and India, providing support to internal clients worldwide.
Your expertise:
- CI/CD pipeline creation and deployment into production (incl. GitOps practices), enabling canary/rolling deployment, blue/green, and feature flags; observability of deployments; chaos engineering
- Relational DBs (SQL/PostgreSQL)
- Data flow and ETL (e.g., Airflow, Dagster, or similar)
- JVM-based languages (Java 8/11+, Scala, Kotlin)
- Knowledge of M7 preferred
About Company: Purview is a leading Digital Cloud & Data Engineering company headquartered in Edinburgh, United Kingdom, with a presence in 14 countries including India, Poland, Germany, Finland, Netherlands, Ireland, USA, UAE, Oman, Singapore, Hong Kong, Malaysia, and Australia. The company has a strong presence in the UK, Europe, and APAC regions, providing services to captive clients and top tier-1 IT organizations.
Company Info:
India Office: 3rd Floor, Sonthalia Mind Space, Near Westin Hotel, Gafoor Nagar, Hitech City, Hyderabad. Phone: +91 40 48549120 / +91 8790177967
UK Office: Gyleview House, 3 Redheughs Rigg, South Gyle, Edinburgh, EH12 9DQ. Phone: +44 7590230910
Email: careers@purviewservices.com
Posted 2 months ago
12.0 - 17.0 years
27 - 35 Lacs
Madurai, Chennai
Work from Office
Dear Candidate, Greetings of the day!!
I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn: https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ or email: kanthasanmugam.m@techmango.net
Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is to deliver strategic technology solutions that advance the goals of its business partners. We are a leading full-scale software and mobile app development company, driven by the mantra "Client's Vision is our Mission," and we stay true to it. Our aim is to be a technologically advanced and much-loved organization providing high-quality, cost-efficient services with a long-term client relationship strategy. We operate in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).
Job Title: GCP Data Architect
Location: Madurai/Chennai
Experience: 12+ Years
Notice Period: Immediate
About TechMango: TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.
Role Summary: As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.
Key Responsibilities:
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
- Define data strategy, standards, and best practices for cloud data engineering and analytics
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery (see the DAG sketch after this section)
- Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
- Architect data lakes, warehouses, and real-time data platforms
- Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
- Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
- Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
- Provide technical leadership in architectural decisions and future-proofing the data ecosystem
Required Skills & Qualifications:
- 10+ years of experience in data architecture, data engineering, or enterprise data platforms
- Minimum 3-5 years of hands-on experience with GCP data services
- Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python/Java/SQL; data modeling (OLTP, OLAP, star/snowflake schema)
- Experience with real-time data processing, streaming architectures, and batch ETL pipelines
- Good understanding of IAM, networking, security models, and cost optimization on GCP
- Prior experience leading cloud data transformation projects
- Excellent communication and stakeholder management skills
Preferred Qualifications:
- GCP Professional Data Engineer / Architect Certification
- Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
- Exposure to AI/ML use cases and MLOps on GCP
- Experience working in agile environments and client-facing roles
What We Offer:
- Opportunity to work on large-scale data modernization projects with global clients
- A fast-growing company with a strong tech and people culture
- Competitive salary, benefits, and flexibility
- Collaborative environment that values innovation and leadership
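A minimal Cloud Composer (Airflow) DAG sketch of the kind of BigQuery orchestration described above; the project, dataset, SQL, and schedule are hypothetical.

```python
# Minimal Composer (Airflow) DAG running a BigQuery transform (names hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

TRANSFORM_SQL = """
CREATE OR REPLACE TABLE analytics.daily_orders AS
SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
FROM raw.orders
GROUP BY order_date
"""

with DAG(
    dag_id="daily_orders_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = BigQueryInsertJobOperator(
        task_id="build_daily_orders",
        configuration={"query": {"query": TRANSFORM_SQL, "useLegacySql": False}},
        location="asia-south1",  # hypothetical dataset location
    )
```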
Posted 2 months ago
6.0 - 10.0 years
14 - 19 Lacs
Pune
Work from Office
Project description: The Mapping Developer will be responsible for implementing business requests in the financial messaging domain, with a strong focus on ISO 20022, SWIFT FIN, and other market formats. The role involves data conversion using a low-code workbench, collaborating with stakeholders, and ensuring high-quality software solutions through best engineering practices. The position requires close coordination with other teams to support messaging data flows across the bank.
Responsibilities
- Take ownership of business requirements from analysis through to implementation.
- Implement data conversion logic using a low-code workbench applied across multiple applications and services.
- Collaborate with stakeholders to gather requirements and deliver precise, well-tested solutions.
- Apply modern software engineering principles, including Git version control, unit testing, and CI/CD deployments.
- Define and execute unit tests to maintain high code quality.
- Analyze and resolve issues in test and production environments on a priority basis.
- Coordinate with other teams to enable smooth and accurate messaging data flows within the bank.
Skills
Must have
- 6-10 years of experience in IT as a Technical Analyst, Data Modeler, or similar role.
- Hands-on experience with Core Java development and software engineering practices.
- Proficiency in analyzing and modeling complex data structures and requirements.
- Understanding of basic programming concepts: variables, conditions, operators, loops, etc.
- Familiarity with XML data and XSD schemas.
- Knowledge of Git and CI/CD tools for code versioning and deployment.
- Strong attention to detail and ability to deliver high-quality, testable code.
Nice to have
- Experience working with financial messaging standards such as SWIFT FIN and ISO 20022.
- Exposure to low-code platforms used for data conversion.
- Ability to work in cross-functional teams and contribute to collaborative environments.
- Strong problem-solving skills and ability to perform ad hoc issue analysis.
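To illustrate the field-mapping idea, here is a minimal Python sketch parsing a fragment shaped like an ISO 20022 pain.001 group header. The role itself uses Core Java and a low-code workbench, so this is only an analogy, and the sample values are invented.

```python
# Minimal sketch of mapping fields out of an ISO 20022-style XML message.
# The sample message and target field names are invented for illustration.
import xml.etree.ElementTree as ET

SAMPLE = """
<Document xmlns="urn:iso:std:iso:20022:tech:xsd:pain.001.001.03">
  <CstmrCdtTrfInitn>
    <GrpHdr>
      <MsgId>MSG-0001</MsgId>
      <CreDtTm>2024-01-15T09:30:00</CreDtTm>
    </GrpHdr>
  </CstmrCdtTrfInitn>
</Document>
"""

NS = {"p": "urn:iso:std:iso:20022:tech:xsd:pain.001.001.03"}

root = ET.fromstring(SAMPLE)
header = root.find("p:CstmrCdtTrfInitn/p:GrpHdr", NS)

# Map the source fields into a flat target record.
target = {
    "message_id": header.findtext("p:MsgId", namespaces=NS),
    "created_at": header.findtext("p:CreDtTm", namespaces=NS),
}
print(target)  # {'message_id': 'MSG-0001', 'created_at': '2024-01-15T09:30:00'}
```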
Posted 2 months ago
7.0 - 12.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Date: 25 Jun 2025
Location: Bangalore, KA, IN
Company: Alstom
At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams to turnkey systems, services, infrastructure, signalling, and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars.
Your future role: Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates. You'll play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, as well as managing and optimizing object storage systems.
We'll look to you for:
- Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow (see the DAG sketch after this posting).
- Creating robust Python scripts for data ingestion, transformation, and validation.
- Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage.
- Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets.
- Implementing data quality checks, monitoring, and alerting mechanisms.
- Ensuring data security, governance, and compliance with industry standards.
- Mentoring junior engineers and promoting best practices in data engineering.
All about you: We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering or a similar role.
- Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark).
- Hands-on experience with Apache NiFi for data flow automation.
- Deep understanding of object storage systems and cloud data architectures.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Exposure to the Data Science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow.
- Experience working in cross-functional teams with Data Scientists and ML Engineers.
- Cloud certifications or relevant technical certifications are a plus.
Things you'll enjoy: Join us on a life-long transformative journey: the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career. You'll also:
- Enjoy stability, challenges, and a long-term career free from boring daily routines.
- Work with advanced data and cloud technologies to drive innovation.
- Collaborate with cross-functional teams and helpful colleagues.
- Contribute to innovative projects that have a global impact.
- Utilise our flexible and hybrid working environment.
- Steer your career in whatever direction you choose, across functions and countries.
- Benefit from our investment in your development through award-winning learning programs.
- Progress towards leadership roles or specialized technical paths.
- Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension).
You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you!
Important to note: As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
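A minimal Airflow sketch of the ingest-and-validate pipeline pattern this role describes; the source URL, file paths, column name, and quality threshold are hypothetical.

```python
# Minimal Airflow ingest-then-validate pipeline sketch (all names hypothetical).
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest(**_):
    df = pd.read_csv("https://example.com/telemetry.csv")  # hypothetical source
    df.to_parquet("/tmp/telemetry.parquet")

def validate(**_):
    df = pd.read_parquet("/tmp/telemetry.parquet")
    # Data quality gate: fail the task (and trigger alerting) on bad loads.
    if df.empty or df["sensor_id"].isna().mean() > 0.01:
        raise ValueError("data quality check failed")

with DAG(
    dag_id="telemetry_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    ingest_task >> validate_task  # validate only after a successful ingest
```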
Posted 2 months ago
5.0 - 8.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Role Purpose: The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations.
Do:
- Provide adequate support in architecture planning, migration, and installation for new projects in your own tower (platform/database/middleware/backup)
- Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution
- Conduct technology capacity planning by reviewing current and future requirements
- Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable
- Strategize and implement disaster recovery plans, and create and implement backup and recovery plans
- Manage the day-to-day operations of the tower: troubleshoot issues, conduct root cause analysis (RCA), and develop fixes to avoid similar issues
- Plan for and manage upgrades, migration, maintenance, backup, installation, and configuration functions for your own tower
- Review the technical performance of your own tower and deploy ways to improve efficiency, fine-tune performance, and reduce performance challenges
- Develop a shift roster for the team to ensure no disruption in the tower
- Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc.
- Provide weekly status reports to the client leadership team and internal stakeholders on database activities, covering progress, updates, status, and next steps
- Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness
Team Management:
- Resourcing: Forecast talent requirements as per current and future business needs; hire adequate and right resources for the team; train direct reportees to make right recruitment and selection decisions
- Talent Management: Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool of HiPos and ensure their career progression within the organization; promote diversity in leadership positions
- Performance Management: Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports; ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs, at their level and the levels below
- Employee Satisfaction and Engagement: Lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team; proactively challenge the team with larger and enriching projects/initiatives for the organization or team; exercise employee recognition and appreciation
Job Title: SAP CPI (Cloud Platform Integration) Consultant
Summary: We are looking for an experienced SAP CPI Consultant to design, develop, and manage integration solutions using SAP Cloud Platform Integration. The ideal candidate will have a strong background in SAP integration technologies and a deep understanding of business processes and data flow between SAP and third-party systems.
Key Responsibilities:
- Design and implement integration solutions using SAP CPI for cloud and hybrid environments.
- Collaborate with business analysts and stakeholders to gather and analyze integration requirements.
- Configure adapters, mappings, and integration flows in CPI.
- Develop and maintain technical documentation for integration processes.
- Monitor and troubleshoot integration scenarios to ensure data accuracy and performance.
- Support existing integrations and implement enhancements as needed.
- Stay current with SAP CPI updates, best practices, and emerging technologies.
Required Qualifications: 7+ years of experience in SAP CPI or SAP PI/PO. Proficiency in XML, JSON, SOAP, REST, and Web Services. Strong understanding of SAP modules such as S/4HANA, ERP, and SuccessFactors. Experience with integration tools and middleware platforms.
Mandatory Skills: SAP HANA Cloud Integration.
Experience: 5-8 Years.
Posted 2 months ago
3.0 - 7.0 years
6 - 16 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Hope you are doing well!
This is Jogeshwari from the Getronics Talent Acquisition team. We have multiple opportunities for GCP Data Engineers. Please find the company profile and job description below. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to jogeshwari.k@getronics.com.
Company: Getronics (permanent role)
Client: Automobile Industry
Experience Required: 3+ years in IT and a minimum of 2+ years in GCP Data Engineering
Location: Chennai
Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- 6+ years of professional experience in data engineering, data product development, and software product launches.
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Regards,
Jogeshwari
Senior Specialist
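Since the role calls for streaming pipelines on Pub/Sub, here is a minimal subscriber sketch; the project and subscription names are hypothetical.

```python
# Minimal Pub/Sub streaming pull sketch (project and subscription hypothetical).
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "orders-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"received: {message.data!r}")
    message.ack()  # acknowledge so the message is not redelivered

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull_future.result(timeout=30)  # block while messages stream in
except TimeoutError:
    streaming_pull_future.cancel()
    streaming_pull_future.result()
```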
Posted 2 months ago
3.0 - 5.0 years
13 - 17 Lacs
Gurugram
Work from Office
Senior Analyst - GCP Data Engineer: Elevate Your Impact Through Innovation and Learning
Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets like Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, meritocracy-based culture that prioritizes continuous learning, skill development, and work-life balance.
About Data Analytics (DA): Data Analytics is one of the highest-growth practices within Evalueserve, providing rewarding career opportunities. Established in 2014, the global DA team extends beyond 1,000+ (and growing) data science professionals across data engineering, business intelligence, digital marketing, advanced analytics, technology, and product engineering. Our more tenured teammates, some of whom have been with Evalueserve since it started more than 20 years ago, have enjoyed leadership opportunities in different regions of the world across our seven business lines.
What you will be doing at Evalueserve:
- Data Pipeline Development: Design and implement scalable ETL (Extract, Transform, Load) pipelines using tools like Cloud Dataflow, Apache Beam or Spark, and BigQuery.
- Data Integration: Integrate various data sources into unified data warehouses or lakes, ensuring seamless data flow.
- Data Transformation: Transform raw data into analyzable formats using tools like dbt (data build tool) and Dataflow.
- Performance Optimization: Continuously monitor and optimize data pipelines for speed, scalability, and cost-efficiency.
- Data Governance: Implement data quality standards, validation checks, and anomaly detection mechanisms.
- Collaboration: Work closely with data scientists, analysts, and business stakeholders to align data solutions with organizational goals.
- Documentation: Maintain detailed documentation of workflows and adhere to coding standards.
What we're looking for:
- Proficiency in Python/PySpark and SQL for data processing and querying.
- Expertise in GCP services like BigQuery, Cloud Storage, Pub/Sub, Cloud Composer, and Dataflow.
- Familiarity with data warehouse and lakehouse principles and distributed data architectures.
- Strong problem-solving skills and the ability to handle complex projects under tight deadlines.
- Knowledge of data security and compliance best practices.
- Certification: GCP Professional Data Engineer.
Follow us on https://www.linkedin.com/company/evalueserve/
Click here to learn more about what our leaders are saying about our achievements: an AI-powered supply chain optimization solution built on Google Cloud; how Evalueserve is now leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and accelerate AI capabilities; and how Evalueserve has climbed 16 places on the 50 Best Firms for Data Scientists in 2024! Want to learn more about our culture and what it's like to work with us?
Write to us at: careers@evalueserve.com
Disclaimer: The following job description serves as an informative reference for the tasks you may be required to perform. However, it does not constitute an integral component of your employment agreement and is subject to periodic modifications to align with evolving circumstances.
Please note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the Background Verification Process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
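As a sketch of the BigQuery-centred pipeline work this posting describes, here is a minimal batch load from Cloud Storage; the project, bucket, and table names are hypothetical.

```python
# Minimal BigQuery batch-load sketch (bucket, dataset, and table hypothetical).
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/orders_*.csv",
    "my-project.analytics.orders",
    job_config=job_config,
)
load_job.result()  # wait for the load job to complete

table = client.get_table("my-project.analytics.orders")
print(f"loaded {table.num_rows} rows")
```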
Posted 2 months ago
4.0 - 9.0 years
10 - 18 Lacs
Chennai
Hybrid
Role & responsibilities:
- Bachelor's Degree
- 2+ years in GCP services: BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Memorystore (Redis), Airflow, Cloud Storage
- 2+ years in data transfer utilities
- 2+ years in Git or any other version control tool
- 2+ years in Confluent Kafka (see the producer sketch below)
- 1+ years of experience in API development
- 2+ years in an Agile framework
- 4+ years of strong experience in Python and PySpark development
- 4+ years of shell scripting to develop ad hoc jobs for data importing/exporting
Google Cloud Platform: BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, API
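A minimal Confluent Kafka producer sketch to go with the Kafka requirement above; the broker address, topic, and event fields are hypothetical.

```python
# Minimal Confluent Kafka producer sketch (broker and topic hypothetical).
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker1:9092"})  # hypothetical broker

def delivery_report(err, msg):
    # Called once per message to confirm delivery or surface the error.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()} [{msg.partition()}]")

event = {"order_id": "o-123", "amount": 49.5}
producer.produce(
    "orders",                                 # hypothetical topic
    value=json.dumps(event).encode("utf-8"),
    callback=delivery_report,
)
producer.flush()  # block until outstanding messages are delivered
```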
Posted 2 months ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Data Engineer (ETL, Big Data, Hadoop, Spark, GCP) at Assistant Vice President level, located in Pune, India, you will be responsible for developing and delivering engineering solutions to achieve business objectives. You are expected to have a strong understanding of crucial engineering principles within the bank and to be skilled in root cause analysis while addressing enhancements and fixes for product reliability and resiliency. Working independently on medium to large projects with strict deadlines, you will collaborate in a cross-application technical environment and demonstrate a solid hands-on development track record within an agile methodology. Furthermore, this role involves collaborating with a globally dispersed team and is integral to the development of the Compliance tech internal team in India, delivering enhancements in compliance tech capabilities to meet regulatory commitments.
Your key responsibilities will include analyzing data sets; designing and coding stable, scalable data ingestion workflows and integrating them with existing workflows; and developing analytics algorithms on the ingested data. You will also work on data sourcing in Hadoop and GCP, owning unit testing, UAT deployment, end-user sign-off, and production go-live. Root cause analysis skills will be essential for identifying bugs and issues and supporting the production support and release management teams. You will operate in an agile scrum team and ensure that new code is thoroughly tested at both unit and system levels.
To excel in this role, you should have over 10 years of coding experience with reputable organizations, hands-on experience with Bitbucket and CI/CD pipelines, and proficiency in Hadoop, Python, Spark, SQL, Unix, and Hive. A basic understanding of on-prem and GCP data security, as well as hands-on development experience with large ETL/big data systems (with GCP experience being a plus), is required. Familiarity with cloud services such as Cloud Build, Artifact Registry, Cloud DNS, and Cloud Load Balancing, along with Dataflow, Cloud Composer, Cloud Storage, and Dataproc, is essential. Additionally, knowledge of data quality dimensions and data visualization is beneficial.
You will receive comprehensive support, including training and development opportunities, coaching from experts in your team, and a culture of continuous learning to facilitate your career progression. The company fosters a collaborative and inclusive work environment, empowering employees to excel together every day. As part of Deutsche Bank Group, we encourage applications from all individuals and promote a positive and fair workplace culture. For further details about our company and teams, please visit our website: https://www.db.com/company/company.htm.
Posted 2 months ago
1.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Role Overview: We are looking for a driven and detail-oriented professional to join our team. The candidate is expected to gain a strong understanding of internal processes and data flows, proactively analyze issues, and drive automation initiatives across various systems and reporting functions.
Key Responsibilities:
- Understand business processes and data flows to identify bottlenecks and improvement areas.
- Analyze and troubleshoot system errors or anomalies independently.
- Automate manual processes and reports to increase efficiency and accuracy.
- Collaborate effectively with cross-functional teams, including external vendors and partners, to ensure seamless transaction flows and compliance with regulatory/statutory requirements.
- Define, develop, and communicate key performance metrics and insights to stakeholders and senior management.
Ideal Candidate Profile:
- Exceptional written and verbal communication skills.
- Strong analytical and problem-solving abilities.
- Ability to multitask and manage a wide variety of tasks in a dynamic environment.
- Technical expertise in Java, Python, or similar programming languages.
- Familiarity with process automation tools and data analysis frameworks is a plus.
- Understanding of stock broking, trading, or financial services is an added advantage.
Qualification: Bachelor's degree in Engineering, Computer Science, Mathematics, Statistics, or a related discipline, or an MBA from a reputed institute. Minimum 1+ year of experience in a relevant role involving analytics, automation, or operations in a technical domain.
Why Join Us:
- Work on real-time, high-impact projects that shape core operations.
- Gain cross-functional exposure in financial services, technology, and compliance.
- Be part of a collaborative and innovation-driven team.
PhonePe Full-Time Employee Benefits (not applicable for intern or contract roles):
- Insurance Benefits: Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance
- Wellness Program: Employee Assistance Program, Onsite Medical Center, Emergency Support System
- Parental Support: Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program
- Mobility Benefits: Relocation benefits, Transfer Support Policy, Travel Policy
- Retirement Benefits: Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment
- Other Benefits: Higher Education Assistance, Car Lease, Salary Advance Policy
Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog.
Life at PhonePe
PhonePe in the news
Posted 2 months ago
4.0 - 7.0 years
0 Lacs
Pune
Hybrid
Job Title: GCP Data Engineer
Location: Pune, India
Experience: 4 to 7 Years
Job Type: Full-Time
Job Summary: We are looking for a highly skilled GCP Data Engineer with 4 to 7 years of experience to join our data engineering team in Pune. The ideal candidate should have strong experience working with Google Cloud Platform (GCP), including Dataproc and Cloud Composer (Apache Airflow), and must be proficient in Python, SQL, and Apache Spark. The role involves designing, building, and optimizing data pipelines and workflows to support enterprise-grade analytics and data science initiatives.
Key Responsibilities:
- Design and implement scalable and efficient data pipelines on GCP, leveraging Dataproc, BigQuery, Cloud Storage, and Pub/Sub.
- Develop and manage ETL/ELT workflows using Apache Spark, SQL, and Python.
- Orchestrate and automate data workflows using Cloud Composer (Apache Airflow), as in the sketch below.
- Build batch and streaming data processing jobs that integrate data from various structured and unstructured sources.
- Optimize pipeline performance and ensure cost-effective data processing.
- Collaborate with data analysts, scientists, and business teams to understand data requirements and deliver high-quality solutions.
- Implement and monitor data quality checks, validation, and transformation logic.
Required Skills:
- Strong hands-on experience with Google Cloud Platform (GCP)
- Proficiency with Dataproc for big data processing and Apache Spark
- Expertise in Python and SQL for data manipulation and scripting
- Experience with Cloud Composer / Apache Airflow for workflow orchestration
- Knowledge of data modeling, warehousing, and pipeline best practices
- Solid understanding of ETL/ELT architecture and implementation
- Strong troubleshooting and problem-solving skills
Preferred Qualifications:
- GCP Data Engineer or Cloud Architect Certification
- Familiarity with BigQuery, Dataflow, and Pub/Sub
- Experience with CI/CD and DevOps tools in data engineering workflows
- Exposure to Agile methodologies and team collaboration tools
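A minimal sketch of a Composer DAG submitting a PySpark job to Dataproc, as referenced above; the project, region, cluster, and GCS path are hypothetical.

```python
# Minimal Composer DAG submitting a PySpark job to Dataproc (names hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PYSPARK_JOB = {
    "reference": {"project_id": "my-project"},
    "placement": {"cluster_name": "etl-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/transform.py"},
}

with DAG(
    dag_id="dataproc_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    submit = DataprocSubmitJobOperator(
        task_id="run_transform",
        project_id="my-project",
        region="asia-south1",
        job=PYSPARK_JOB,
    )
```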
Posted 2 months ago
3.0 - 8.0 years
4 - 8 Lacs
Chennai
Work from Office
About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must have skills: Snowflake Data Warehouse, Core Banking
Good to have skills: AWS Big Data
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the optimization of data processing workflows to enhance efficiency.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse, Core Banking.
- Good To Have Skills: Experience with AWS Big Data.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with data governance and data quality frameworks.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Chennai office.
- A 15 years full time education is required.
Posted 2 months ago
15.0 - 20.0 years
3 - 7 Lacs
Navi Mumbai
Work from Office
About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills: SAP EWM
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and resolves issues within various components of critical business systems. Your typical day will involve collaborating with team members to troubleshoot problems, analyzing system performance, and ensuring that all applications run smoothly to support business operations effectively. You will engage with different teams to gather insights and feedback, which will help in enhancing system functionality and user experience. Your role will be pivotal in maintaining the integrity and efficiency of the software systems that drive the organization's success.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their skills and knowledge.
- Monitor system performance and proactively identify areas for improvement.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP Warehouse Management System (WMS).
- Strong analytical skills to diagnose and resolve software issues.
- Experience with system integration and data flow management.
- Familiarity with troubleshooting methodologies and best practices.
- Ability to work collaboratively in a team-oriented environment.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Warehouse Management System (WMS).
- This position is based at our Mumbai office.
- A 15 years full time education is required.
Posted 2 months ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Oracle CC&B Technical Architecture
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Technical Interfaces Consultant specializing in Oracle Utilities applications, you will be responsible for designing, developing, and maintaining integrations between Oracle CC&B and various edge systems. This role involves close collaboration with onshore and offshore teams to ensure seamless data exchange and system interoperability, aligning with business requirements and industry best practices.
Roles & Responsibilities:
- Design and implement integrations between Oracle CC&B and external systems such as MDM, AMI, IVR, payment gateways, and CRM platforms.
- Develop and maintain web services (SOAP/REST), XAI services, and batch interfaces to facilitate data exchange.
- Ensure data integrity and consistency across integrated systems.
- Analyze system requirements and translate them into technical specifications for interface development.
- Monitor interface performance and troubleshoot issues related to data synchronization, latency, and errors.
- Collaborate with cross-functional teams to resolve integration-related challenges.
- Create and maintain comprehensive documentation for interface designs, configurations, and workflows.
- Ensure compliance with organizational policies, industry standards, and regulatory requirements related to data integration.
- Work closely with onshore counterparts to gather requirements, provide updates, and align on project goals.
- Participate in regular meetings and status reviews to ensure transparency and accountability.
- Provide knowledge transfer and training to team members as needed.
Professional & Technical Skills:
- Proficiency in Oracle Utilities Application Framework (OUAF), Java, PL/SQL, and web services development.
- Strong understanding of utility business processes and data flow between systems.
- Experience with tools and technologies such as XML, XSLT, SOAP, REST, and middleware platforms.
- Experience with Oracle Customer to Meter (C2M) and Meter Data Management (MDM) systems.
- Familiarity with Agile or Scrum project management methodologies.
- Knowledge of security protocols and best practices for data integration.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
Additional Information:
- 6 to 10 years of experience, with at least 2 to 3 projects of implementation experience in Oracle Utilities Application Framework.
- Minimum of 5 years of experience in integrating Oracle CC&B with external systems.
- Minimum 15 years of full-time education; a degree in Engineering will be a plus.
- This position is based at our Bangalore office.
Posted 2 months ago
2.0 - 7.0 years
18 - 22 Lacs
Bengaluru
Work from Office
About The Role
Job Title: Data Science/Data Engineering ML9 - Sales Excellence COE - Data Engineering Specialist
Management Level: ML9
Location: Open
Must have skills: Strong GCP cloud technology experience, BigQuery, Data Science basics
Good to have skills: Building and maintaining data models from different sources
Experience: Minimum 5 year(s) of experience is required
Educational Qualification: Graduate/Postgraduate
Job Summary: The Center of Excellence (COE) makes sure that the sales and pricing methods and offerings of Sales Excellence are effective. The COE supports salespeople through its business partners and Analytics and Sales Operations teams. The Data Engineer helps manage data sources and environments, utilizing large data sets and maintaining their integrity to create models and apps that deliver insights to the organization.
Roles & Responsibilities:
- Build and manage data models that bring together data from different sources.
- Understand the existing data model in SQL Server and help redesign/migrate it to GCP BigQuery.
- Help consolidate and cleanse data for use by the modeling and development teams.
- Structure data for use in analytics applications.
- Lead a team of Data Engineers effectively.
Professional & Technical Skills:
- A bachelor's degree or equivalent
- A minimum of 2 years of strong GCP cloud technology experience
- A minimum of 2 years of advanced SQL knowledge and experience working with relational databases
- A minimum of 2 years of familiarity and hands-on experience with different SQL objects such as stored procedures, functions, and views
- A minimum of 2 years of building data flow components and processing systems to extract, transform, load, and integrate data from various sources
- A basic knowledge of data science models and tools
Additional Information:
Extra credit if you have: an understanding of sales processes and systems; a master's degree in a technical field; experience with Python; experience with quality assurance processes; experience in project management.
You May Also Need: Ability to work flexible hours according to business needs. Must have a good internet connection and a distraction-free environment for working at home, in accordance with local guidelines.
Posted 2 months ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Analytics Practitioner
Project Role Description: Develop business insights using predictive statistical modeling activities. Use analytical tools and techniques to generate and improve decision-making by modeling client, market, and key performance data.
Must have skills: Python (Programming Language)
Good to have skills: Amazon Web Services (AWS), Machine Learning
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Analytics Practitioner, you will engage in developing business insights through predictive statistical modeling activities. Your typical day will involve utilizing various analytical tools and techniques to generate and enhance decision-making processes by modeling client, market, and key performance data. You will collaborate with cross-functional teams to ensure that the insights derived are actionable and aligned with business objectives, ultimately contributing to the overall success of the organization.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor and evaluate the effectiveness of implemented solutions and strategies.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Python (Programming Language), Gen AI, Agentic AI, and Machine Learning
- Good To Have Skills: Experience with Amazon Web Services (AWS) and Machine Learning
- Strong understanding of data analysis techniques and methodologies
- Experience with data visualization tools to present insights effectively
- Familiarity with statistical modeling and predictive analytics
Additional Information:
- The candidate should have a minimum of 5 years of experience in Python (Programming Language).
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
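A minimal predictive-modeling sketch in the spirit of this role, using scikit-learn on synthetic data; the features stand in for client/market KPIs and are not details from the posting.

```python
# Minimal predictive-modeling sketch with scikit-learn (synthetic data only).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for client/market KPI features.
X, y = make_classification(n_samples=2000, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
probs = model.predict_proba(X_test)[:, 1]
print(f"test AUC: {roc_auc_score(y_test, probs):.3f}")
```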
Posted 2 months ago
15.0 - 20.0 years
10 - 14 Lacs
Hyderabad
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: SAP BTP Integration Suite
Good to have skills: SAP PO/PI & APIs Development
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, ensuring that the applications meet the required standards and functionality while adapting to any changes in project scope or requirements.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP BTP Integration Suite.
- Good To Have Skills: Experience with SAP PO/PI & APIs Development.
- Strong understanding of application design principles and best practices.
- Experience in managing application lifecycle and deployment processes.
- Familiarity with integration patterns and data flow management.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP BTP Integration Suite.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 2 months ago
6.0 - 11.0 years
7 - 17 Lacs
Gurugram
Work from Office
We rely heavily on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP. As a Data Engineer, you will develop end-to-end ETL/ELT pipelines.
Posted 2 months ago