JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Our industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us on our website and social channels.

Inviting applications for the role of Consultant - Data Engineer (AWS, Python, Spark, Kafka for ETL)!

Responsibilities:
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka (see the sketch after this list).
- Integrate structured and unstructured data from various sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using Big Data technologies such as Apache Hadoop and Apache Spark, with appropriate cloud services such as Amazon AWS.
- Build data pipelines by implementing ETL (Extract-Transform-Load) processes.
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements and user stories in business meetings, assess the impact of requirements on different platforms and applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and maintenance while providing high availability and improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work unchanged.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.
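To make the first responsibility concrete, here is a minimal PySpark sketch of a Kafka-to-S3 streaming ETL job of the kind the role describes. The broker address, topic name, bucket, and event schema are illustrative placeholders, not details from the listing, and the Kafka source assumes the spark-sql-kafka connector is on the classpath.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = (SparkSession.builder
         .appName("kafka-to-s3-etl")
         .getOrCreate())

# Hypothetical schema for the incoming JSON messages.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("ts", StringType()),
])

# Extract: read the raw event stream from Kafka (placeholder broker/topic).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "orders")
       .load())

# Transform: parse the JSON payload and drop invalid records.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*")
          .where(col("amount") > 0))

# Load: write Parquet files to the data lake (placeholder bucket).
query = (events.writeStream
         .format("parquet")
         .option("path", "s3a://my-data-lake/orders/")
         .option("checkpointLocation", "s3a://my-data-lake/_chk/orders/")
         .start())

query.awaitTermination()

In a production Glue or EMR deployment, the same extract-transform-load shape would be wired to IAM-scoped credentials and monitored for cost and throughput, as the responsibilities above require.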
Qualifications we seek in you!

Minimum qualifications:
- Experience in designing and implementing data pipelines, building data applications, and data migration on AWS.
- Strong experience implementing data lakes using AWS services such as Glue, Lambda, Step Functions, and Redshift.
- Experience with Databricks is an added advantage.
- Strong experience in Python and SQL.
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
- Advanced programming skills in Python for data processing and automation.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with Apache Kafka for real-time data streaming and event processing.
- Proficiency in SQL for data querying and transformation.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred qualifications/skills:
- Master's degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering and Cloud certifications; Databricks certifications.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of Change and Incident Management processes.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a software developer, you will work in a constantly evolving environment driven by technological advances and the strategic direction of the organization. Your primary responsibilities will include creating, maintaining, auditing, and enhancing systems to meet specific needs, often based on recommendations from systems analysts or architects. You will test both hardware and software systems to identify and resolve faults, write diagnostic programs, and design and develop code for operating systems and software to ensure optimal efficiency. Where necessary, you will also provide recommendations for future developments.

Joining us offers numerous benefits, including the opportunity to work on challenging projects and solve complex technical problems. You can expect rapid career growth and the chance to assume leadership roles. Our mentorship program lets you learn from experienced mentors and industry experts, while our global opportunities enable you to collaborate with clients from around the world and gain international experience. We offer competitive compensation packages and benefits. If you are passionate about technology and interested in working on innovative projects with a skilled team, a career as an Infosys Power Programmer could be an excellent choice for you.

To be considered for this role, you must possess the following mandatory skills:
- Proficiency in AWS Glue, AWS Redshift/Spectrum, S3, API Gateway, Athena, Step Functions, and Lambda.
- Experience with Extract-Transform-Load (ETL) and Extract-Load-Transform (ELT) data integration patterns.
- Expertise in designing and building data pipelines.
- Development experience in one or more object-oriented programming languages, preferably Python.

In terms of job specifications, we are looking for candidates who meet the following criteria:
- At least 5 years of hands-on experience developing, testing, deploying, and debugging Spark jobs in Scala on the Hadoop platform.
- Profound knowledge of Spark Core, RDDs, and Spark SQL.
- Familiarity with Spark optimization techniques and best practices.
- Strong understanding of Scala functional programming concepts such as Try, Option, Future, and Collections.
- Proficiency in Scala object-oriented programming, covering classes, traits, objects (singleton and companion), and case classes.
- Sound knowledge of Scala language features, including the type system and implicits/givens.
- Hands-on experience in the Hadoop environment (HDFS/Hive), AWS S3, and EMR.
- Proficiency in Python programming.
- Working experience with workflow orchestration tools such as Airflow and Oozie (a short Airflow sketch follows this description).
- Experience making API calls from Scala.
- Familiarity with file formats such as Apache Avro, Parquet, and JSON.
- Desirable: knowledge of Protocol Buffers and geospatial data analytics.
- Ability to write test cases using frameworks such as ScalaTest.
- Good understanding of build tools such as Gradle and SBT.
- Experience using Git, resolving conflicts, and working with branches.
- Strong programming skills with a focus on data structures and algorithms.
- Excellent analytical and communication skills.

Candidates applying for this position should have:
- 7-10 years of industry experience.
- A BE/B.Tech in Computer Science or an equivalent qualification.
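As a point of reference for the workflow orchestration requirement, here is a minimal Airflow DAG sketch (Airflow DAGs are written in Python even in a Scala-centric Spark shop). The DAG id, schedule, and jar path are hypothetical placeholders, and the Spark job is submitted through a plain spark-submit call rather than any project-specific operator.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_spark_etl",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the compiled Scala Spark application via spark-submit;
    # the artifact path below is a placeholder.
    run_spark_job = BashOperator(
        task_id="spark_submit_etl",
        bash_command=(
            "spark-submit --master yarn --deploy-mode cluster "
            "s3://my-artifacts/etl-assembly.jar"
        ),
    )

In practice the same DAG would add retries, SLAs, and sensors on upstream data, which is where the Airflow/Oozie experience the listing asks for comes into play.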

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are seeking experienced Teamcenter PLM migration professionals with a background in CAD and legacy-system data migrations to Teamcenter Unified (TCUA). If you have hands-on experience with tools such as csv2tcxml and tcxml, and are familiar with TCIN, TCIC, BMIDE, and ITK, we invite you to apply. Positions are available as Consultants, Developers, and Architects, depending on your experience and technical depth.

Open positions and specializations:
- NX to Teamcenter Migration Consultant: mandatory experience in CAD NX to TC migration, analyzing and cleansing NX data, working with tcin_import.exe, TCIN configuration, and Teamcenter-NX integration.
- CATIA to Teamcenter Migration Consultant: mandatory experience in CATIA V5/V6 to TC migration, analyzing CATIA data, scripting with the CATIA API / CATScript, and setting up and managing TCIC and TC-CATIA integrations.
- Teamcenter Enterprise to Unified Migration Consultant: experience extracting and transforming data from TC Enterprise and mapping legacy structures to the TCUA data model using BMIDE.
- Legacy System (SAP / Enovia) to Teamcenter Migration Consultant: experience migrating from legacy PLM systems such as SAP PLM or Enovia to Teamcenter, with skills in data cleansing, extraction, and transformation.

Key responsibilities (all roles):
- Analyze source system data; identify and fix inconsistencies.
- Design and execute robust data migration pipelines using Teamcenter migration tools such as csv2tcxml, tcxml export/import, and ips upload.
- Perform data cleansing, transformation, mapping, and validation (a short validation sketch follows this description).
- Customize or support batch migration utilities using tools such as Teamcenter ITK, XML, batch scripting, SQL, Python, and Java.
- Configure and use modules such as My Teamcenter, Structure Manager, Organization, Classification, and Projects.
- Collaborate with cross-functional teams and lead technical discussions.
- For Architect roles: own the migration plan, lead teams, manage client communication, and ensure successful project delivery.

Required skills:
- 5+ years of experience in Teamcenter PLM.
- Proven experience with at least one migration stream (NX / CATIA / TC Enterprise / SAP / Enovia).
- Strong command of BMIDE, the Teamcenter data model, and the database schema.
- Solid understanding of CAD-PLM integrations and import/export workflows.
- Strong scripting and automation skills in ITK, SQL, Batch, Python, Java, and XML.
- Excellent communication, ownership, and defect-resolution skills.

Nice-to-have skills:
- NX Open API or CATIA API / CATScript.
- Experience with STEP and PDF generation workflows.
- Prior experience in multi-CAD or multi-site Teamcenter environments.

Why join us?
- Work on enterprise-level migration projects for global clients.
- Contribute to cutting-edge PLM transformations.
- Leadership and growth opportunities (Developer-to-Architect track).
- Collaborative team culture, continuous learning, and certifications.
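csv2tcxml and tcxml are Siemens-proprietary utilities whose exact interfaces are not shown here, so the sketch below stays upstream of them: a generic Python pre-validation pass over a CSV extract before it is handed to csv2tcxml. The required column names and rules are hypothetical; the real mapping depends on the BMIDE data model of the target Teamcenter environment.

import csv
import sys

# Hypothetical mandatory columns for the migration extract.
REQUIRED = ["item_id", "revision", "name", "dataset_file"]

def validate(path: str) -> int:
    """Return the number of cleansing issues found in the CSV extract."""
    errors = 0
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing = [c for c in REQUIRED if c not in (reader.fieldnames or [])]
        if missing:
            print(f"missing columns: {missing}")
            return 1
        # Header is line 1, so data rows start at line 2.
        for lineno, row in enumerate(reader, start=2):
            for col in REQUIRED:
                val = row.get(col) or ""
                # Empty mandatory fields are a common legacy-extract defect.
                if not val.strip():
                    print(f"line {lineno}: empty {col}")
                    errors += 1
            item_id = row.get("item_id") or ""
            # Whitespace-padded ids break downstream matching.
            if item_id != item_id.strip():
                print(f"line {lineno}: item_id has surrounding whitespace")
                errors += 1
    return errors

if __name__ == "__main__":
    sys.exit(1 if validate(sys.argv[1]) else 0)

Running such a pass before every csv2tcxml import catches the bulk of cleansing defects early, when they are cheapest to fix in the source extract.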

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Download Now
