Job Title: Data Warehouse Developer - Mainframe Specialist
Experience: 4+ Years
Location: Pan India
Employment Type: Full-time

Job Summary:
We are seeking a Data Warehouse Developer with strong experience in mainframe-based data processing and ETL development. The ideal candidate will have a solid understanding of DWH concepts and hands-on experience with mainframe technologies to support large-scale data integration and reporting solutions.

Key Responsibilities:
- Design, develop, and maintain ETL workflows using mainframe tools and technologies (e.g., JCL, COBOL, DB2).
- Perform data extraction, transformation, and loading from legacy systems into enterprise data warehouses.
- Collaborate with business analysts and data architects to understand data requirements and deliver scalable solutions.
- Ensure data quality, consistency, and performance across large datasets.
- Participate in performance tuning, troubleshooting, and production support activities.

Primary Skills:
- Mainframe Technologies: JCL, COBOL, DB2 (hands-on)
- Data Warehousing: strong understanding of DWH concepts and architecture

Good to Have:
- Experience with ETL tools such as IBM DataStage or Informatica
- Familiarity with Unix scripting and modern languages such as Python
- Exposure to orchestration tools such as Apache Airflow (see the sketch after this listing)

Preferred Qualifications:
- Experience in Agile development environments
- Exposure to cloud-based data platforms is a plus
- Strong analytical and communication skills
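For orientation, here is a minimal sketch of the kind of Airflow orchestration the "Good to Have" item above refers to: a daily DAG that picks up a mainframe batch extract and loads it into warehouse staging. It assumes Airflow 2.x; the DAG name, script names, and paths are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="mainframe_dwh_load",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Pull the nightly flat-file extract produced by the mainframe
    # batch (JCL/COBOL) job into the landing zone.
    fetch_extract = BashOperator(
        task_id="fetch_mainframe_extract",
        bash_command="fetch_extract.sh /landing/dwh_extract.dat",  # hypothetical script
    )
    # Load the extract into the warehouse staging area.
    load_staging = BashOperator(
        task_id="load_staging",
        bash_command="load_staging.sh /landing/dwh_extract.dat",  # hypothetical script
    )
    fetch_extract >> load_staging
```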
Role: Data Engineer
Years of Experience: 8-12 years
Preferred Location: HYD 1
Shift Timing (IST): 11:00 AM - 08:30 PM
Short Description: Data Engineer with ADF, Databricks
Anticipated Onboarding Date: 1-Apr-2025

Engagement & Project Overview:
We have multiple data applications under the Financial Accounting span that are planned to migrate from on-prem platforms (Mainframe or DataStage) to Azure Cloud.

Primary Responsibilities:
- At least 8 to 12 years of relevant experience in data engineering technologies
- Design, implement, and maintain data applications across all phases of software development
- Interact with the business to gather requirements and convert them into design documents
- Healthcare domain knowledge is good to have
- Strong analytical and problem-solving skills
- Well versed in agile processes; open to application support work and flexible working hours

Must Have Skills:
- Azure Data Factory (ADF), Databricks, and PySpark (see the sketch after this listing)
- Experience with any database
- Excellent communication

Nice To Have Skills:
- Snowflake
- Healthcare domain knowledge
- Any cloud platform, preferably Azure
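As a flavor of the ADF/Databricks work described above, here is a minimal PySpark sketch of a migration-style transform: read a legacy (DataStage-era) CSV extract and land it as a Delta table. It assumes a Databricks runtime (or Spark with Delta Lake configured); the paths, column names, and table names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fin_acct_migration").getOrCreate()

# Read the legacy extract (header row; schema inferred for brevity).
ledger = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("/mnt/landing/fin_acct/ledger.csv")  # hypothetical mount path
)

# Light cleanup: rename a legacy column and stamp the load date.
cleaned = (
    ledger.withColumnRenamed("ACCT_NO", "account_no")  # hypothetical column
    .withColumn("load_date", F.current_date())
)

# Persist as a managed Delta table for downstream reporting.
cleaned.write.format("delta").mode("overwrite").saveAsTable("fin_acct.ledger")
```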
Job Summary:
The Senior Technical Consultant in Workday (Technical) - Workday Studio and Workday (Technical) - EIB is responsible for providing technical expertise related to Workday Studio and Workday Enterprise Interface Builder (EIB). The role involves configuring, customizing, and optimizing the Workday platform to meet the organization's business needs effectively.

Key Responsibilities:
1. Collaborate with stakeholders to understand business requirements and translate them into technical solutions within the Workday platform.
2. Develop, implement, and maintain Workday Studio integrations for data imports, exports, and transformations.
3. Design and build EIB solutions to streamline data migration and integration processes.
4. Provide technical support and troubleshooting for Workday Studio and EIB configurations.
5. Ensure data accuracy, data integrity, and system performance within the Workday environment.
6. Stay updated on Workday platform enhancements and recommend best practices for system optimization.

Skill Requirements:
1. Proficiency in configuring and customizing Workday Studio for developing integrations.
2. Strong understanding of Workday Enterprise Interface Builder (EIB) for data migration and integration tasks.
3. Knowledge of Workday technical tools and methodologies for effective system enhancement.
4. Ability to collaborate with cross-functional teams and communicate technical solutions effectively.
5. Analytical mindset with problem-solving skills to address complex technical challenges within the Workday platform.

Certifications: Workday Studio and Workday Integration Cloud certifications are preferred.

Skill (Primary): HCM (Apps) - Cloud-based Applications - HCM - Technical - Workday (Technical) - Workday Studio
Job Summary:
The Senior Technical Lead in DevOps Solutions, Kubernetes, and Azure DevOps is responsible for overseeing the technical aspects of implementing and managing DevOps solutions while leveraging Kubernetes and the Azure platform. The role involves leading a team to ensure the successful deployment and maintenance of automation, CI/CD pipelines, and cloud infrastructure.

Key Responsibilities:
1. Lead the design, implementation, and maintenance of DevOps solutions, leveraging Kubernetes and Azure DevOps tools.
2. Manage and mentor a team of DevOps engineers to ensure efficient delivery and operation of software projects.
3. Collaborate with development and operations teams to streamline processes and improve deployment pipelines.
4. Implement best practices for infrastructure as code, configuration management, and monitoring in a cloud environment.
5. Troubleshoot and resolve technical issues related to DevOps tools, Kubernetes clusters, and Azure services (see the sketch after this listing).
6. Stay updated on industry trends and technologies to recommend enhancements and optimizations for the DevOps environment.

Skill Requirements:
1. Strong expertise in DevOps solutions, Kubernetes, and Azure DevOps tools.
2. Proficiency in designing and implementing CI/CD pipelines for automated software delivery.
3. Experience with containerization technologies such as Docker and container orchestration with Kubernetes.
4. Deep understanding of cloud services on Azure, including resource management, networking, security, and monitoring.
5. Solid scripting skills (e.g., PowerShell, Bash) for automation and configuration management tasks.
6. Ability to lead and mentor a team, fostering collaboration and innovation in a fast-paced environment.
7. Excellent problem-solving skills and the ability to communicate technical concepts effectively.

Certifications: Relevant certifications in DevOps, Kubernetes, and Azure (e.g., Certified Kubernetes Administrator, Microsoft Certified: Azure DevOps Engineer Expert) would be a plus.

Skill (Primary): Modern Application Development - DevOps - API - Kubernetes
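As an illustration of the cluster troubleshooting in responsibility 5, here is a minimal sketch using the official Kubernetes Python client to flag pods stuck outside the Running/Succeeded phases. It assumes a local kubeconfig with access to the cluster; error handling is deliberately omitted.

```python
from kubernetes import client, config

config.load_kube_config()  # use the current kubectl context
v1 = client.CoreV1Api()

# List every pod in the cluster and report any that are not healthy.
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    phase = pod.status.phase
    if phase not in ("Running", "Succeeded"):
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")
```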
Key Responsibilities:
- Lead and participate in the design, configuration, testing, and deployment of SAP FICO solutions.
- Collaborate with finance stakeholders to gather requirements and provide expert guidance.
- Identify opportunities for system enhancements and process improvements.
- Provide support for SAP FICO modules, including incident and problem resolution.
- Manage or support finance-related SAP projects with quality and timeliness.
- Prepare and maintain documentation such as functional specifications, test scripts, and user manuals.
- Conduct user training and ensure adoption of SAP functionalities.
- Work on integrations with SAP modules (SD, MM, PS, HR) and non-SAP systems.
- Ensure compliance with internal policies and external regulations.

Required Qualifications & Experience:
- Bachelor's degree in Finance, Accounting, IT, Computer Science, or a related field.
- 6+ years of SAP FICO experience covering GL, AP, AR, AA, CCA, PCA, Internal Orders, New GL, and Cash Management (preferred).
- Experience with PS integration, CO-PA, and Product Costing preferred.
- 2-3+ full-lifecycle SAP FICO implementations.
- Strong configuration, problem-solving, and analytical skills.
- Excellent English communication (verbal and written).
- Ability to work both independently and in global teams.
- Strong stakeholder management skills.

Desirable Skills (Bonus Points):
- S/4HANA Finance
- SAP BPC (Business Planning & Consolidation)
- Treasury modules (TRM)
- Oil & Gas / Energy industry background
- SAP certifications (FICO or S/4HANA)
Job Title: Big Data Scala Developer
Experience: 5-9 Years

Job Description:
We are seeking a skilled Big Data Scala Developer with strong expertise in building scalable data solutions. The ideal candidate will have hands-on experience in Scala, PySpark, Hive/SQL, and Kafka, with a proven ability to work in large-scale data processing environments. Strong communication skills and prior exposure to banking/financial projects are highly desirable.

Key Responsibilities:
- Design, develop, and optimize data pipelines using Scala and PySpark.
- Work with Hive/SQL to manage and query large datasets.
- Implement real-time data streaming solutions with Kafka (see the sketch after this listing).
- Collaborate with cross-functional teams to deliver high-quality, scalable solutions.
- Ensure adherence to coding standards and performance best practices.

Required Skills:
- Proficiency in Scala and PySpark (core development and optimization).
- Strong knowledge of Hive/SQL for big data querying and transformation.
- Experience with Kafka for data streaming and event-driven processing.
- Excellent communication skills to interact with technical and business stakeholders.
- Experience working on banking or financial domain projects is a plus.
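For the Kafka streaming bullet above, here is a minimal PySpark Structured Streaming sketch that subscribes to a topic and decodes the payload. The broker address and topic name are hypothetical, and the job needs the spark-sql-kafka connector package on its classpath; a real pipeline would sink to Hive or Delta rather than the console.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("txn_stream").getOrCreate()

# Subscribe to the transactions topic and decode the message payload.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "transactions")               # hypothetical topic
    .load()
    .select(col("value").cast("string").alias("payload"))
)

# Console sink for illustration only.
query = events.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```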
JD:

Cloud Infrastructure Design & Implementation:
- Design, deploy, and manage scalable, highly available, and fault-tolerant cloud infrastructure on GCP using services such as Compute Engine, Google Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, VPC, Cloud Load Balancing, and Cloud DNS.
- Implement Infrastructure as Code (IaC) using tools like Terraform or Google Cloud Deployment Manager to automate the provisioning and management of GCP resources.
- Develop and maintain robust CI/CD pipelines for deploying applications and infrastructure to GCP.

Platform Operations & Optimization:
- Monitor, troubleshoot, and optimize the performance, cost, and availability of GCP resources and applications using Google Cloud's operations suite (Cloud Monitoring, Cloud Logging, Cloud Trace, Cloud Profiler).
- Implement and manage disaster recovery and backup strategies for cloud-based systems.
- Identify and implement cost-optimization strategies for GCP resources (see the sketch after this listing).

Security & Compliance:
- Implement and enforce security best practices within GCP environments, including Identity and Access Management (IAM), network security policies, data encryption, and vulnerability management.
- Ensure compliance with relevant industry standards and internal security policies.

Collaboration & Support:
- Collaborate closely with development, DevOps, and architecture teams to define cloud solutions, integrate applications, and ensure adherence to best practices.
- Provide technical expertise and support for cloud-based applications, troubleshooting issues and ensuring timely resolution.
- Document cloud infrastructure, configurations, and operational procedures.

Innovation & Continuous Improvement:
- Stay up to date with the latest GCP services, features, and best practices.
- Proactively identify opportunities for automation, efficiency improvements, and innovation within the cloud environment.

Required Skills & Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field (or equivalent practical experience).
- Experience: 5+ years of hands-on experience designing, deploying, and managing solutions on Google Cloud Platform.
- GCP Expertise: In-depth knowledge and hands-on experience with core GCP services (e.g., Compute Engine, GKE, Cloud SQL, Cloud Storage, VPC, IAM, Cloud Functions, Cloud Pub/Sub). Strong understanding of cloud architecture principles (scalability, reliability, security, cost optimization).
- Infrastructure as Code (IaC): Proficiency with Terraform for managing GCP infrastructure. Experience with configuration management tools (e.g., Ansible, Puppet, Chef) is a plus.
- Containerization & Orchestration: Strong understanding of Docker and Kubernetes. Hands-on experience with Google Kubernetes Engine (GKE).
- Scripting & Automation: Proficiency in at least one scripting language (e.g., Python, Bash, Go). Experience with CI/CD tools (e.g., Cloud Build, Jenkins, GitLab CI).
- Networking: Solid understanding of networking concepts (VPC, subnets, firewall rules, load balancing, DNS) within a cloud environment.
- Security: Knowledge of cloud security best practices, identity and access management, and data protection.
- Monitoring & Logging: Experience with GCP's operations suite (Cloud Monitoring, Cloud Logging) for observability and troubleshooting.
- Problem-Solving: Excellent analytical and problem-solving skills with strong attention to detail.
- Communication: Strong verbal and written communication skills, with the ability to collaborate effectively with cross-functional teams.
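As one concrete example of the cost-optimization bullet above, here is a minimal Python sketch using the google-cloud-storage client to label every bucket with a cost-center label, so storage spend can be attributed in billing reports. It assumes application-default credentials; the label key and value are hypothetical.

```python
from google.cloud import storage

client = storage.Client()

# Tag every bucket that lacks a cost-center label so that storage spend
# can be broken down by label in GCP billing reports.
for bucket in client.list_buckets():
    labels = bucket.labels
    if "cost-center" not in labels:        # hypothetical label key
        labels["cost-center"] = "unassigned"
        bucket.labels = labels
        bucket.patch()                     # persist the metadata change
        print(f"labeled {bucket.name}")
```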
JD:

BASIC QUALIFICATIONS
- Bachelor's degree in Computer Science or a related discipline, or equivalent industry experience
- 3+ years of software development experience
- 5+ years of experience with the CAD/Creo suite of products
- Strong Computer Science fundamentals and problem-solving skills
- Experience coding in C and Java
- Experience in the design, development (coding), implementation, and support of Creo customizations and workflows
- Design and code/implement Creo customizations using J-Link and Pro/TOOLKIT
- Experience with Creo versions 8 and 11
- Experience with Creo configurations, template libraries, start parts, and drawings

PREFERRED QUALIFICATIONS
- Expertise in Windchill EPM/Doc/Part/BOM management, version control, change management, and file vault configurations
- Master's degree in Computer Science or a related discipline
- 7+ years of work experience in CAD/PDM systems
Job Summary:
We are looking for an experienced UI Developer with strong expertise in React.js to join our team. The ideal candidate will be responsible for building intuitive, scalable, and high-performing user interfaces. You will work closely with product managers, UX designers, and backend engineers to deliver seamless user experiences.

Key Responsibilities:
- Develop, enhance, and maintain user-facing features using React.js and modern JavaScript frameworks.
- Collaborate with UX/UI designers to translate design wireframes and prototypes into responsive web applications.
- Write clean, reusable, and efficient code following best practices.
- Optimize applications for maximum speed and scalability.
- Integrate frontend components with RESTful APIs/GraphQL.
- Conduct thorough code reviews and provide constructive feedback to peers.
- Stay updated on the latest frontend technologies and suggest improvements.
- Debug and resolve UI/UX-related issues across different browsers and devices.

Required Skills & Qualifications:
- 4+ years of proven experience in frontend/UI development.
- Strong proficiency in React.js, JavaScript (ES6+), HTML5, and CSS3.
- Experience with state management libraries (Redux, Context API, MobX, etc.).
- Good understanding of RESTful APIs, JSON, and asynchronous programming.
- Hands-on experience with version control systems (Git, GitHub/Bitbucket).
- Experience in responsive design and cross-browser compatibility.
- Knowledge of build tools like Webpack, Babel, and NPM/Yarn.
- Familiarity with UI testing frameworks (Jest, React Testing Library, Cypress) is a plus.
- Strong problem-solving skills and attention to detail.

Good to Have:
- Experience with TypeScript.
- Exposure to Next.js or other SSR frameworks.
- Knowledge of Agile/Scrum methodologies.
- Understanding of CI/CD pipelines and deployment processes.
JD:
- Overall experience of 7-9 years; intermediate-to-expert level
- Strong in Java, Spring Boot, and Struts
- Ability to create APIs on the Apigee platform
- Azure skill set, with AZ-900 certification
- Hands-on experience building microservices and integrating with different systems using JDBC, File, HTTP, SOAP/XSD, REST, and JSON
- Well versed in JavaScript/Groovy/Python
- Understanding of and exposure to security concepts such as SSL, Basic Authentication, and OAuth
- Understanding of Confluent Kafka and Kafka Streams (see the sketch after this listing)
- Experience with relational databases and basic SQL development
- Experience working through the Waterfall or Agile software development life cycle
- Exposure to CI/CD tools such as Jenkins, GitHub, etc.
- Understanding of IT Service Management
- Strong communication skills (written and verbal) and a team player
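To illustrate the Kafka item above, here is a minimal consumer sketch using the confluent-kafka Python client (the JD lists Python among its scripting languages; Kafka Streams itself is a Java/Scala library). The broker, consumer group, and topic names are hypothetical.

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # hypothetical broker
    "group.id": "orders-consumer",          # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])              # hypothetical topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)    # block up to 1s for a record
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()                        # commit offsets and leave the group
```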