0 years
0 Lacs
Gurugram, Haryana, India
Remote
IMEA (India, Middle East, Africa) India LIXIL INDIA PVT LTD Employee Assignment Fully remote possible Full Time 2 June 2025

Sr. Data Migration Specialist

Purpose
The Global Master Data Governance (GMDG) function was created to lead data governance of master data across LIXIL's global operations, through the implementation of transformation projects and the day-to-day operation of data management. The Data Migration Analyst is responsible for executing data migration for transformation projects under GMDG. For the immediate term, this role should contribute to the SAP ECC to SAP S/4 HANA transition across multiple years and multiple regions.

Responsibilities
Engage in a data migration project for transitioning master data and transactional data from SAP ECC to SAP S/4 HANA, encompassing all plants and sales organizations.
Execute data migration activities in alignment with the four SAP ERP implementation phases: SIT1, SIT2, UAT, and Cut-over.
Leverage expertise in SAP ERP master data (Material, Business Partner, and Finance), logistics data, and finance data to ensure accurate data migration.
Utilize technologies such as SAP BODS, SAP LTMC, and SAP LSMW to effectively and accurately transform and migrate data across systems.
Communicate with stakeholders responsible for each data element to understand business requirements and to review data migration quality.
Execute data migration timely and accurately according to the data migration master plan, under the guidance and supervision of specialists.
Collaborate and contribute in an agile work environment by assuming roles in the Scrum framework and by using tools such as Atlassian Jira and Confluence.

Job Requirements
Extensive experience in ERP data migration, particularly from SAP ECC to SAP S/4 HANA.
Knowledge of master data (Material, Business Partner, and Finance), secondary master data, and transactional data in ERP systems.
Proficiency in SAP BODS, SAP LTMC, and SAP LSMW, with demonstrated ability to conduct problem-solving in data migration.
Experience in analyzing large-scale data and communicating findings to stakeholders.
Familiarity with agile methodologies (e.g., Scrum).
Excellent communication skills to gather business requirements and translate them into data migration solutions.
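The migration-quality review this listing describes is typically backed by reconciliation checks between the legacy extract and the target extract after each load cycle. The snippet below is a generic Python/pandas sketch of such a check, not output of SAP BODS or LTMC; the file names and the use of the MATNR/MTART columns are illustrative assumptions about what the extracts contain.

```python
import pandas as pd

# Hypothetical extract files for one object (e.g., Material) from the legacy
# ECC system and the target S/4HANA system.
source = pd.read_csv("ecc_material_extract.csv", dtype=str)
target = pd.read_csv("s4_material_extract.csv", dtype=str)

# 1. Record-count reconciliation
print(f"Source rows: {len(source)}, target rows: {len(target)}")

# 2. Key-level reconciliation: materials present in the source but missing in the target
missing = set(source["MATNR"]) - set(target["MATNR"])
print(f"Materials not migrated: {len(missing)}")

# 3. Field-level spot check on a joined sample (material type in this sketch)
merged = source.merge(target, on="MATNR", suffixes=("_src", "_tgt"))
mismatch = merged[merged["MTART_src"] != merged["MTART_tgt"]]
print(f"Material-type mismatches: {len(mismatch)}")
```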
Posted 6 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, deliver digital marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realise their financial goals and help them to save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agrifinance, insurance, and many more industry segments. We invest in talented people and new advanced technologies to unlock the power of data and to innovate. A FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 23,300 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com

Job Description
What you'll do
Architect, build, document, and maintain cloud standards and processes.
Lead projects and new application implementations.
Create new Terraform architecture and modules to provision AWS resources.
Create, manage, and administer Kubernetes running on EKS.
Create and modify Jenkins pipelines to support CI and automation.
Work with software development teams to write and tune their application Helm charts for EKS.
Performance engineering, load testing, hotspot isolation, and remediation.
Guide teams on best practices in the cloud.
POC new solutions and take them to production in the cloud.
Configure APM, SLO, SLA and alerting through Dynatrace.
Configure log metrics and analysis through Splunk.
Build and manage the CI deployment process for all environments.
Support and enable teams to migrate from on-prem environments into AWS.

You will report to a Senior Manager. You will work from the office 2 days a week, as this is a hybrid role based in Hyderabad.

Required Soft Experience
Experience leading application migrations into the cloud according to best practices, standards and cloud-native architecture. You understand the challenges and trade-offs to be made when building and deploying systems to production.
Expertise in working with container deployment and orchestration technologies at scale, with strong knowledge of the fundamentals including service discovery, deployments, monitoring, scheduling, and load balancing.
Experience with identifying performance bottlenecks, identifying anomalous system behavior, and determining the root cause of incidents.
5+ years of experience working with APM and log aggregation tools, as well as configuring the integrations and monitoring needed to leverage these tools.
Interest in designing, analyzing, and troubleshooting large-scale distributed systems.

Qualifications
Required Technical Experience
5+ years of expert-level experience with Terraform.
5+ years of expert-level experience with AWS services: EC2, ASG, SG, ALB/NLB/WAF, ACL, Routing, Route53, Express Connect/Transit Gateway, EC2 Image Builder, EKS, ECS, ECR, Lambda.
5+ years of experience writing Jenkinsfiles and Jenkins Shared Libraries.
5+ years of expert-level experience with EKS creation and administration.
5+ years of expert-level experience with Kubernetes application deployment and management.
Experience writing and maintaining custom application Helm charts and Helm template libraries.

Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's strong people-first approach is award winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For and Glassdoor Best Places to Work (globally 4.4 stars), to name a few. Check out Experian Life on social or our Careers Site to understand why.
Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.
Experian Careers - Creating a better tomorrow together

Benefits
Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off.
Experian Careers - Creating a better tomorrow together
Find out what it's like to work for Experian by clicking here
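As a small illustration of the EKS administration work this listing describes, here is a minimal Python sketch that inventories EKS clusters and their nodegroups with boto3. The region name is an assumption, and credentials are expected to come from the standard AWS credential chain; this is a sketch of the kind of tooling such a role might write, not part of the posting.

```python
import boto3

# Hypothetical region; credentials resolved via environment, profile, or instance role.
eks = boto3.client("eks", region_name="ap-south-1")

for name in eks.list_clusters()["clusters"]:
    cluster = eks.describe_cluster(name=name)["cluster"]
    print(name, cluster["version"], cluster["status"])

    # Nodegroups give a quick view of capacity and scaling configuration.
    for ng_name in eks.list_nodegroups(clusterName=name)["nodegroups"]:
        ng = eks.describe_nodegroup(clusterName=name, nodegroupName=ng_name)["nodegroup"]
        print("  ", ng_name, ng["status"], ng["scalingConfig"])
```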
Posted 6 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience required: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are efficient, scalable, and aligned with business objectives.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze data requirements.
- Design and implement robust data pipelines to support data processing and analytics.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud platforms and services related to data storage and processing.
- Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.
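For context on the pipeline work described in this listing, the following is a minimal PySpark sketch of an ETL step of the kind commonly run on Databricks. The input path, column names, and target table are hypothetical, and the target schema/database is assumed to exist; it is an illustrative sketch, not the employer's actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Hypothetical landing path for a daily batch of CSV files.
raw = (
    spark.read.option("header", "true")
    .csv("/mnt/landing/orders/2025-06-01/")
)

cleaned = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())  # basic data-quality gate
)

# On Databricks, managed tables default to Delta; "silver.orders" is a placeholder name.
cleaned.write.mode("append").saveAsTable("silver.orders")
```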
Posted 6 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Informatica Data Quality
Good-to-have skills: NA
Minimum experience required: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data architecture to support data initiatives.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica Data Quality.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data profiling and data cleansing methodologies.
- Familiarity with database management systems and SQL.
- Knowledge of data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
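Informatica Data Quality itself is configured through its own tooling rather than hand-written code, but the profiling and cleansing concepts this listing mentions can be illustrated with a small pandas sketch. The file name, column names, and validation rule below are assumptions for illustration only.

```python
import pandas as pd

df = pd.read_csv("customers.csv", dtype=str)  # hypothetical input extract

# Profiling: null ratios and distinct counts per column
profile = pd.DataFrame({
    "null_pct": df.isna().mean().round(3),
    "distinct": df.nunique(),
})
print(profile)

# Cleansing: trim whitespace, standardise case, drop exact duplicates
for col in ["first_name", "last_name", "city"]:
    df[col] = df[col].str.strip().str.title()
df["email"] = df["email"].str.lower()
df = df.drop_duplicates()

# Simple rule-based validation flag, e.g. email must contain "@"
df["email_valid"] = df["email"].str.contains("@", na=False)
```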
Posted 6 days ago
3.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Title: Sales Executive (Target-Based)
Company: Visafast Migration Consultancy
Location: Ahmedabad
Experience Required: 2–3 years (preferably in sales, telecalling, or migration consultancy)
Salary: ₹40,000 – ₹50,000/month + 5% bonus on target sales (only upon full target achievement)
Employment Type: Full-time (Mon-Sat)

Company Description
Visafast Migration Consultancy, founded in 2000, offers expert immigration advice and visa application services for individuals looking to migrate to Australia & Canada. Registered with MARA (Migration Agents Registration Authority), our consultants excel in assisting clients with all migration queries and creating the right pathway for relocation.

Role Description
This is a full-time, on-site Sales Executive role located in Ahmedabad at Visafast Migration Consultancy. The Sales Executive will be responsible for meeting sales targets, building relationships with clients, providing immigration advice, and assisting clients with visa applications and migration processes. Visafast Migration Consultancy is seeking target-driven Sales Executives with 2–3 years of experience to join our dynamic team. The role involves calling and converting leads from our internal database into paying clients for our visa and migration services.

Qualifications
Sales and negotiation skills
Excellent communication and interpersonal skills
Customer relationship management expertise
Knowledge of immigration laws and processes
Ability to work in a fast-paced environment
Experience in the migration or consultancy industry is a plus
Bachelor's degree in Business Administration or a related field

Key Responsibilities:
Handle and convert leads from our CRM database into paying clients.
Make outbound calls to potential leads and explain the migration/visa services offered.
Follow up regularly and maintain relationships with interested prospects.
Achieve and exceed monthly sales targets as defined by management.
Update and manage daily call logs, follow-up status, and client conversion details.
Coordinate with the migration team to ensure smooth onboarding of new clients.

Compensation & Benefits:
Fixed Salary: ₹40,000 – ₹50,000/month (based on experience and performance).
Incentives: 5% bonus on total sales value, applicable only upon achieving 100% of the assigned monthly target.
Growth opportunities within the organization.
Training and ongoing support provided.

How to Apply:
Email your CV to pratik@visafast.com.au with the subject line: Application for Sales Executive – Visafast
Posted 6 days ago
3.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Growexx is seeking a Ruby on Rails Engineer to lead the support and modernization of its legacy Ruby on Rails applications. These applications are critical to operational workflows and require a thoughtful balance of stability, performance, and incremental modernization.

Key Responsibilities
Own the full lifecycle of legacy Ruby on Rails applications, from maintenance to feature development and modernization.
Collaborate with cross-functional teams, including QA, DevOps, and product management, to deliver high-quality software.
Refactor and modularize the legacy codebase to improve maintainability and scalability.
Enhance test coverage and implement automated testing strategies.
Lead efforts to migrate legacy components to modern architectures where appropriate.
Monitor application performance and proactively address technical debt and bottlenecks.
Mentor junior developers and contribute to a culture of engineering excellence.

Key Skills
Strong understanding of MVC architecture, RESTful APIs, and service-oriented design.
Proficiency with relational databases (PostgreSQL preferred) and background job frameworks (e.g., Sidekiq, Resque).
Experience with version control (Git), CI/CD pipelines, and deployment automation.
Ability to work with legacy codebases and incrementally modernize them.

Preferred
Experience with AIC (Asset Inspection & Compliance) or Station Check systems.
Familiarity with front-end frameworks (e.g., React, Vue) for hybrid Rails applications.
Knowledge of security best practices in web application development.

Education and Experience
3+ years of experience developing and maintaining Ruby on Rails applications.

Analytical and Personal Skills
Must have good logical reasoning and analytical skills.
Ability to break big goals into small incremental actions.
Excellent communication skills in English, both written and verbal.
Demonstrate ownership and accountability for their work.
Great attention to detail.
Self-critical.
Demonstrate ownership of tasks.
Positive and cheerful outlook on life.
Posted 6 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role: SharePoint Developer
Location: Gurgaon

Description
Key Skills:
Project Online Development: Design, develop, and maintain solutions using Microsoft Project Online, ensuring alignment with business processes and project management methodologies. Hands-on expertise with Microsoft Project Online and Dataverse (customization, administration, reporting). Build custom connectors, automate project workflows, and sync data with Dataverse. REST API/OData integration (Dataverse, SharePoint, Project Online). Postman/Swagger for API testing and documentation.
Power Platform Expertise: Model-driven app development (UI, business logic, entity relationships). Dataverse configuration (tables, security roles, business rules, plugins). Power Automate (cloud flows, API integration, custom connectors). Data flows (ETL processes, data integration with Azure/Dataverse).
SharePoint Development: Front-end customization (HTML/CSS/JavaScript). Integration with Dataverse (APIs, Power Automate, custom connectors).
Project Online Backend APIs: REST API/CSOM (Client-Side Object Model) for Project Online data manipulation. Integration with Power Platform (e.g., fetching project data into Dataverse). SharePoint lists, libraries, and workflows.

Roles & Responsibilities
Solution Development
Design and deploy model-driven apps using Dataverse for business process automation.
Build SharePoint solutions (forms, web parts, dashboards) integrated with Dataverse and Project Online data.
Develop Power Automate workflows to connect Power Platform, SharePoint, Project Online, and external systems.
Integration & Data Management
Implement seamless data flow between SharePoint, Dataverse, and Project Online using APIs (REST/CSOM) and JavaScript.
Use Project Online APIs to automate project management tasks (e.g., resource allocation, timeline updates).
Migrate and synchronize data across platforms (e.g., Project Online tasks to Dataverse tables).
API Development & Customization
Build custom connectors for Project Online APIs to enable data access in Power Apps/Power Automate.
Securely authenticate and fetch data from Project Online using Azure AD and OAuth.
Optimize API performance and troubleshoot integration issues (e.g., rate limits, data mapping).
Customization & Optimization
Enhance SharePoint/Project Online interfaces with HTML/CSS/JavaScript.
Optimize Dataverse performance and ensure data consistency with Project Online.
Debug API-related errors in Power Platform or SharePoint workflows.
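As an illustration of the REST/OData integration work this listing describes, here is a hedged Python sketch that reads rows from a Dataverse table over the Web API. The environment URL, table name, and column names are placeholders, and token acquisition is assumed to happen separately against Azure AD (e.g., an MSAL client-credentials flow); this is a sketch, not the employer's actual integration.

```python
import requests

# Placeholder values; a real implementation would derive these from the tenant
# and obtain the bearer token from Azure AD before calling the API.
DATAVERSE_URL = "https://contoso.crm.dynamics.com"
ACCESS_TOKEN = "<bearer token from Azure AD>"

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

# Read a few rows of a hypothetical custom table via the OData endpoint.
resp = requests.get(
    f"{DATAVERSE_URL}/api/data/v9.2/cr123_projecttasks"
    "?$select=cr123_name,cr123_status&$top=5",
    headers=headers,
    timeout=30,
)
resp.raise_for_status()
for row in resp.json().get("value", []):
    print(row)
```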
Posted 6 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
Bachelor’s degree in Engineering, Computer Science, a related field, or equivalent practical experience.
Experience coding with one or more programming languages (e.g., Java, C/C++, Python).
Experience troubleshooting technical issues for internal/external partners or customers.

Preferred qualifications:
Experience in distributed data processing frameworks and modern-age investigative and transactional data stores.
Experience in working with/on data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools, environments, and data structures.
Experience in big data, information retrieval, data mining.
Experience in building multi-tier, high-availability applications with modern technologies such as NoSQL, MongoDB.
Experience with Infrastructure as Code (IaC) and Continuous Integration/Continuous Delivery (CI/CD) tools like Terraform, Ansible, Jenkins, etc.
Understanding of at least one database type, with the ability to write complex SQL.

About the job
The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of how businesses of all sizes use technology to connect with customers, employees and partners.
As a Strategic Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and modernization projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform/product challenges. You will have an in-depth understanding of data governance and security controls. You will travel to customer sites to deploy solutions and deliver workshops to educate and empower customers. Additionally, you will work closely with Product Management and Product Engineering teams to build and constantly drive excellence in our products.
Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
Interact with stakeholders to translate complex customer requirements into recommendations for appropriate solution architectures and advisory services.
Engage with technical leads and partners to lead high-velocity migration and modernization to Google Cloud Platform (GCP).
Design, Migrate/Build and Operationalize data storage and processing infrastructure using Cloud-native products.
Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data.
Take various project requirements and organize them into clear goals and objectives, and create a work breakdown structure to manage internal and external stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
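As a small illustration of the "ingest and store data on Google Cloud Platform" work this listing describes, here is a hedged Python sketch that loads newline-delimited JSON from Cloud Storage into BigQuery. The project, dataset, table, and bucket names are placeholders, and schema autodetection is an assumption made for brevity.

```python
from google.cloud import bigquery

client = bigquery.Client()  # project and credentials picked up from the environment

# Hypothetical destination table and source bucket.
table_id = "my-project.analytics.events"
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-landing-bucket/events/2025-06-01/*.json",
    table_id,
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```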
Posted 6 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
Bachelor’s degree in Engineering, Computer Science, a related field, or equivalent practical experience.
Experience coding with one or more programming languages (e.g., Java, C/C++, Python).
Experience troubleshooting technical issues for internal/external partners or customers.

Preferred qualifications:
Experience in distributed data processing frameworks and modern-age investigative and transactional data stores.
Experience in working with/on data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools, environments, and data structures.
Experience in big data, information retrieval, data mining.
Experience in building multi-tier, high-availability applications with modern technologies such as NoSQL, MongoDB.
Experience with Infrastructure as Code (IaC) and Continuous Integration/Continuous Delivery (CI/CD) tools like Terraform, Ansible, Jenkins, etc.
Understanding of at least one database type, with the ability to write complex SQL.

About the job
The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of how businesses of all sizes use technology to connect with customers, employees and partners.
As a Strategic Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and modernization projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform/product challenges. You will have an in-depth understanding of data governance and security controls. You will travel to customer sites to deploy solutions and deliver workshops to educate and empower customers. Additionally, you will work closely with Product Management and Product Engineering teams to build and constantly drive excellence in our products.
Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
Interact with stakeholders to translate complex customer requirements into recommendations for appropriate solution architectures and advisory services.
Engage with technical leads and partners to lead high-velocity migration and modernization to Google Cloud Platform (GCP).
Design, Migrate/Build and Operationalize data storage and processing infrastructure using Cloud-native products.
Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data.
Take various project requirements and organize them into clear goals and objectives, and create a work breakdown structure to manage internal and external stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 6 days ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India; Gurgaon, Haryana, India; Pune, Maharashtra, India.

Minimum qualifications:
Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent practical experience.
4 years of experience in developing and troubleshooting data processing algorithms.
Experience coding with one or more programming languages (e.g., Java, Python) and Big Data technologies such as Scala, Spark and Hadoop frameworks.
Experience with one public cloud provider, such as GCP.

Preferred qualifications:
Experience architecting, developing software, or internet-scale production-grade Big Data solutions in virtualized environments.
Experience in Big Data, information retrieval, data mining, or Machine Learning.
Experience with data warehouses, technical architectures, infrastructure components, Extract, Transform and Load (ETL)/Extract, Load and Transform (ELT) and reporting/analytic tools, environments, and data structures.
Experience in building multi-tier applications with modern technologies such as NoSQL, MongoDB, SparkML, and TensorFlow.
Experience with Infrastructure as Code and Continuous Integration/Continuous Deployment tools like Terraform, Ansible, Jenkins.
Understanding of one database type, with the ability to write complex SQL queries.

About The Job
The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of how businesses of all sizes use technology to connect with customers, employees and partners.
As a Strategic Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and modernization projects, and with customers to design data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform/product challenges. You will have an understanding of data governance and security controls. You will travel to customer sites to deploy solutions and deliver workshops to educate and empower customers. Additionally, you will work with Product Management and Product Engineering teams to build and constantly drive excellence in our products.
Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
Interact with stakeholders to translate complex customer requirements into recommendations for appropriate solution architectures and advisory services.
Engage with technical leads and partners to lead high-velocity migration and modernisation to Google Cloud Platform (GCP).
Design, Migrate/Build and Operationalise data storage and processing infrastructure using Cloud-native products.
Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data.
Take various project requirements and organize them into clear goals and objectives, and create a work breakdown structure to manage internal and external stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 6 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Gurugram, Haryana, India; Bengaluru, Karnataka, India; Hyderabad, Telangana, India.

Minimum qualifications:
Bachelor’s degree in Engineering, Computer Science, a related field, or equivalent practical experience.
Experience coding with one or more programming languages (e.g., Java, C/C++, Python).
Experience troubleshooting technical issues for internal/external partners or customers.

Preferred qualifications:
Experience in distributed data processing frameworks and modern-age investigative and transactional data stores.
Experience in working with/on data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools, environments, and data structures.
Experience in big data, information retrieval, data mining.
Experience in building multi-tier, high-availability applications with modern technologies such as NoSQL, MongoDB.
Experience with Infrastructure as Code (IaC) and Continuous Integration/Continuous Delivery (CI/CD) tools like Terraform, Ansible, Jenkins, etc.
Understanding of at least one database type, with the ability to write complex SQL.

About the job
The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of how businesses of all sizes use technology to connect with customers, employees and partners.
As a Strategic Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and modernization projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform/product challenges. You will have an in-depth understanding of data governance and security controls. You will travel to customer sites to deploy solutions and deliver workshops to educate and empower customers. Additionally, you will work closely with Product Management and Product Engineering teams to build and constantly drive excellence in our products.
Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
Interact with stakeholders to translate complex customer requirements into recommendations for appropriate solution architectures and advisory services.
Engage with technical leads and partners to lead high-velocity migration and modernization to Google Cloud Platform (GCP).
Design, Migrate/Build and Operationalize data storage and processing infrastructure using Cloud-native products.
Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data.
Take various project requirements and organize them into clear goals and objectives, and create a work breakdown structure to manage internal and external stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 6 days ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India; Gurgaon, Haryana, India; Pune, Maharashtra, India.

Minimum qualifications:
Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent practical experience.
4 years of experience in developing and troubleshooting data processing algorithms.
Experience coding with one or more programming languages (e.g., Java, Python) and Big Data technologies such as Scala, Spark and Hadoop frameworks.
Experience with one public cloud provider, such as GCP.

Preferred qualifications:
Experience architecting, developing software, or internet-scale production-grade Big Data solutions in virtualized environments.
Experience in Big Data, information retrieval, data mining, or Machine Learning.
Experience with data warehouses, technical architectures, infrastructure components, Extract, Transform and Load (ETL)/Extract, Load and Transform (ELT) and reporting/analytic tools, environments, and data structures.
Experience in building multi-tier applications with modern technologies such as NoSQL, MongoDB, SparkML, and TensorFlow.
Experience with Infrastructure as Code and Continuous Integration/Continuous Deployment tools like Terraform, Ansible, Jenkins.
Understanding of one database type, with the ability to write complex SQL queries.

About The Job
The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of how businesses of all sizes use technology to connect with customers, employees and partners.
As a Strategic Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and modernization projects, and with customers to design data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform/product challenges. You will have an understanding of data governance and security controls. You will travel to customer sites to deploy solutions and deliver workshops to educate and empower customers. Additionally, you will work with Product Management and Product Engineering teams to build and constantly drive excellence in our products.
Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
Interact with stakeholders to translate complex customer requirements into recommendations for appropriate solution architectures and advisory services.
Engage with technical leads and partners to lead high-velocity migration and modernisation to Google Cloud Platform (GCP).
Design, Migrate/Build and Operationalise data storage and processing infrastructure using Cloud-native products.
Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data.
Take various project requirements and organize them into clear goals and objectives, and create a work breakdown structure to manage internal and external stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 6 days ago
4.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India; Gurgaon, Haryana, India; Pune, Maharashtra, India.

Minimum qualifications:
Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent practical experience.
4 years of experience in developing and troubleshooting data processing algorithms.
Experience coding with one or more programming languages (e.g., Java, Python) and Big Data technologies such as Scala, Spark and Hadoop frameworks.
Experience with one public cloud provider, such as GCP.

Preferred qualifications:
Experience architecting, developing software, or internet-scale production-grade Big Data solutions in virtualized environments.
Experience in Big Data, information retrieval, data mining, or Machine Learning.
Experience with data warehouses, technical architectures, infrastructure components, Extract, Transform and Load (ETL)/Extract, Load and Transform (ELT) and reporting/analytic tools, environments, and data structures.
Experience in building multi-tier applications with modern technologies such as NoSQL, MongoDB, SparkML, and TensorFlow.
Experience with Infrastructure as Code and Continuous Integration/Continuous Deployment tools like Terraform, Ansible, Jenkins.
Understanding of one database type, with the ability to write complex SQL queries.

About The Job
The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of how businesses of all sizes use technology to connect with customers, employees and partners.
As a Strategic Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and modernization projects, and with customers to design data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform/product challenges. You will have an understanding of data governance and security controls. You will travel to customer sites to deploy solutions and deliver workshops to educate and empower customers. Additionally, you will work with Product Management and Product Engineering teams to build and constantly drive excellence in our products.
Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
Interact with stakeholders to translate complex customer requirements into recommendations for appropriate solution architectures and advisory services.
Engage with technical leads and partners to lead high-velocity migration and modernisation to Google Cloud Platform (GCP).
Design, Migrate/Build and Operationalise data storage and processing infrastructure using Cloud-native products.
Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data.
Take various project requirements and organize them into clear goals and objectives, and create a work breakdown structure to manage internal and external stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 6 days ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India; Gurgaon, Haryana, India; Pune, Maharashtra, India.

Minimum qualifications:
Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent practical experience.
4 years of experience in developing and troubleshooting data processing algorithms.
Experience coding with one or more programming languages (e.g., Java, Python) and Big Data technologies such as Scala, Spark and Hadoop frameworks.
Experience with one public cloud provider, such as GCP.

Preferred qualifications:
Experience architecting, developing software, or internet-scale production-grade Big Data solutions in virtualized environments.
Experience in Big Data, information retrieval, data mining, or Machine Learning.
Experience with data warehouses, technical architectures, infrastructure components, Extract, Transform and Load (ETL)/Extract, Load and Transform (ELT) and reporting/analytic tools, environments, and data structures.
Experience in building multi-tier applications with modern technologies such as NoSQL, MongoDB, SparkML, and TensorFlow.
Experience with Infrastructure as Code and Continuous Integration/Continuous Deployment tools like Terraform, Ansible, Jenkins.
Understanding of one database type, with the ability to write complex SQL queries.

About The Job
The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape the future of how businesses of all sizes use technology to connect with customers, employees and partners.
As a Strategic Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and modernization projects, and with customers to design data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform/product challenges. You will have an understanding of data governance and security controls. You will travel to customer sites to deploy solutions and deliver workshops to educate and empower customers. Additionally, you will work with Product Management and Product Engineering teams to build and constantly drive excellence in our products.
Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
Interact with stakeholders to translate complex customer requirements into recommendations for appropriate solution architectures and advisory services.
Engage with technical leads and partners to lead high-velocity migration and modernisation to Google Cloud Platform (GCP).
Design, Migrate/Build and Operationalise data storage and processing infrastructure using Cloud-native products.
Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data.
Take various project requirements and organize them into clear goals and objectives, and create a work breakdown structure to manage internal and external stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 6 days ago
5.0 years
5 - 8 Lacs
Hyderābād
On-site
Category: Software Development/Engineering
Main location: India, Andhra Pradesh, Hyderabad
Position ID: J0625-0219
Employment Type: Full Time

Position Description:
Company Profile: Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.

Your future duties and responsibilities:
Position: Senior Software Engineer
Experience: 5-10 years
Category: Software Development/Engineering
Shift Timings: 1:00 pm to 10:00 pm
Main location: Hyderabad
Work Type: Work from office
Skill: Spark (PySpark), Python and SQL
Employment Type: Full Time
Position ID: J0625-0219

Required qualifications to be successful in this role:
Must-have skills:
5+ yrs. of development experience with Spark (PySpark), Python and SQL.
Extensive knowledge of building data pipelines.
Hands-on experience with Databricks development.
Strong experience developing on Linux OS.
Experience with scheduling and orchestration (e.g., Databricks Workflows, Airflow, Prefect, Control-M).

Good-to-have skills:
Solid understanding of distributed systems, data structures, design principles.
Agile development methodologies (e.g., SAFe, Kanban, Scrum).
Comfortable communicating with teams via showcases/demos.
Play a key role in establishing and implementing migration patterns for the Data Lake Modernization project.
Actively migrate use cases from our on-premises Data Lake to Databricks on GCP.
Collaborate with Product Management and business partners to understand use case requirements and reporting.
Adhere to internal development best practices/lifecycle (e.g., testing, code reviews, CI/CD, documentation).
Document and showcase feature designs/workflows.
Participate in team meetings and discussions around product development.
Stay up to date on the latest industry trends and design patterns.
3+ years of experience with Git.
3+ years of experience with CI/CD (e.g., Azure Pipelines).
Experience with streaming technologies, such as Kafka and Spark.
Experience building applications on Docker and Kubernetes.
Cloud experience (e.g., Azure, Google).

Skills: English, Python, SQLite

What you can expect from us:
Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last.
You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
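As an illustration of the streaming experience (Kafka with Spark) this listing asks for, here is a minimal PySpark Structured Streaming sketch that ingests a Kafka topic into a Delta table. The broker address, topic, checkpoint path, and table name are placeholders; outside Databricks the spark-sql-kafka connector package would need to be supplied. This is a sketch under those assumptions, not the project's actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka_ingest").getOrCreate()

# Placeholder broker and topic names.
stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "orders")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to strings for downstream parsing.
parsed = stream.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders")  # placeholder path
    .outputMode("append")
    .toTable("bronze.orders_raw")  # Delta target table (Spark 3.1+ API)
)
query.awaitTermination()
```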
Posted 6 days ago
0 years
0 Lacs
Hyderābād
Remote
Req ID: 328599
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a BODS Developer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Understand and execute data migration blueprints (migration concepts, transformation rules, mappings, selection criteria).
Understand and contribute to the documentation of the data mapping specifications, conversion rules, and technical design specifications as required.
Build the conversion processes and associated programs that will migrate the data per the design and conversion rules that have been signed off by the client.
Execute all data migration technical steps (extract, transform & load) as well as defect management and issue resolution.
Perform data load activities for each mock load, cutover simulation and production deployment identified in the L1 plan, into the environments identified.
Provide technical support, defect management, and issue resolution during all testing cycles, including Mock Data Load cycles.
Complete all data migration documentation necessary to support system validation/compliance requirements.
Support the development of unit and end-to-end data migration test plans and test scripts (including testing for data extraction, transformation, data loading, and data validation).

Job Requirements
Work in a shift from 3 PM to 12 AM IST, with a work-from-home option.
4-6 yrs. of overall technical experience in SAP BODS with all the SAP BODS application modules (Extract, Transform, Load).
2-4 yrs. of data migration experience with S/4 HANA/ECC implementations.
Experience in BODS Designer components: Projects, Jobs, Workflows, Data Flows, Scripts, Data Stores and Formats.
Experience in BODS performance tuning techniques using parallel processing (Degree of Parallelism), multithreading, partitioning, and database throughputs to improve job performance.
Experience in ETL using SAP BODS and SAP IS with respect to SAP Master/Transaction Data Objects in SAP FICO, SAP SD, SAP MM/WM, SAP Plant Maintenance, SAP Quality Management, etc. is desirable.
Experience with data migration using LSMW, IDocs, LTMC.
Ability to debug LTMC errors is highly desirable.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us.
This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
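For illustration alongside the posting above: a minimal, hypothetical sketch of the kind of source-to-target reconciliation check a migration team might script during mock-load validation. This is not SAP BODS code (BODS ships its own validation transforms); the pandas-based approach, the file names, and the MATNR key column are assumptions made only for the example.

import pandas as pd

def reconcile(source_csv: str, target_csv: str, key: str = "MATNR") -> dict:
    """Compare a source extract against the records loaded into the target system."""
    source = pd.read_csv(source_csv, dtype=str)
    target = pd.read_csv(target_csv, dtype=str)
    missing = set(source[key]) - set(target[key])          # extracted but never loaded
    unexpected = set(target[key]) - set(source[key])        # loaded but not in the extract
    return {
        "source_rows": len(source),
        "target_rows": len(target),
        "missing_in_target": sorted(missing)[:20],           # sample for the defect log
        "unexpected_in_target": sorted(unexpected)[:20],
    }

if __name__ == "__main__":
    print(reconcile("material_extract.csv", "material_loaded.csv"))

A check of this kind would typically run after each mock load and cutover simulation, with mismatches raised as defects against the owner of the affected data object.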
Posted 6 days ago
3.0 years
3 - 5 Lacs
Hyderābād
On-site
JOB DESCRIPTION
You belong to the top echelon of talent in your field. At one of the world's most iconic financial institutions, where infrastructure is of paramount importance, you can play a pivotal role. As an Infrastructure Engineer III at JPMorgan Chase within the Infrastructure Platforms team, you utilize strong knowledge of software, applications, and technical processes within the infrastructure engineering discipline. Apply your technical knowledge and problem-solving methodologies across multiple applications of moderate scope.
Job responsibilities
Provide technical support for applications, ensuring high availability and performance.
Troubleshoot and resolve application issues, working closely with development and operations teams.
Monitor application performance and implement improvements to enhance user experience.
Develop and maintain documentation for application support processes and procedures.
Design, implement, and maintain scalable and reliable infrastructure solutions.
Automate repetitive tasks and processes to improve efficiency and reduce manual intervention.
Collaborate with development teams to ensure that applications are designed with reliability and scalability in mind.
Implement and manage monitoring, alerting, and logging solutions to proactively identify and address issues (a minimal metrics-exposition sketch follows this posting).
Analyze system performance and reliability metrics to identify areas for improvement.
Drive initiatives to enhance system reliability, scalability, and performance.
Stay up to date with industry trends and best practices in SRE and application support.
Required qualifications, capabilities, and skills
Formal training or certification on infrastructure disciplines concepts and 3+ years of applied experience
Strong knowledge of one or more infrastructure disciplines such as hardware, networking terminology, databases, storage engineering, deployment practices, integration, automation, scaling, resilience, and performance assessments
Strong knowledge of one or more scripting languages (e.g., Bash, Python)
Experience with multiple cloud technologies, with the ability to operate in and migrate across public and private clouds
Drive to develop infrastructure engineering knowledge of additional domains, data fluency, and automation knowledge
Proven experience in application support and site reliability engineering.
Proficiency in scripting and automation using languages such as Python, Bash, or similar.
Experience with monitoring and logging tools (e.g., Kibana, Prometheus, Grafana, ELK Stack).
Infrastructure automation experience
Preferred qualifications, capabilities, and skills
Experience with CI/CD pipelines and DevOps practices.
Good understanding of at least one Python framework (Django, Flask, or Pyramid)
Object Relational Mapping (ORM) and relational databases, Oracle preferred
Familiarity with configuration management tools (e.g., Ansible, Terraform).
Knowledge of networking and security best practices.
ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management.
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
ABOUT THE TEAM
Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we're setting our businesses, clients, customers and employees up for success.
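As a small illustration of the monitoring and alerting responsibilities described in the posting above: a minimal sketch that exposes an application health metric for Prometheus to scrape, using the prometheus_client Python library. The health-check URL, the port, and the 30-second interval are assumptions for the example, not details from the posting.

import time
import urllib.request

from prometheus_client import Gauge, start_http_server

APP_UP = Gauge("app_up", "1 if the application health endpoint responded with HTTP 200, else 0")

def check_health(url: str) -> None:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            APP_UP.set(1 if resp.status == 200 else 0)
    except Exception:
        APP_UP.set(0)                                        # unreachable counts as down

if __name__ == "__main__":
    start_http_server(9100)                                  # metrics exposed at :9100/metrics
    while True:
        check_health("http://localhost:8080/health")         # hypothetical health endpoint
        time.sleep(30)

A Prometheus alert rule (or a Grafana alert) would then fire when app_up stays at 0 for longer than an agreed threshold.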
Posted 6 days ago
0 years
0 Lacs
Hyderābād
On-site
Req ID: 328592
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Jr BODS Developer to join our team in Hyderabad, Telangana (IN-TG), India (IN).
Jr SAP BODS Developer
Position Overview
Understand and execute data migration blueprints (migration concepts, transformation rules, mappings, selection criteria); a hypothetical illustration of such a rule follows this posting
Understand and contribute to the documentation of the data mapping specifications, conversion rules, and technical design specifications as required
Build the conversion processes and associated programs that will migrate the data per the design and conversion rules signed off by the client
Execute all data migration technical steps (extract, transform, and load) as well as defect management and issue resolution
Perform data load activities for each mock load, cutover simulation, and production deployment identified in the L1 plan, in the environments identified
Provide technical support, defect management, and issue resolution during all testing cycles, including mock data load cycles
Complete all data migration documentation necessary to support system validation / compliance requirements
Support the development of unit and end-to-end data migration test plans and test scripts (including testing for data extraction, transformation, data loading, and data validation)
Job Requirements
2-4 years of overall technical experience in SAP BODS across all the SAP BODS application modules (Extract, Transform, Load)
1-2 years of data migration experience with S/4 HANA/ECC implementations
Experience with BODS Designer components: Projects, Jobs, Workflows, Data Flows, Scripts, Data Stores, and Formats
Experience with BODS performance tuning techniques using parallel processing (Degree of Parallelism), multithreading, partitioning, and database throughput to improve job performance
Experience in ETL using SAP BODS and SAP IS with respect to SAP master / transaction data objects in SAP FICO, SAP SD, SAP MM/WM, SAP Plant Maintenance, SAP Quality Management, etc. is desirable
Experience with data migration using LSMW, IDOCs, and LTMC
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.
NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
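To illustrate the "transformation rules, mappings, selection criteria" wording used in the posting above: a hypothetical Python sketch of one such rule applied to a legacy material record. The plant-code mapping, the in-scope material types, and the field names are invented for the example; in a real project they would come from the signed-off mapping specification and would normally be implemented inside the ETL tool itself.

PLANT_MAPPING = {"1000": "IN01", "2000": "IN02"}         # legacy plant -> S/4 plant (invented values)
IN_SCOPE_TYPES = {"FERT", "HALB"}                         # selection criteria: material types to migrate

def transform(record):
    """Apply selection criteria and field mappings to one legacy material record."""
    if record.get("MTART") not in IN_SCOPE_TYPES:
        return None                                        # filtered out of the load
    out = dict(record)
    out["WERKS"] = PLANT_MAPPING.get(record["WERKS"], record["WERKS"])
    out["MATNR"] = record["MATNR"].zfill(18)               # pad to an 18-character material number
    return out

legacy_records = [
    {"MATNR": "4711", "MTART": "FERT", "WERKS": "1000"},
    {"MATNR": "4712", "MTART": "ROH", "WERKS": "2000"},   # dropped by the selection criteria
]
print([r for r in (transform(rec) for rec in legacy_records) if r is not None])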
Posted 6 days ago
7.0 - 14.0 years
4 - 7 Lacs
Cochin
On-site
7 - 14 Years
1 Opening
Kochi
Role description
We are seeking a skilled Java Consultant with strong expertise in application architecture, deployment, and cloud migration. The ideal candidate will have hands-on experience in both on-premises and cloud environments (Azure and AWS), with a solid understanding of infrastructure, application frameworks, and migration strategies. The role also involves client interaction, technical leadership, and working with modern development and cloud technologies.
Key Responsibilities:
Understand and analyze the different components of an application architecture.
Set up and deploy applications on-premises as well as in cloud environments (Azure or AWS).
Demonstrate strong knowledge and experience with Azure or AWS IaaS and PaaS services.
Create comprehensive architecture diagrams to visualize applications and infrastructure.
Utilize discovery tools such as PowerShell, Azure Migrate, Cloudockit, or similar for application and infrastructure discovery.
Plan and execute application migration strategies, including the creation of migration playbooks.
Lead client calls, conduct technical interviews, and gather requirements.
Collaborate with teams using Azure DevOps, including the creation and management of user stories.
Leverage existing knowledge of customer processes, infrastructure, and application environments to provide tailored solutions.
Required Skills & Experience:
Programming Languages & Frameworks: Proficient in Java and frameworks like Spring Boot. Experience with Java hosting stacks such as Tomcat.
Containers & Orchestration: Working experience with Docker, Kubernetes, Helm charts, pods, nodes, and registries.
Networking: Understanding of network infrastructure and devices, including Virtual Networks (VNets), firewalls, Network Security Groups (NSGs), routes, and Web Application Firewalls (WAFs). Knowledge of network topologies, such as hub-and-spoke architectures.
Skills
Java, Spring Boot, Azure
About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
Posted 6 days ago
5.0 years
0 Lacs
Ahmedabad
On-site
Growexx is seeking a Senior Ruby on Rails Engineer to lead the support and modernization of our legacy Ruby on Rails applications. These applications are critical to operational workflows and require a thoughtful balance of stability, performance, and incremental modernization.
Key Responsibilities
Own the full lifecycle of legacy Ruby on Rails applications, from maintenance to feature development and modernization
Collaborate with cross-functional teams, including QA, DevOps, and product management, to deliver high-quality software
Refactor and modularize the legacy codebase to improve maintainability and scalability
Enhance test coverage and implement automated testing strategies
Lead efforts to migrate legacy components to modern architectures where appropriate
Monitor application performance and proactively address technical debt and bottlenecks
Mentor junior developers and contribute to a culture of engineering excellence
Key Skills
Strong understanding of MVC architecture, RESTful APIs, and service-oriented design
Proficiency with relational databases (PostgreSQL preferred) and background job frameworks (e.g., Sidekiq, Resque)
Experience with version control (Git), CI/CD pipelines, and deployment automation
Ability to work with legacy codebases and incrementally modernize them
Preferred: Experience with AIC (Asset Inspection & Compliance) or Station Check systems
Familiarity with front-end frameworks (e.g., React, Vue) for hybrid Rails applications
Knowledge of security best practices in web application development
Education and Experience
5+ years of experience developing and maintaining Ruby on Rails applications
Analytical and Personal Skills
Good logical reasoning and analytical skills
Ability to break big goals into small incremental actions
Excellent communication skills in English, both written and verbal
Demonstrated ownership and accountability for their work and tasks
Great attention to detail
Self-critical, with a positive and cheerful outlook on life
Posted 6 days ago
12.0 years
4 - 8 Lacs
Surat
On-site
We are seeking a highly skilled and experienced Adobe Experience Cloud Solutions Architect with expertise in Adobe AEP (Adobe Experience Platform), AEM (Adobe Experience Manager), Adobe Target, and Adobe Analytics. This role is ideal for someone with a strong technical background, a consulting mindset, and hands-on experience across large-scale digital transformation projects involving Adobe's marketing and experience platforms.
Key Responsibilities:
Lead Adobe AEP implementations, including data collection strategies, schema design, and WebSDK deployments.
Architect and deliver AEM solutions, including cloud migrations (e.g., from Sitecore to AEM as a Cloud Service).
Design and implement personalization strategies using Adobe Target across multiple enterprise web properties.
Implement Adobe Analytics and migrate to CJA (Customer Journey Analytics) as part of modernization efforts.
Provide subject matter expertise on Adobe Experience Cloud products for internal and client teams.
Collaborate with cross-functional teams, including business stakeholders, to drive value from Adobe solutions.
Work closely with implementation partners on delivery engagements.
Required Skills & Qualifications:
Hands-on experience with Adobe AEP, AEM, Target, and Adobe Analytics.
Expertise in WebSDK migration and deployment.
Experience with tag management and data layer design.
Strong architectural knowledge of AEM Cloud Services and headless CMS integrations.
Proven success in migration projects (e.g., Sitecore to AEM, AppMeasurement to WebSDK).
Bachelor's degree in Engineering or a related field.
Excellent communication and stakeholder management skills.
Comfortable working in agile environments and managing multiple enterprise clients.
Preferred Qualifications:
Adobe certifications in AEP, AEM, or Analytics.
Experience working with global enterprise clients.
Experience: 12+ years (10+ years in Adobe Experience Cloud)
Job Type: Full-time (WFO only)
Email: [email protected]
Job Location: Surat, India
Posted 6 days ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description:
Netoyed is a CMMI Maturity Level 5 digital technology company specializing in digital transformation and product engineering services, with locations in Australia, New Zealand, the US, and India. Our expertise cuts across a number of sectors, including telecommunications, healthcare, banking and finance, and education, and our team is made up of professionals with a wide range of skills, from development and entrepreneurship to analytics and business agility. At Netoyed, we pride ourselves on delivering cutting-edge digital platforms and products that help to supercharge businesses wherever they may be.
Key Responsibilities:
Assess on-premise infrastructure, applications, and workloads for cloud readiness.
Design and implement migration strategies (lift-and-shift, re-platforming, or modernization) to Microsoft Azure.
Execute migration projects using tools such as Azure Migrate, Azure Site Recovery, and Azure Database Migration Service.
Collaborate with application owners, DevOps, network, and security teams to ensure smooth migration.
Monitor and troubleshoot post-migration issues and optimize performance and cost (a minimal post-migration inventory sketch follows this posting).
Implement governance, security, and compliance best practices on Azure.
Create technical documentation, migration runbooks, and support materials.
Provide guidance, training, and handover to internal teams.
Required Skills and Qualifications:
3+ years of experience in cloud migrations, specifically to Microsoft Azure.
Deep understanding of Azure IaaS/PaaS services, including compute, networking, identity, storage, and monitoring.
Experience with Azure Active Directory, RBAC, Azure Monitor, Log Analytics, and ARM templates or Bicep.
Proficiency in using migration tools like Azure Migrate, Azure Site Recovery, and third-party tools (e.g., CloudEndure, Carbonite).
Familiarity with Windows and Linux servers, databases (SQL Server, MySQL, PostgreSQL), and legacy systems.
Strong scripting knowledge (PowerShell, Azure CLI, or Python).
Excellent problem-solving, communication, and project documentation skills.
Azure certification (e.g., AZ-104, AZ-305, AZ-900, or equivalent) preferred.
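As a small illustration of the post-migration checks and the scripting expectation named in the posting above (PowerShell, Azure CLI, or Python): a minimal sketch that shells out to the Azure CLI to list migrated virtual machines and summarise them by region, so the result can be compared against the migration runbook. It assumes the az CLI is installed and logged in; the resource group name is a hypothetical placeholder.

import json
import subprocess
from collections import Counter

def list_vms(resource_group: str):
    """Return the VMs in a resource group as parsed JSON from the Azure CLI."""
    result = subprocess.run(
        ["az", "vm", "list", "--resource-group", resource_group, "-o", "json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

if __name__ == "__main__":
    vms = list_vms("rg-migrated-workloads")                # hypothetical resource group
    print(f"{len(vms)} VMs found")
    print(Counter(vm["location"] for vm in vms))           # count of VMs per Azure region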
Posted 6 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Tech Lead-UI Dev P1 C1 STS
Primary Skills
Angular JS, JavaScript
Oracle SQL
API development, Microservices
Java, Spring Boot
JD
Expertise in Angular (v1 to latest): Strong command of Angular core concepts including components, services, modules, reactive forms, routing, and RxJS.
Angular Upgrade Experience: Ability to work on legacy AngularJS (v1.x) applications and migrate them to modern Angular versions.
Micro Frontend (MFE) Exposure: Basic understanding of MFE architecture and willingness to contribute to Angular-based MFE solutions.
Azure DevOps Pipelines: Experience setting up and maintaining CI/CD pipelines using Azure DevOps for Angular projects.
Webpack Customization: Familiarity with Webpack and its integration with Angular CLI for optimizing builds and managing dependencies.
Responsive Web Development: Proven skills in building mobile-friendly, responsive UIs using modern CSS techniques and Angular components.
HTML Accessibility (a11y): Understanding of accessibility standards (WCAG) and ability to implement accessible front-end solutions.
Unit Testing: Proficient in writing unit tests using frameworks like Jasmine, Karma, or Jest to maintain high code quality.
Posted 6 days ago
The job market for data migration professionals in India is currently thriving, with numerous opportunities available across industries. Whether you are just starting your career or looking to make a transition, data migration roles can offer a rewarding career path with room to grow.
Demand is highest in cities with booming IT sectors, which have a strong need for data migration professionals.
The average salary range for data migration professionals in India varies by experience level. Entry-level positions can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can command salaries upwards of INR 10-15 lakhs per annum.
A typical career path in data migration may involve starting as a Junior Developer, progressing to Senior Developer, then moving up to a Tech Lead role. With experience and expertise, one can advance further to roles such as Solution Architect or Project Manager.
In addition to core data migration skills, professionals in this field are often expected to have knowledge of related areas such as cloud computing, database management, programming languages like Java or Python, and software development methodologies.
As you explore data migration opportunities in India, showcase your skills and experience confidently during interviews: prepare thoroughly, stay updated on industry trends, and demonstrate your passion for data migration. Best of luck on your job search journey!