
1002 Migrate Jobs - Page 10

JobPe aggregates job listings for easy browsing, but applications are submitted directly on the original job portal.

4.0 years

0 Lacs

Hyderābād

On-site


Minimum qualifications:
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent practical experience.
- 4 years of experience in developing and troubleshooting data processing algorithms.
- Experience coding with one or more programming languages (e.g., Java, Python) and big data technologies such as Scala, Spark, and Hadoop frameworks.
- Experience with one public cloud provider, such as GCP.

Preferred qualifications:
- Experience architecting, developing software, or building internet-scale production-grade big data solutions in virtualized environments.
- Experience in big data, information retrieval, data mining, or machine learning.
- Experience with data warehouses, technical architectures, infrastructure components, Extract Transform and Load/Extract, Load and Transform and reporting/analytic tools, environments, and data structures.
- Experience in building multi-tier applications with modern technologies such as NoSQL, MongoDB, SparkML, and TensorFlow.
- Experience with Infrastructure as Code and Continuous Integration/Continuous Deployment tools like Terraform, Ansible, and Jenkins.
- Understanding of one database type, with the ability to write complex SQL queries (a minimal example follows below).

About the job:
The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees and partners.

As a Strategic Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and modernization projects, and with customers to design data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform/product challenges. You will have an understanding of data governance and security controls. You will travel to customer sites to deploy solutions and deliver workshops to educate and empower customers. Additionally, you will work with Product Management and Product Engineering teams to build and constantly drive excellence in our products.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Interact with stakeholders to translate complex customer requirements into recommendations for appropriate solution architectures and advisory services.
- Engage with technical leads and partners to lead high-velocity migration and modernization to Google Cloud Platform (GCP).
- Design, migrate/build, and operationalize data storage and processing infrastructure using cloud-native products.
- Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data.
- Take various project requirements and organize them into clear goals and objectives, and create a work breakdown structure to manage internal and external stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
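As a quick illustration of the complex-SQL-on-GCP work this role references, here is a minimal sketch of running a parameterized analytical query with the BigQuery Python client; the project, dataset, table, and threshold are hypothetical placeholders.

```python
# Minimal sketch: parameterized BigQuery query from Python.
# Requires: pip install google-cloud-bigquery, plus GCP credentials configured.
# Dataset/table names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project/credentials from the environment

# Query parameters keep values out of the SQL string itself.
query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my_project.sales.transactions`
    GROUP BY customer_id
    HAVING SUM(amount) > @min_spend
    ORDER BY total_spend DESC
    LIMIT 10
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("min_spend", "INT64", 1000)]
)

for row in client.query(query, job_config=job_config).result():
    print(row.customer_id, row.total_spend)
```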

Posted 1 week ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site


Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Microsoft Azure Data Services
Good-to-have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and deliver data solutions that meet business needs.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have skills: Proficiency in Microsoft Azure Data Services.
- Good-to-have skills: Experience with Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics.
- Strong understanding of data modeling and database design principles.
- Experience with data integration and ETL tools.
- Familiarity with data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Microsoft Azure Data Services.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

13.0 years

0 Lacs

Delhi

On-site


Overview:
The Key Account Manager reports to the India Business Manager for Toxicology. Responsible for building and maintaining strong relationships with distributors and, where possible, the end-user clients. Uses sales, market, and relationship skills to identify growth opportunities, negotiate contracts, and work to resolve issues, driving business growth and client satisfaction.

Responsibilities

Technical / Operational
- Possess and apply detailed product knowledge as well as thorough knowledge of the client's business.
- Responsible for the direct sales process, aiming at meeting and/or exceeding sales targets.
- Oversee sales expansion, introduce new products/services to clients, and organize visits to current and potential clients.
- Submit short- and long-range sales plans and prepare sales strategies utilizing available marketing programs to reach nominated targets.
- Responsible for retaining long-term customer relationships with established clients. Ensure that clients receive high-quality customer service.
- Inform clients of new products and services as they are introduced; migrate information to the appropriate sales representative when clients have additional service needs.

Internal Systems and Processes
- Enhance knowledge of CRM Sales Force (SFDC Lightning).
- Adherence to the company's reporting deadlines and governance framework.
- Manage the development of systems and processes that ensure efficient delivery of Toxicology products and services.

Customers
- Work closely with the country business manager to help identify growth opportunities and sales direction.
- Management of end-user customer and distributor relationships.
- Involvement in distributor contract management.

Financial
- Achieve monthly, quarterly, and annual revenue targets.
- Manage delegated operational expenditure to within budget.
- Report weekly, monthly, and annually to required internal partners.

Conduct
- Ensure all activities carried out by self are in accordance with legislative employment policies, health & safety requirements, and corporate policy.
- Promote a standard of excellence for quality and customer focus at Abbott.
- Promote awareness of compliance requirements throughout the organisation.
- Uphold Abbott's Code of Business Conduct.
- Live our Abbott Values – Pioneering, Achieving, Caring, Enduring.

Reporting to: Business Manager, Toxicology India

Qualifications and Experience
Essential:
- Education level: Associate's Degree (± 13 years).
- Minimum 3 years of experience in a similar role, preferably within medical device or consumable sales, or security/police sales.
Desirable:
- Postgraduate business qualification.
- Knowledge of the Toxicology industry and major participants.

Competencies and Attributes

Technical / Operational
- Negotiation skills.
- Experienced in working with Global or Regional Marketing or Commercial Excellence.
- An innovative solutions developer and provider.
- Proven ability to develop relationships at all levels of an organization.
- Proficient in current marketing practices and principles.
- Well-developed written and verbal communication skills; highly developed presentation skills.

Internal Systems and Processes
- Proficiency in SalesForce.com & Power BI highly regarded.
- Ability to utilise business software, e.g. MS Office, MRP systems, CRM systems.
- Ability to plan and prioritise work according to business needs and change focus when required.

Customers and external stakeholders
- Strong interpersonal communication skills; highly competent oral and written communication skills.
- Highest levels of integrity and diplomacy; capacity to maintain the highest levels of confidentiality internally and externally.

Posted 1 week ago

Apply

7.0 - 8.0 years

0 Lacs

Gurgaon

On-site


Location: Gurgaon, India
Category: Digital Technology
Job ID: R87219
Posted: Jun 9th 2025
Job Available In 2 Locations

Senior Build & Automation Engineering

Do you enjoy working in collaborative teams and solving critical issues? Would you enjoy designing innovative energy products? Join our cutting-edge Software Development Team.

Baker Hughes' Digital Technology team provides and creates tech solutions to cater to the needs of our customers. As a global team we collaborate to provide cutting-edge solutions to solve our customers' problems. We support them by providing materials management, planning, inventory and warehouse solutions.

Partner with the best
As a Senior Build & Automation Engineer, you'll be responsible for automating and supporting the infrastructure and software delivery process. You'll migrate existing systems or new digital products of the organization following the BH IET digital landing zone pattern, infrastructure as code, and Azure/AWS cloud managed service capabilities.

As a Senior Build & Automation Engineer, you will be responsible for:
- Developing a deep understanding of continuous delivery (CD) theory and DevSecOps culture, concepts, and their real-world application.
- Working with CD tools and systems; you'll need intimate knowledge of their inner workings to integrate different tools and systems together in order to create fully functioning, cohesive delivery pipelines. Committing, merging, building, testing, packaging, and deploying code all come into play within the software release process.
- Ensuring that an application and the systems it runs on implement appropriate monitoring, logging, and alerting solutions. Shipping a new application to production is great, but it's even better if you know what it's actually doing. Observability tools and systems you might utilize in this space include syslog, Azure monitoring, Prometheus, Grafana, Dynatrace, and others (a minimal monitoring sketch follows below).
- Ensuring that the systems under your purview are built in a repeatable manner, using Infrastructure as Code (IaC) tools such as Azure Bicep. Using IaC ensures that cloud objects are documented as code, version controlled, and can be reliably replaced using an appropriate IaC provisioning tool.

Fuel your passion
To be successful in this role you will:
- Have a Bachelor's degree in Engineering or a technical discipline with a minimum of 7-8 years of working experience.
- Have 6-8 years of experience with DevSecOps and Identity & Access Management.
- Have experience with software configuration management tools such as Git/GitLab.
- Have experience with software development environments and CI/CD tools such as Jenkins.
- Have a good understanding of container principles and Kubernetes orchestration.
- Have knowledge of cloud computing; Azure managed services will be a plus.
- Have proficient communication skills to teach the team concepts like scalability, automation, and security, and excellent collaboration skills.
- Be able to demonstrate clarity of thinking to work through limited information and vague problem definitions.

Work in a way that works for you
We recognize that everyone is different and that the way in which people want to work and deliver at their best is different for everyone too. In this role, we can offer the following flexible working patterns: working flexible hours - flexing the times when you work in the day to help you fit everything in and work when you are the most productive.

Working with us
Our people are at the heart of what we do at Baker Hughes. We know we are better when all of our people are developed, engaged and able to bring their whole authentic selves to work. We invest in the health and well-being of our workforce, train and reward talent and develop leaders at all levels to bring out the best in each other.

Working for you
Our inventions have revolutionized energy for over a century. But to keep going forward tomorrow, we know we have to push the boundaries today. We prioritize rewarding those who embrace change with a package that reflects how much we value their input. Join us, and you can expect:
- Contemporary work-life balance policies and wellbeing activities
- Comprehensive private medical care options
- Safety net of life insurance and disability programs
- Tailored financial programs
- Additional elected or voluntary benefits

About Us: We are an energy technology company that provides solutions to energy and industrial customers worldwide. Built on a century of experience and conducting business in over 120 countries, our innovative technologies and services are taking energy forward – making it safer, cleaner and more efficient for people and the planet.

Join Us: Are you seeking an opportunity to make a real difference in a company that values innovation and progress? Join us and become part of a team of people who will challenge and inspire you! Let's come together and take energy forward. Baker Hughes Company is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law.
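To make the monitoring bullet above concrete, here is a minimal sketch of a Python process exposing Prometheus metrics via the prometheus_client library; the metric names and the simulated workload are hypothetical, and a real service would instrument its actual request handlers.

```python
# Minimal sketch: exposing Prometheus metrics from a Python service.
# Requires: pip install prometheus-client. Metric names are hypothetical.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("app_requests_total", "Total requests processed")
LATENCY = Histogram("app_request_latency_seconds", "Request latency in seconds")

def handle_request() -> None:
    """Simulate one unit of work and record its metrics."""
    with LATENCY.time():               # records elapsed time into the histogram
        time.sleep(random.uniform(0.01, 0.2))
    REQUESTS.inc()

if __name__ == "__main__":
    start_http_server(8000)            # metrics served at http://localhost:8000/metrics
    while True:
        handle_request()
```

A scrape target like this is then added to the Prometheus configuration, and Grafana dashboards or alert rules are built on the resulting series.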

Posted 1 week ago

Apply

0 years

0 Lacs

Gurgaon

On-site


Minimum qualifications:
- Bachelor's degree in Engineering, Computer Science, a related field, or equivalent practical experience.
- Experience coding with one or more programming languages (e.g., Java, C/C++, Python).
- Experience troubleshooting technical issues for internal/external partners or customers.

Preferred qualifications:
- Experience in distributed data processing frameworks and modern investigative and transactional data stores.
- Experience working with/on data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools, environments, and data structures.
- Experience in big data, information retrieval, or data mining.
- Experience in building multi-tier, high-availability applications with modern technologies such as NoSQL and MongoDB.
- Experience with Infrastructure as Code (IaC) and Continuous Integration/Continuous Delivery (CI/CD) tools like Terraform, Ansible, Jenkins, etc.
- Understanding of at least one database type, with the ability to write complex SQL queries.

About the job:
The Google Cloud Platform team helps customers transform and build what's next for their business — all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers — developers, small and large businesses, educational institutions and government agencies — see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees and partners.

As a Strategic Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and modernization projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform/product challenges. You will have an in-depth understanding of data governance and security controls. You will travel to customer sites to deploy solutions and deliver workshops to educate and empower customers. Additionally, you will work closely with Product Management and Product Engineering teams to build and constantly drive excellence in our products.

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities:
- Interact with stakeholders to translate complex customer requirements into recommendations for appropriate solution architectures and advisory services.
- Engage with technical leads and partners to lead high-velocity migration and modernization to Google Cloud Platform (GCP).
- Design, migrate/build, and operationalize data storage and processing infrastructure using cloud-native products.
- Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data.
- Take various project requirements and organize them into clear goals and objectives, and create a work breakdown structure to manage internal and external stakeholders.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago

Apply

2.0 - 4.0 years

5 - 9 Lacs

Gurgaon

On-site


As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We're a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact.

About Ciena: Ciena is a global leader in networking systems, services, and software. We build the foundational networks that connect the world, and we're passionate about driving innovation and delivering exceptional value to our customers. Join our team and be part of a company that's shaping the future of connectivity.

Job Summary: We are seeking a highly motivated and experienced Modern Endpoint Analyst. You will be responsible for supporting, maintaining, and improving the end-user hardware and OS, as well as the management, administration, and optimization of the platform. This will help Ciena ensure reliable, compliant, and flawless device experiences for our endpoints and their users. You will also participate in the support of SCCM infrastructure as well as the discovery and continued implementation of Intune-based management, enabling automation, refining security, and supporting hybrid work at scale. Critical in this role will be the ability to collaborate with internal teams and external partners, including application teams, infrastructure groups, and contract manufacturing teams.

Responsibilities

Overall
- Architect and coordinate Microsoft Intune for managing Autopilot, including device compliance, configuration profiles, app deployment, and endpoint security.
- Build scalable configuration policies and migrate legacy GPOs to new Entra ADMX and Intune environments.
- Manage and optimize SCCM infrastructure, including upgrades, role assignments, and integration with Intune for co-management.
- Drive SCCM-to-Intune migrations, encouraging modern endpoint management for Windows and macOS.
- Enforce standard methodologies for endpoint security, such as BitLocker and Conditional Access.
- Collaborate efficiently on patch and vulnerability management using tools to minimize risk.
- Manage cross-platform enterprise application delivery with a focus on secure access and user experience.

Endpoint & Application Management
- Manage and support over 10,000 endpoint devices, including provisioning, configuration, deployment, and health monitoring.
- Troubleshoot advanced issues on endpoints from laptops to Cloud PCs, including hardware, OS, network, and application-related problems.
- Support commercial off-the-shelf and custom applications as installed on endpoints.
- Collaborate with application teams to resolve system- and application-related issues.

Security, Monitoring & Reporting
- Utilize management tools to ensure endpoint protection is up to date and functional.
- Conduct regular endpoint health assessments and patch management using tools such as SCCM, Intune, Nexthink, and Jamf.
- Prepare and deliver weekly/monthly reports on endpoint status and performance.

Technical Support & Issue Resolution
- Provide Level 2/3 support for endpoint- and system-related incidents.
- Perform root cause analysis (RCA) and implement permanent resolutions.
- Work with partners and other IT streams to test, deploy, and validate system, network, or application fixes and upgrades.
- Provide timely support to end users and document issue resolution steps clearly.
- Offer off-hours/on-call support as required to prevent or minimize service disruptions.

Documentation & Projects
- Participate in IT projects, including matrix-managed initiatives, requiring endpoint expertise.
- Create and maintain user manuals, SOPs, and support documentation.
- Collaborate with users to understand functional needs and improve endpoint support effectiveness.
- Perform other technical duties as assigned; execute change management procedures without business disruption.

Process Review and Optimization
- Identify inefficiencies in existing endpoint management and support processes by conducting regular process reviews.
- Apply operational excellence principles to continuously improve service delivery, ensuring alignment with organizational goals.
- Leverage Lean practices to eliminate non-value-added tasks, reduce cycle times, and improve responsiveness.
- Recommend and implement strategies to remove process waste and automate repeatable tasks wherever possible.
- Develop and promote optimized workflows that improve the consistency, reliability, and performance of endpoint operations.
- Collaborate with peers and stakeholders to standardize best practices across teams, enhancing overall IT support maturity.

Required:
- Self-starter attitude.
- 2–4 years of experience in endpoint or system administration.
- Proficient with enterprise endpoint management tools (SCCM, Intune, Jamf).
- Strong skills in diagnosing hardware, OS (Windows), network (LAN/WAN), and endpoint issues.
- Demonstrated experience supporting endpoints in enterprise environments.
- Excellent collaboration skills with cross-functional teams and external vendors.
- Comfortable providing off-hours support as needed.
- Strong experience managing Microsoft SCCM or Intune.
- Expertise in managing Windows, iOS, macOS, and Android devices.
- Proven troubleshooting skills and excellent documentation capabilities.
- Ability to communicate effectively and conduct user training.

Preferred:
- Bachelor's degree in Information Technology, Engineering, or a related discipline.
- Experience working with mission-critical services in high-availability environments.
- Exposure to Microsoft SQL Server or web-based application support (a bonus asset).

Not ready to apply? Join our Talent Community to get relevant job alerts straight to your inbox.

At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.

Posted 1 week ago

Apply

4.0 - 7.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Introduction
In this role, you will work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. You will collaborate with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you will be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in ground-breaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.

Your Role And Responsibilities
A Data Engineer specializing in enterprise data platforms, experienced in building, managing, and optimizing data pipelines for large-scale environments, with expertise in big data technologies, distributed computing, data ingestion, and transformation frameworks. Proficient in Apache Spark, PySpark, Kafka, and Iceberg tables, and able to design and implement scalable, high-performance data processing solutions.

What you'll do: As a Data Engineer – Data Platform Services, responsibilities include:

Data Ingestion & Processing
- Designing and developing data pipelines to migrate workloads from IIAS to Cloudera Data Lake.
- Implementing streaming and batch data ingestion frameworks using Kafka and Apache Spark (PySpark); a minimal sketch follows below.
- Working with IBM CDC and Universal Data Mover to manage data replication and movement.

Big Data & Data Lakehouse Management
- Implementing Apache Iceberg tables for efficient data storage and retrieval.
- Managing distributed data processing with Cloudera Data Platform (CDP).
- Ensuring data lineage, cataloging, and governance for compliance with Bank/regulatory policies.

Optimization & Performance Tuning
- Optimizing Spark and PySpark jobs for performance and scalability.
- Implementing data partitioning, indexing, and caching to enhance query performance.
- Monitoring and troubleshooting pipeline failures and performance bottlenecks.

Security & Compliance
- Ensuring secure data access, encryption, and masking using Thales CipherTrust.
- Implementing role-based access controls (RBAC) and data governance policies.
- Supporting metadata management and data quality initiatives.

Collaboration & Automation
- Working closely with Data Scientists, Analysts, and DevOps teams to integrate data solutions.
- Automating data workflows using Airflow and implementing CI/CD pipelines with GitLab and Sonatype Nexus.
- Supporting Denodo-based data virtualization for seamless data access.

Preferred Education: Master's Degree

Required Technical And Professional Expertise
- 4-7 years of experience in big data engineering, data integration, and distributed computing.
- Strong skills in Apache Spark, PySpark, Kafka, SQL, and Cloudera Data Platform (CDP).
- Proficiency in Python or Scala for data processing.
- Experience with data pipeline orchestration tools (Apache Airflow, Stonebranch UDM).
- Understanding of data security, encryption, and compliance frameworks.

Preferred Technical And Professional Experience
- Experience in banking or financial services data platforms.
- Exposure to Denodo for data virtualization and DGraph for graph-based insights.
- Familiarity with cloud data platforms (AWS, Azure, GCP).
- Certifications in Cloudera Data Engineering, IBM Data Engineering, or AWS Data Analytics.
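As a minimal sketch of the streaming-ingestion pattern listed above — reading a Kafka topic with PySpark Structured Streaming and appending to an Iceberg table — note that the topic, table, and schema names are hypothetical, and the Kafka connector and Iceberg catalog are assumed to be configured on the cluster.

```python
# Minimal sketch: Kafka -> Iceberg with PySpark Structured Streaming.
# Assumes the spark-sql-kafka and Iceberg runtime jars plus an Iceberg
# catalog are already configured; all names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-to-iceberg").getOrCreate()

schema = StructType([
    StructField("txn_id", StringType()),
    StructField("account", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "transactions")
    .load()
    # Kafka delivers raw bytes; parse the JSON payload into typed columns.
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("iceberg")
    .option("checkpointLocation", "/chk/transactions")  # enables exactly-once restart
    .outputMode("append")
    .toTable("lake.raw.transactions")                   # hypothetical catalog.db.table
)
query.awaitTermination()
```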

Posted 1 week ago

Apply

15.0 years

0 Lacs

Bhubaneshwar

On-site


Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Business Agility
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Need Databricks resource with Azure cloud experience.
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes (a minimal sketch follows below).
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.
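For illustration of the Databricks-style ETL work this posting (and the similar postings below) describes, here is a minimal PySpark sketch of a batch step that cleans raw data and writes a partitioned Delta table; the paths, table name, and columns are hypothetical placeholders.

```python
# Minimal sketch: batch ETL into a Delta table on Databricks.
# Paths, table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.json("/mnt/raw/orders/")           # landing-zone JSON files

clean = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", to_date(col("order_ts")))
    .filter(col("amount") > 0)                      # basic data-quality rule
)

(
    clean.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")                      # partition pruning for queries
    .saveAsTable("analytics.orders_clean")
)
```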

Posted 1 week ago

Apply

15.0 years

0 Lacs

Chennai

On-site


Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Business Agility
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Need Databricks resource with Azure cloud experience.
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Chennai

On-site


Closing on: Jul 10, 2025

About Doyensys
Doyensys is a Management & Technology Consulting company with expertise in enterprise applications, infrastructure platform support, and solutions. Doyensys helps clients harness the power of innovation to thrive on change. The company leverages its technology expertise, global talent, and extensive industry experience to deliver powerful next-generation IT services and solutions. Doyensys Inc has operations in India, the US, Mexico, and Canada.

Job Requirement
Project Role: PL/SQL Developer with Microsoft Power Platform Experience
Work Experience: 4 to 6 years
Work Location: Chennai – Only Work from Office

Project Role Description
We are seeking a dynamic professional with strong hands-on experience in Oracle PL/SQL and a good working knowledge of Microsoft Power Platform tools like Power Apps, Power Automate, and Power Platform Copilot. Experience with Oracle Forms/Reports is an added advantage but not mandatory.

Technical Expertise

Must Have Skills:
- Oracle PL/SQL: Writing optimized SQL queries, stored procedures, functions, packages, and triggers. Performance tuning and debugging PL/SQL code. Knowledge of exception handling, collections, cursors, and dynamic SQL. Working with complex joins, data modeling, and relational database design. Tools: Oracle SQL Developer, TOAD, or similar IDEs. (A minimal sketch of calling a stored procedure from Python follows below.)
- Microsoft Power Apps (Canvas & Model-driven): Designing and developing custom apps using Canvas and Model-driven frameworks. Understanding of Common Data Service (Dataverse) and its schema design. Using connectors to integrate with SharePoint, SQL Server, Outlook, and other data sources. Implementing business rules, validations, forms, and custom controls.
- Microsoft Power Automate: Designing cloud flows and automated workflows triggered by events, schedules, or actions. Integration with Microsoft 365 apps (Excel, Outlook, Teams), third-party APIs, and on-premises data gateways. Error handling, conditions, and advanced expressions in flow actions.
- Microsoft Power Platform Copilot (AI-assisted development): Familiarity with using Copilot to generate app logic, formulas, and flow suggestions. Ability to validate and edit AI-generated logic as per business needs.
- General Microsoft ecosystem knowledge: SharePoint Online (as a data source or integration). Power BI (basic understanding is a plus). Microsoft Teams (integration via Power Apps or Automate).

Good To Have Skills:
- Oracle Forms & Reports: Basic understanding of Oracle Forms and Reports development and maintenance. Ability to support or migrate legacy Oracle applications.

Key Responsibilities
- Design and develop database procedures, functions, and packages using Oracle PL/SQL.
- Build custom business applications using Microsoft Power Apps (Canvas and Model-driven).
- Automate manual and repetitive workflows using Microsoft Power Automate.
- Use Power Platform Copilot to assist in app and workflow development.
- Translate business needs into functional and technical requirements.
- Perform data modeling, integration, and transformation across platforms.
- Collaborate with business users and technical teams to deliver solutions.
- (Optional) Maintain legacy applications built using Oracle Forms and Reports.

Professional Attributes
- Ability to work independently and handle tasks with minimal supervision.
- Strong team player who can collaborate and extend support to peers.
- Solution-oriented with an initiative-driven mindset.
- Willingness to take ownership and ensure quality delivery.

Educational Qualification
- MCA, B.E, M.Sc (CS) or any other equivalent degree with relevant work experience.
- Oracle SQL / PL/SQL certification.

Behavioral Attributes
- Passionate and enthusiastic about work.
- Confident in expressing opinions during discussions with team members.
- Flexible in the workplace, with an open mind to accept feedback, willingness to try new things, ability to take on additional responsibility, and readiness to extend work hours when required.
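As a minimal illustration of exercising PL/SQL from application code, here is a hedged sketch using the python-oracledb driver to call a stored procedure; the connection details and the procedure name (adjust_credit_limit) are hypothetical placeholders, not part of the posting.

```python
# Minimal sketch: invoking a PL/SQL stored procedure from Python.
# Requires: pip install oracledb. Connection details and the procedure
# name (adjust_credit_limit) are hypothetical placeholders.
import oracledb

conn = oracledb.connect(user="app_user", password="secret",
                        dsn="dbhost:1521/ORCLPDB1")

with conn.cursor() as cur:
    # OUT bind variable to receive the value computed by the procedure.
    new_limit = cur.var(oracledb.NUMBER)
    cur.callproc("adjust_credit_limit", [1042, 500.0, new_limit])
    print("new limit:", new_limit.getvalue())

conn.commit()
conn.close()
```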

Posted 1 week ago

Apply

6.0 years

0 Lacs

Delhi, India

On-site


About the Job – We are looking for an SAP BASIS Consultant.
Educational Background: Any Graduate.
Experience: 6+ years.
Location: Okhla, Delhi.

Job Description:
- Bachelor's degree in IT or equivalent work experience.
- Minimum of 5+ years' experience in SAP BASIS/NetWeaver/ABAP Application Server monitoring and administration.
- Familiar with S/4HANA environments (2022).
- Expert in investigating and resolving issues related to system performance.
- Establishing standards and requirements, evaluating and directing enhancements or upgrades, and implementing processes for performance monitoring, system configuration, design, and implementation.
- Support/perform SAP BASIS administration activities such as ID creation and transporting objects to different systems.
- Document SAP processes, procedures, and plans, including changes, upgrades, and new services.
- Work closely with clients and business stakeholders to understand their pain points and requirements.
- Install, configure, maintain, migrate, or upgrade the SAP systems as required.
- Investigate and troubleshoot SAP BASIS-related issues.
- Perform regular maintenance and performance tuning for databases and SAP systems.
- Have knowledge of system administration responsibilities.
- Possess in-depth technical skills in managing SAP systems, HANA DBs, and integration interfaces.
- Possess experience with the review and coordination of impact analysis on application requirements, including functional, security, integration, performance, quality, and operations requirements; able to translate them into technical architecture and system/server configuration requirements.
- Able to provide input into final decisions regarding hardware, network products, system software, and security.

Please share your latest CV to sucheta@volibits.com

Posted 1 week ago

Apply

0 years

0 Lacs

Rajkot, Gujarat, India

On-site


Company Description
Belief Technolab is a comprehensive company specializing in website design, web development, and app development, dedicated to supporting SMEs, startups, and ecommerce enthusiasts in establishing and expanding successful online businesses. The company utilizes premier web and app development platforms to develop, design, customize, migrate, and maintain websites and online stores.

Role Description
This is a full-time on-site role for a Business Development Executive located in Rajkot. The Business Development Executive will be responsible for new business development, lead generation, business communication, and account management to drive growth and opportunities for the company.

Qualifications
- New business development and lead generation skills
- Strong business acumen and communication skills
- Excellent interpersonal and negotiation skills
- Ability to work independently and collaboratively
- Previous experience in sales or business development roles
- Bachelor's degree in Business Administration or related field

Posted 1 week ago

Apply

5.0 years

10 Lacs

Noida

On-site


Project description
We have an ambitious goal to migrate a legacy system written in HLASM (High-Level Assembler) from the mainframe to a cloud-based Java environment for one of the largest banks in the USA.

Responsibilities
We are looking for an experienced Java Developer who can help perform the migration of the client platform:
- Write Java code following the new architecture.
- Troubleshoot, debug, and resolve issues within the new Java system.
- Collaborate with client teams to ensure alignment with project goals and deliver high-quality solutions.
- Maintain and fine-tune the Gen AI application that supports the migration process (a hedged sketch of such a translation call follows below).
- Mandatory work from the office 5 days per week.

Skills

Must have:
- Proficiency in Java development (5+ years).
- Deep understanding of enterprise application architecture patterns.
- Strong problem-solving and debugging skills.
- Experience working in distributed teams with US customers.
- Excellent communication skills for collaboration with the client teams.

Nice to have:
- Experience with cloud platforms and services (preferably AWS).
- Experience with the Python language.
- Familiarity with large-scale system migrations and modernization efforts.
- Experience with HLASM or other low-level programming languages.
- Familiarity with Generative AI.
- Prior experience working in the banking or financial services industry.
- Knowledge of performance tuning and optimization in cloud environments.

Other
Languages: English B2 Upper Intermediate
Seniority: Senior
Location: Noida, India
Industry: Java, BCM
Posted: 09/06/2025
Req. VR-107927
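The posting does not describe how its Gen AI migration assistant works; purely as a hedged illustration, here is a minimal Python sketch that asks an LLM (via the OpenAI client library) to translate an assembler fragment into Java. The model name, prompt, and workflow are assumptions, not the project's actual tooling.

```python
# Hedged sketch only: LLM-assisted HLASM -> Java translation.
# Not the project's actual tooling; the model name and prompt are assumptions.
# Requires: pip install openai, and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

hlasm_fragment = """
LOOP     L     R2,COUNT       Load loop counter
         BCT   R2,LOOP        Branch on count
"""

response = client.chat.completions.create(
    model="gpt-4o",  # hypothetical model choice
    messages=[
        {"role": "system",
         "content": "You translate HLASM fragments into idiomatic Java. "
                    "Preserve behavior; comment each mapping decision."},
        {"role": "user", "content": hlasm_fragment},
    ],
)

# Generated code would still need human review, compilation, and testing.
print(response.choices[0].message.content)
```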

Posted 1 week ago

Apply

15.0 years

0 Lacs

Indore

On-site


Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Business Agility
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Need Databricks resource with Azure cloud experience.
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

0 years

0 Lacs

Vadodara, Gujarat, India

On-site


Job Title: Odoo Developer
Location: Onsite – Vadodara
Job Type: Full-time

Key Responsibilities:
- Design, develop, and customize Odoo modules (preferably version 17); a minimal model sketch follows below.
- Work on both backend (Python, PostgreSQL) and frontend (HTML, JS, XML) development.
- Develop and maintain custom reports, dashboards, and integrations with third-party services.
- Troubleshoot issues and ensure smooth performance of Odoo systems.
- Collaborate with functional consultants to gather requirements and implement appropriate technical solutions.
- Upgrade and migrate existing Odoo implementations to newer versions when required.
- Write clean, maintainable, and reusable code.

Preferred Skills & Qualifications:
- Proven experience as an Odoo Developer; version 17 preferred but not mandatory.
- Strong knowledge of Python and PostgreSQL.
- Experience with Odoo's ORM, workflows, and security mechanisms.
- Familiarity with Git or other version control systems.
- Basic understanding of JavaScript, HTML, CSS for UI customizations.
- Good analytical and problem-solving skills.
- Strong communication and teamwork abilities.
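As a minimal sketch of the custom-module work described above, here is a hypothetical Odoo model definition using the standard ORM; the model name and fields are illustrative only, not tied to any particular project.

```python
# Minimal sketch: a custom Odoo model (e.g. models/library_book.py in a
# hypothetical addon). Model and field names are illustrative only.
from odoo import api, fields, models
from odoo.exceptions import ValidationError


class LibraryBook(models.Model):
    _name = "library.book"                 # technical model name
    _description = "Library Book"

    name = fields.Char(string="Title", required=True)
    isbn = fields.Char(string="ISBN")
    date_published = fields.Date()
    page_count = fields.Integer()
    is_available = fields.Boolean(default=True)

    @api.constrains("page_count")
    def _check_page_count(self):
        # Validation hook the ORM runs on create/write.
        for book in self:
            if book.page_count and book.page_count <= 0:
                raise ValidationError("Page count must be positive.")
```

A module like this would also ship a manifest, security access rules, and XML views; the ORM handles persistence to PostgreSQL.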

Posted 1 week ago

Apply

2.0 - 3.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote


Job Description – Digital Transformation and Automation Lead

About the Role
- Drive the digital backbone of a growing commercial real-estate group.
- You'll prototype, test, and ship automations that save our teams > 10 hours/week in the first 90 days.

Total Experience: 2-3 years
Availability: ~40 hrs/week, 4 days on-site, 1 day remote

Core Responsibilities
1. Systems Audit & Consolidation – unify Google Workspace tenants, rationalise shared drives.
2. Database & CRM Build-out – design, deploy, and maintain an occupant tracker and a lightweight CRM; migrate legacy data.
3. Automation & Integration – link CRM, Google Sheets, and Tally using Apps Script/Zoho Flow/Zapier (a minimal Sheets-automation sketch follows below).
4. Process Documentation – own the internal wiki; keep SOPs and RACI charts current.
5. Dashboards & Reporting – craft Looker Studio boards for collections, projects, and facility KPIs.
6. User Training & Support – deliver monthly clinics; teach teams how to use G Suite and ChatGPT to improve productivity.
7. Security & Compliance – enforce 2FA, backup policies, and basic network hygiene.
8. Vendor Co-ordination – liaise with Zoho, Tally consultants, and ISP/MSP vendors; manage small capex items.

Required Skills & Experience (★ = Core, • = Bonus)
Workspace & Security
- ★ LAN/Wi-Fi basics & device hardening
Automation & Low-Code
- ★ Apps Script or Zoho Creator/Flow; REST APIs & webhooks
- ★ Workflow bridges (Zapier / Make / n8n)
- • Cursor, Loveable, or similar AI-driven low-code tools
Data Extraction & Integrations
- ★ Document AI / OCR stack for PDF leases (Google DocAI, Textract, etc.)
- ★ Tally Prime ODBC/API
CRM & Customer-360
- ★ End-to-end rollout of a CRM (Zoho/Freshsales): migration, custom modules
- • Help-desk tooling (Zoho Desk, Freshdesk)
Analytics & Reporting
- ★ Advanced Google Sheets (ARRAYFORMULA, QUERY, IMPORTRANGE) and Looker Studio dashboards
- • Data-warehouse concepts (BigQuery/Redshift) for a unified customer view
Programming & Scripting
- ★ Python or Node.js for lightweight cloud functions / ETL
- ★ Prompt-engineering & Gen-AI APIs (OpenAI, Claude) for copilots
Project & Knowledge Management
- • Trello (or equivalent Kanban)
- ★ Notion / Google Sites for wiki & SOPs
Soft Skills
- ★ Clear documentation & bilingual (English/Hindi) training; stakeholder comms

Compensation: ₹40–50k per month
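To illustrate the Sheets-centric automation this role calls for, here is a minimal hedged Python sketch that reads rows from a Google Sheet with the official API client; the spreadsheet ID, range, column layout, and service-account file are hypothetical, and Apps Script or Zapier could fill the same role.

```python
# Minimal sketch: reading a rent-collections range from Google Sheets.
# Requires: pip install google-api-python-client google-auth.
# Spreadsheet ID, range, columns, and credentials path are hypothetical.
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets.readonly"]
creds = Credentials.from_service_account_file("service-account.json", scopes=SCOPES)

sheets = build("sheets", "v4", credentials=creds)
result = (
    sheets.spreadsheets()
    .values()
    .get(spreadsheetId="HYPOTHETICAL_SPREADSHEET_ID", range="Collections!A2:D")
    .execute()
)

# Assumed columns: tenant, unit, amount, status.
rows = result.get("values", [])
overdue = [r for r in rows if len(r) > 3 and r[3] == "OVERDUE"]
for tenant, unit, amount, _ in overdue:
    print(f"{tenant} ({unit}) owes {amount}")
```

The same read could feed a Looker Studio data source or trigger a reminder flow in Zoho Flow/Zapier.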

Posted 1 week ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Title: Technical Lead - Architecture
Location: Gurugram-90C
Department: Digital Technology
Function: Enterprise Architecture and Solutions
Reporting to: Sr. Solution Architect
Band: 4A

About Axis Max Life Insurance
Axis Max Life Insurance Limited, formerly known as Max Life Insurance Company Ltd., is a Joint Venture between Max Financial Services Limited ("MFSL") and Axis Bank Limited. Axis Max Life Insurance offers comprehensive protection and long-term savings life insurance solutions through its multi-channel distribution, including agency and third-party distribution partners. It has built its operations over two decades through a need-based sales process, a customer-centric approach to engagement and service delivery, and trained human capital. For more information, please visit the company website at www.maxlifeinsurance.com.

Job Summary
Responsible for providing deep technical expertise in designing and delivering end-to-end high-performance, scalable, and flexible solutions using cutting-edge/emerging technologies, including mobility/responsive web apps, APIs, application and data integration, scalable databases, analytics, DevOps, and cloud computing (IaaS, PaaS, and containerization). He/she would also be responsible for experimenting with, exploring, and demonstrating the application of new technologies by conducting quick prototypes to solve business problems. Work with direction from the Solution Architect/Enterprise Architect and co-create solutions with the rest of the IT delivery teams.

Key Responsibilities
- Work closely with various business partners and other departments to create future-proof solutions covering digital, automation, APIs, integration, and data.
- Provide technical expertise in solving/troubleshooting performance and other non-functional requirements.
- Design integrations and patterns in accordance with architectural standards, and drive changes to standards, policies, and procedures based on input from service managers and service partners.
- Support critical projects in all phases of delivery on a need basis.
- Review application, integration, and solution architecture to analyze the current IT ecosystem and develop opportunities for improvements.
- Experiment, explore, and demonstrate the application of new technologies by conducting quick prototypes to solve business problems.
- Maintain a good technical relationship with partners' (banca/broker/aggregator/vendor) technical teams.

Other Responsibilities
- Defining and reviewing continuous delivery, continuous integration, and continuous testing (DevOps pipelines) that serve the purpose of provisioning and quality code delivery.
- Stakeholder management at strategic levels in both technical and business functions.
- Focus on continuous service improvement and thought leadership; drive strategic initiatives to help business and partners achieve their goals.

Measures of Success
- Alignment of the IT landscape to the overall vision and blueprints.
- Faster, better, and cheaper delivery of applications via technology.
- Exceptional user (internal/external) experience, automation, and operational efficiency through adoption of new cutting-edge technology solutions to solve business problems.
- Trusted partnership with other departments of IT and business.
- Staying up to date with emerging technologies, industry trends, and best practices.

Key skills required
- Extensive experience with technologies such as Java frameworks/JavaScript, databases, queues, streams, AWS cloud serverless, and containers.
- At least 2 years of designing hybrid cloud applications and migrating existing workloads to the cloud; able to recommend the appropriate AWS service based on data, compute, database, or security requirements.
- Demonstrated competency with application and data integration platforms (MuleSoft, Apigee, EDW, etc.) and patterns – SOA, APIs, web services, microservices, ETL, event processing, BPM, ESB.
- Understanding of the BFSI domain, open architecture, and application integration discipline, concepts, and best practices.
- Define cost control mechanisms by suggesting architectural changes and optimizing resource utilization.
- Any prior experience with implementation of AI or data analytics will be a plus.

Desired qualification and experience
- B.E. / B.Tech / MCA.
- 7-10 years of experience with a hands-on software development background.
- Should have architected and delivered at least 2-3 large projects in a technology organization.

Posted 1 week ago

Apply

0.0 - 2.0 years

0 Lacs

Noida, Uttar Pradesh

On-site


As a PCB Design Engineer at Grid OS, you will be responsible for schematic entry, PCB layout, and related design activities, with strong expertise in analog and high-speed digital design. This role demands attention to detail, problem-solving capabilities, and a proactive mindset to ensure the manufacturability, functionality, and quality of PCB designs across various applications.

Key Responsibilities:
- Perform schematic entry and PCB layout design.
- Design analog circuits and power supply layouts.
- Develop high-speed digital layout designs, including interfaces such as PCIe, USB, DDR3, etc.
- Derive PCB stack-ups and ensure adherence to signal and power integrity best practices.
- Understand and incorporate various I/O functionalities into designs.
- Create and verify footprints according to IPC standards.
- Conduct Gerber verification and ensure quality releases of Gerber files, BOMs, and drawings.
- Generate and maintain PCB design specifications and documentation.
- Ensure designs meet manufacturability and testability standards.
- Provide support and resolve technical queries related to PCB design tools.
- Collaborate with hardware design and development teams during board bring-up and testing phases.
- Perform CAM validation and liaise with PCB manufacturers and assembly units.
- Contribute to tool migration initiatives between different PCB design platforms.

Requirements:
- Proficient in PCB design and schematic capture using tools like Altium Designer, Cadence, Mentor Graphics, or Protel 99 (mandatory).
- Experience with analog simulation tools such as LTspice.
- Skilled in symbol and footprint creation, drafting, DRC, and layout verification.
- Hands-on experience with CAM350 for PCB fabrication verification and optimization.
- Knowledge of HyperLynx for signal integrity analysis.
- Expertise in designing single-, double-, and multi-layer, and flex PCBs, including high-speed, mixed-signal, power, and RF boards.
- Strong understanding of thermal management, EMI/EMC considerations, and signal/power integrity fundamentals.
- Ability to derive PCB stack-ups and apply constraint settings effectively.
- Basic knowledge of mechanical CAD tools such as AutoCAD and SolidWorks.
- Familiarity with industry standards, manufacturing practices, and compliance regulations.
- Ability to migrate design projects across different PCB design tools.
- Detail-oriented with a strong focus on design accuracy, manufacturability, and documentation.
- Strong decision-making skills to resolve design challenges and optimize PCB performance.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.

Job Type: Full-time
Pay: ₹200,000.00 - ₹500,000.00 per year
Benefits: Flexible schedule, paid sick time, paid time off
Schedule: Day shift
Ability to commute/relocate: Noida, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Required)
Experience: PCB Designing: 2 years (Required); AutoCAD: 2 years (Required)
Work Location: In person

Posted 1 week ago

Apply

7.5 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Business Agility
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- A Databricks resource with Azure cloud experience is needed.
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.
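
To make the pipeline duties in this role concrete, here is a minimal PySpark sketch of the kind of extract-transform-load job these Databricks postings describe. The paths, table names, and quality rules are hypothetical placeholders, not details from the posting.

    # Sketch: minimal ETL on Databricks/PySpark -- extract raw CSV,
    # apply basic quality rules, load a Delta table. All names are
    # hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("customer-etl").getOrCreate()

    # Extract: read raw source data from a hypothetical landing path.
    raw = spark.read.option("header", True).csv("/mnt/landing/customers.csv")

    # Transform: drop duplicate keys, reject rows missing the primary
    # key, and normalize email casing.
    clean = (
        raw.dropDuplicates(["customer_id"])
           .filter(F.col("customer_id").isNotNull())
           .withColumn("email", F.lower(F.trim(F.col("email"))))
    )

    # Load: write a Delta table for downstream consumers.
    clean.write.format("delta").mode("overwrite").saveAsTable("silver.customers")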

Posted 1 week ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Business Agility
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- A Databricks resource with Azure cloud experience is needed.
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Business Agility
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- A Databricks resource with Azure cloud experience is needed.
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

7.5 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Business Agility
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- A Databricks resource with Azure cloud experience is needed.
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Business Agility
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- A Databricks resource with Azure cloud experience is needed.
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Data Officer
Job #: req33197
Organization: World Bank
Sector: Information Technology
Grade: GF
Term Duration: 2 years 0 months
Recruitment Type: Local Recruitment
Location: Chennai, India
Required Language(s): English
Preferred Language(s):
Closing Date: 6/10/2025 (MM/DD/YYYY) at 11:59pm UTC

Description
Do you want to build a career that is truly worthwhile? Working at the World Bank Group provides a unique opportunity for you to help our clients solve their greatest development challenges. The World Bank Group is one of the largest sources of funding and knowledge for developing countries; a unique global partnership of five institutions dedicated to ending extreme poverty, increasing shared prosperity, and promoting sustainable development. With 189 member countries and more than 130 offices worldwide, we work with public and private sector partners, investing in groundbreaking projects and using data, research, and technology to develop solutions to the most urgent global challenges. For more information, visit www.worldbank.org

ITS Vice Presidency Context
The Information and Technology Solutions (ITS) Vice Presidential Unit (VPU) enables the World Bank Group to achieve its mission of ending extreme poverty and boosting shared prosperity on a livable planet by delivering transformative information and technologies to its staff working in more than 150 locations. For more information on ITS, see this video: https://www.youtube.com/watch?reload=9&v=VTFGffa1Y7w

ITS shapes its strategy in response to changing business priorities and leverages new technologies to achieve three high-level business outcomes: business enablement, by providing Bank Group units with innovative digital tools and technologies to transform how they deliver value for their clients; empowerment and effectiveness, by ensuring that all Bank Group staff are connected, able to find information, and productive in order to accelerate the delivery of development solutions globally; and resilience, by equipping the Bank Group to provide risk-based cybersecurity and robust data protection for a global network and a growing cloud platform. Implementation of the strategy is guided by three core principles. The first is to deliver solutions for business partners that are customer-centric, innovative, and transformative. The second is to provide the Bank Group with value for money through selective and standard technologies. The third is to excel at the basics by providing a high-performing, robust, and resilient IT environment for the organization.

As a unit within the WB Data, Operations and Technology office (ITSDO) Corporate (ITSOC), the Data and Analytics unit (ITSDA) provides state-of-the-art information and technology applications to support the operations of the World Bank Group. Its functions span data, information management, and AI solutions, ensuring that systems meet the business needs of users and external clients and support business processes for stakeholders across the World Bank. The current technology landscape encompasses cloud-based data platforms (Azure and AWS), Oracle, SQL Server, Business Objects, Tableau, Cisco Information Server (Composite), SAP BW/HANA, Informatica, .NET, HTML5, CSS frameworks, SharePoint, and many others. Our plan is to migrate our on-prem data repositories and re-engineer them on new cloud architectures in the coming years.

Responsibilities
- Perform data analysis and create reusable assets for our Data & Analytics Portal, i.e., dashboards, data visualizations, and reports, including ad-hoc requests from clients.
- Analyze large datasets to identify trends, patterns, and insights, using appropriate tools and techniques.
- Quickly grasp business insights and navigate the data structures to assess issues.
- Reverse engineer reports, dashboards, and applications through the medallion architecture to understand the business logic, and document it.
- Work with cross-functional teams to understand data needs and provide analytical support.
- Develop solutions based on data analysis to address business challenges, and identify opportunities for process improvements through data insights.
- Document data processes, methodologies, and findings for future reference, and maintain clear records of data sources and analysis methods.
- Identify and categorize source data (where the data originates) and establish a clear mapping between source and target fields.
- Analyze how changes will affect existing processes or systems, and identify stakeholders impacted by data migration or integration.
- Develop validation checks to ensure data integrity post-migration, and conduct testing to confirm that the target meets requirements (a minimal validation sketch appears after this posting).
- Maintain comprehensive documentation of the analysis process, recording decisions made, issues encountered, and resolutions.
- Work closely with data engineers to understand the target structures and design a semantic layer conducive to analytics.
- Work closely with the data governance team and business stakeholders to document data element metadata and report metadata.
- Compare source and target data structures to identify discrepancies, and assess data quality issues such as duplicates, missing values, or inconsistencies.
- Develop test plans, test scripts, and automation procedures to test data and report quality.
- Contribute to, develop, and maintain the Enterprise Data Model.
- Actively seek the knowledge needed to complete assignments and share it with others, communicating and presenting information in a clear and organized manner.
- Develop, maintain, and support AI/ML models for various analytical and data science needs.

Selection Criteria
- Master's degree with 5 years' experience, or an equivalent combination of education and experience in a relevant discipline such as Computer Science.
- Minimum 3 years of experience in each of the following areas: (i) SQL, Python, R, or another programming language; (ii) reports and dashboards; (iii) building analytics and troubleshooting issues; (iv) data analysis.
- Ability to understand business requirements, decode them into data needs, correlate them with business processes, and develop reporting, data requirements, data models, etc.
- Excellent, proven skills in data modelling and data integration, and an understanding of different approaches to designing data schemas.
- Hands-on experience with cloud platforms covering the Power BI platform and Tableau, and with on-prem Business Objects.
- Hands-on experience building a semantic layer encompassing complex logic to accommodate reporting requirements.
- Good understanding of SAP BW/SAP HANA structures and of decoding business rules for migration to modern cloud platforms.
- Advanced SQL programming skills for performing complex operations on large amounts of data stored in data warehouses or lakes.
- Strong understanding of row- and column-level security in the semantic layer to facilitate smooth reporting.
- Work with application team leads to refine and tighten the security framework and access control for internal and external data access points.
- Ability and flexibility to learn and adapt to a spectrum of data technologies running on multiple platforms, primarily semantic-layer modelling, report building, APIs, and dashboards.
- Knowledge of building data warehouse applications in a hybrid environment, both on-cloud and on-prem, and the ability to keep up to date with cloud offerings and solutions in a global delivery environment.
- Ability to participate and collaborate within and across teams in developing options, roadmaps, evaluations, and decision frameworks for complex enterprise solutions.
- Deep experience implementing and maintaining tools such as Informatica Intelligent Cloud Services (IICS), Tableau, TIBCO Data Virtualization, Collibra, Informatica MDM, Databricks, NoSQL databases, PostgreSQL, and Azure technologies is preferable.
- Experience working on AI/ML and data science models is preferred.
- Proven experience evaluating best-of-breed data and analytics tools and working closely with the leadership team on their pros and cons is preferred.
- Proven experience working in teams with an offshore/onsite model and collaborating across teams to build and maintain complex IT landscapes for diverse client bases.
- Experience in finance, human resources, resource management, loans, and travel is preferred.
- Experience writing unit/integration tests, working in an agile, iterative approach to building products, and documenting work.
- Ability to deliver information effectively in support of a team or workgroup.
- Excellent communication, writing/documentation, and facilitation skills.
- Ability to juggle multiple tasks in a fast-paced environment, and the maturity to participate in multiple complex programs at the same time in an agile environment.

World Bank Group Core Competencies
The World Bank Group offers comprehensive benefits, including a retirement plan; medical, life, and disability insurance; and paid leave, including parental leave, as well as reasonable accommodations for individuals with disabilities. We are proud to be an equal opportunity and inclusive employer with a dedicated and committed workforce, and we do not discriminate based on gender, gender identity, religion, race, ethnicity, sexual orientation, or disability. Learn more about working at the World Bank and IFC, including our values and inspiring stories.
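
As a minimal sketch of the post-migration validation responsibility referenced in this posting, the snippet below compares row counts and an order-independent key checksum between a source and a target table. The SQLite connections and table/column names are stand-ins for whatever databases a real migration involves.

    # Sketch: post-migration integrity check -- compare row count and an
    # order-independent checksum of the key column between source and
    # target. Database files and names are hypothetical.
    import hashlib
    import sqlite3

    def table_fingerprint(conn, table: str, key: str):
        """Return (row_count, checksum) for one table."""
        rows = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        checksum = 0
        for (value,) in conn.execute(f"SELECT {key} FROM {table}"):
            digest = hashlib.md5(str(value).encode()).hexdigest()
            checksum = (checksum + int(digest, 16)) % (2 ** 64)
        return rows, checksum

    source = sqlite3.connect("source.db")  # hypothetical source system
    target = sqlite3.connect("target.db")  # hypothetical migrated copy

    src = table_fingerprint(source, "customers", "customer_id")
    tgt = table_fingerprint(target, "customers", "customer_id")
    assert src == tgt, f"post-migration mismatch: source={src}, target={tgt}"
    print("row count and key checksum match:", src)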

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Business Agility
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- A Databricks resource with Azure cloud experience is needed.
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

Exploring Migrate Jobs in India

The job market for migration professionals in India is currently thriving, with numerous opportunities across a range of industries. Whether you are just starting your career or looking to make a transition, migration roles can offer a rewarding career path with growth opportunities.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

These cities are known for their booming IT sectors and have a high demand for migration professionals.

Average Salary Range

The average salary range for migration professionals in India varies by experience level. Entry-level positions typically earn around INR 3-5 lakhs per annum, while experienced professionals can command INR 10-15 lakhs per annum or more.

Career Path

A typical career path in the migration field starts as a Junior Developer, progresses to Senior Developer, and then moves into a Tech Lead role. With experience and expertise, one can advance further to roles such as Solution Architect or Project Manager.

Related Skills

In addition to migration skills, professionals in this field are often expected to have knowledge of related areas such as cloud computing, database management, programming languages like Java or Python, and software development methodologies.

Interview Questions

  • What is data migration and why is it important? (basic)
  • Can you explain the difference between ETL and ELT processes? (medium; see the first sketch after this list)
  • How do you handle data quality issues during migration? (medium)
  • What tools have you used for data migration in your previous projects? (basic)
  • Describe a challenging data migration project you worked on and how you overcame obstacles. (medium)
  • What are the different types of data migration strategies? (advanced)
  • How do you ensure data integrity during the migration process? (medium)
  • Explain the concept of data mapping in the context of data migration. (basic)
  • Have you worked with any data migration automation tools? If so, which ones? (medium)
  • What are some common challenges faced during data migration projects and how do you address them? (medium)
  • How do you prioritize data migration tasks in a project with tight deadlines? (medium)
  • Can you discuss the role of metadata in data migration? (advanced)
  • What are some best practices for data migration to ensure project success? (medium)
  • How do you handle data security and compliance issues during migration? (medium)
  • What considerations should be taken into account when migrating data to a cloud environment? (medium)
  • Explain the concept of data deduplication and its importance in data migration. (medium; see the second sketch after this list)
  • What role does data profiling play in the data migration process? (medium)
  • How do you ensure data accuracy and consistency post-migration? (medium)
  • Have you worked on any data migration projects involving legacy systems? If so, how did you approach them? (medium)
  • What is the difference between schema migration and data migration? (medium)
  • Describe a time when you had to roll back a data migration process. How did you handle it? (medium)
  • How do you handle stakeholder communication during a data migration project? (basic)
  • What are the key metrics you use to measure the success of a data migration project? (medium)
  • Can you explain the concept of data lineage and its importance in data migration? (advanced)
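
First, a self-contained sketch for the ETL vs. ELT question above: the same cleanup expressed both ways, against an in-memory SQLite database standing in for a warehouse. The table and column names are hypothetical.

    # ETL vs ELT on the same toy data. ETL transforms before loading;
    # ELT loads raw rows first and transforms inside the warehouse.
    import sqlite3

    rows = [(1, " A@X.COM "), (1, " A@X.COM "), (2, None), (3, "c@x.com")]
    wh = sqlite3.connect(":memory:")

    # ETL: clean in the pipeline, then load only the final table.
    clean = {(cid, email.strip().lower()) for cid, email in rows if email}
    wh.execute("CREATE TABLE etl_customers (customer_id, email)")
    wh.executemany("INSERT INTO etl_customers VALUES (?, ?)", sorted(clean))

    # ELT: load raw rows as-is, then transform with SQL in the warehouse.
    wh.execute("CREATE TABLE raw_customers (customer_id, email)")
    wh.executemany("INSERT INTO raw_customers VALUES (?, ?)", rows)
    wh.execute("""
        CREATE TABLE elt_customers AS
        SELECT DISTINCT customer_id, LOWER(TRIM(email)) AS email
        FROM raw_customers
        WHERE email IS NOT NULL
    """)

    print(wh.execute("SELECT * FROM etl_customers").fetchall())
    print(wh.execute("SELECT * FROM elt_customers").fetchall())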

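Second, a sketch for the deduplication question: collapse duplicate records on a business key, keeping the most recently updated row. The DataFrame contents and column names are invented for illustration.

    # Deduplication by business key, keeping the newest record per key.
    import pandas as pd

    records = pd.DataFrame({
        "customer_id": [1, 1, 2, 3, 3],
        "email": ["a@x.com", "a@x.com", "b@x.com", "c@x.com", "c@x.co"],
        "updated_at": pd.to_datetime(
            ["2024-01-01", "2024-03-01", "2024-02-01", "2024-01-15", "2024-02-20"]
        ),
    })

    # Sort so the newest row per key comes last, then drop earlier
    # duplicates of the same customer_id.
    deduped = (
        records.sort_values("updated_at")
               .drop_duplicates(subset="customer_id", keep="last")
               .reset_index(drop=True)
    )
    print(deduped)
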
Closing Remark

As you explore opportunities in the migration job market in India, remember to showcase your skills and experience confidently during interviews. Prepare thoroughly, stay up to date on industry trends, and demonstrate your passion for data migration. Best of luck on your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies