0 years
4 - 10 Lacs
Hyderābād
On-site
- Bachelor's degree in computer science, engineering, mathematics or equivalent, or experience in a professional field - Experience in architecting solutions leveraging Data Analytics, machine learning, AI and Generative AI. - Experience in design, implementation, or consulting in applications and infrastructures - Experience communicating across technical and non-technical audiences, including executive level stakeholders or clients - Knowledge of AWS services, market segments, customer base and industry verticals AWS Sales, Marketing, and Global Services (SMGS) is responsible for driving revenue, adoption, and growth from the largest and fastest growing small and mid-market accounts to enterprise-level customers including public sector. As an Amazon Web Services (AWS) Solutions Architect in the FSI segment, you are responsible for partnering with our most valuable customers to design cloud architectures utilizing AWS services. You have technical depth, business acumen, and the ability to lead in-depth technology discussions, while articulating the business value of the AWS platform and services. Effective communication and interpersonal skills are required for engaging and influencing Enterprise Architects, Technical Decision Makers, Cloud Architects, Directors, VPs, and CXOs. You will partner with some of the world's largest companies to craft highly scalable, flexible and resilient cloud architectures that address customer business problems and accelerate the adoption of AWS services. In collaboration with account managers, you will assist in driving growth across a small set of global customers in your defined country. As a trusted customer advocate, the solutions architect will help organizations understand best practices around advanced cloud-based solutions, including how to migrate workloads to the cloud. You will have the opportunity to help shape and execute a strategy to build mind-share and broad use of AWS within enterprise customers. The ability to connect technology with measurable business value is critical to a solutions architect. You should also have a demonstrated ability to think strategically about business, products, and technical challenges. Here are some other qualities we are looking for: At AWS, we have a credo of "Work hard. Have fun. Make history." In this role, you will love what you do and instinctively know how to make work fun. You will be dynamic and creative, and willing to take on any challenge and make a big impact. You will enjoy working with large, global customers as an active contributor to a diverse team. You will have a passion for educating, training, designing, and building cloud solutions for a diverse and challenging set of enterprise customers. You will enjoy keeping your existing technical skills honed and developing new ones, so you can make strong contributions to deep architecture discussions. You will regularly take part in deep-dive education and design exercises to create world-class solutions built on AWS. Key job responsibilities - Ensure success in building and migrating applications, software and services on the AWS platform - Architect solutions leveraging AWS Data and Analytics, machine learning, AI and Generative AI specific services, working closely with customers to deeply understand their business needs and design technical solutions that optimize the use of the AWS Cloud platform.
- Partner with an account team to understand the customer's business and desired outcomes, and work backwards from those to deliver technical solutions that meet your customer's needs. - Educate customers on the value proposition of AWS, and participate in deep architectural discussions to ensure solutions are designed for successful deployment in the cloud - Conduct one-to-few and one-to-many training sessions to transfer knowledge to customers considering or already using AWS - Capture and share best-practice knowledge amongst the AWS solutions architect community - Author or otherwise contribute to AWS customer-facing publications such as blogs or prescriptive guidance - Build deep relationships with senior technical individuals within customers to enable them to be cloud advocates - Act as a technical liaison between customers, service engineering teams and support About the team Diverse Experiences Amazon values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn't followed a traditional path, or includes alternative experiences, don't let it stop you from applying. Why AWS Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that's why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Work/Life Balance We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why flexible work hours and arrangements are part of our culture. When we feel supported in the workplace and at home, there's nothing we can't achieve in the cloud. Inclusive Team Culture Here at AWS, it's in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empowers us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences, inspire us to never stop embracing our uniqueness. Mentorship and Career Growth We're continuously raising our performance bar as we strive to become Earth's Best Employer. That's why you'll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. Cloud Technology Certification (such as Solutions Architecture, Cloud Security Professional or Cloud DevOps Engineering) Experience of working with financial services customers Experience in infrastructure architecture, database architecture and networking Experience in migrating or transforming legacy customer solutions to the cloud Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Posted 2 weeks ago
2.0 years
0 - 0 Lacs
India
On-site
We are looking for a highly skilled Shopify & WordPress Developer to manage, maintain, and enhance both our Shopify and WordPress websites. This role involves development, performance optimization, troubleshooting, and close collaboration with marketing and design teams to deliver seamless user experiences. Responsibilities: - Convert Figma designs into responsive web pages using WordPress (Elementor, HTML, CSS, JS) and/or Shopify (Liquid, HTML, CSS, JS). - Customize themes, plugins, and apps for WordPress/Shopify. - Integrate third-party APIs and resolve conflicts. - Test features for functionality and design accuracy. - Migrate and manage multiple websites. - Implement security measures and provide ongoing support. - Collaborate with team members and meet deadlines. Qualifications: - 2 years of experience with WordPress and/or Shopify development. - Strong knowledge of HTML, CSS, JavaScript, AJAX, jQuery. - Experience with Elementor and Liquid (Shopify). - Proficiency in PHP and database management (MySQL/Shopify equivalents). - Familiarity with RESTful APIs and GraphQL. - Ability to translate Figma designs into responsive, pixel-perfect websites. - Experience working on international projects. Why Us?: - Global Exposure: Work with clients worldwide. - Creative Freedom: Implement innovative ideas. - Learning & Development: Access workshops and certifications. - Collaborative Culture: A creative, supportive team environment. Job Types: Full-time, Permanent Pay: ₹25,000.00 - ₹35,000.00 per month Benefits: Leave encashment Schedule: Day shift Work Location: In person
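For the API-integration side of a role like this, the snippet below is a minimal, illustrative sketch: it pulls recent posts from a WordPress site through the core REST API using Python. The site URL is a placeholder and authentication is omitted, since the default posts endpoint is public; theme and Liquid work would of course live in PHP/Liquid rather than Python.

```python
# Minimal sketch: fetch recent posts from a WordPress site via the core REST API.
# The site URL is a placeholder; authentication is omitted for public endpoints.
import requests

SITE = "https://example.com"  # hypothetical WordPress site

def recent_posts(limit: int = 5):
    resp = requests.get(f"{SITE}/wp-json/wp/v2/posts", params={"per_page": limit}, timeout=10)
    resp.raise_for_status()
    for post in resp.json():
        # Each post exposes a rendered title and a permalink.
        print(post["title"]["rendered"], "->", post["link"])

if __name__ == "__main__":
    recent_posts()
```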
Posted 2 weeks ago
0 years
0 Lacs
Chennai
On-site
Develop Business Objects reports and dashboard solutions for a wide variety of business intelligence projects and write functional and technical specs Solid technical background and experience in reporting, Business Intelligence applications Understand and implement solutions in support of physical data models necessary to support business intelligence reporting initiatives Design, develop, test, and support SQL stored procedures, functions Communicate directly with other BI team members to confirm requirements and clarify business rules Define and design the universe, BI Dashboards and reports Develop, support and improve existing BO systems including both Universe and Reports Responsible for documenting modifications of pre-existing development and new development for peer reference and knowledge transfer At least basic Linux/Weblogic skills BO Java SDK beneficial but not essential (in case the developer needs to migrate the workflow tool) Strong experience in developing Business Objects reports and dashboard solutions Design, develop, test, and support SQL stored procedures, functions Define and design the universe, BI Dashboards and reports About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 2 weeks ago
2.0 years
6 - 10 Lacs
Chennai
On-site
- 2+ years of software development, or 2+ years of technical support experience - Bachelor's degree in engineering or equivalent - Experience troubleshooting and debugging technical systems - Experience in Unix - Experience scripting in modern programming languages This role is part of the Rekindle returnship program. (Note: for more details on the Rekindle program, please visit https://www.amazon.jobs/en/landing_pages/rekindle.) Work hard. Have fun. Make history. The mission of the Catalog Support and Programs (CSP) team is to provide a single point of contact for item-related problems and issues related to all retail and merchant catalogs. As a member of the Amazon Selection and Catalog Systems team, you'll play a key role in driving Amazon's business. You will be responsible for monitoring the data flow as well as meeting ticket SLAs and driving root-cause resolution of defects. The Amazon Selection and Catalog Systems team is responsible for the systems that allow our business units to provide customers with the largest, highest quality, and most up-to-date selection in the world. You will play a key role in supporting our business teams worldwide by providing critical product support, carrying out data research, liaising with technology and other internal teams on workflow improvements, data interpretation and data improvements, and helping provide solutions that drive ongoing improvements to the quality of Amazon's catalogs. This role requires an individual with excellent analytical abilities and business acumen. The successful candidate will be a self-starter, comfortable with ambiguity, have strong attention to detail, and will be comfortable accessing and working with data from multiple sources. The candidate should also have strong communication skills, enabling them to work with key business stakeholders to understand requirements and shape analytical deliverables. The candidate should also have a demonstrated ability to think strategically and analytically about business, product, and technical challenges, with the ability to work cross-organizationally. A keen sense of ownership and drive is a must. The role will work with a diverse set of data and cross-functional teams as well as use data to drive process improvement. An ideal engineer is one who enjoys discovering and solving ambiguous problems, can quickly learn complex systems, and enjoys building actionable insights from data. To meet these challenges we are looking for passionate, talented and super-smart support engineers. We are looking for people who innovate, love solving hard problems and never take no for an answer. Our engineers are top-notch; they work hard, have fun and make history. Key job responsibilities Big Picture: solve problems at their root, stepping back to understand the broader context Proactive: You display energy and initiative in solving problems. You follow all possible avenues to get the job done Adaptable: You undertake a variety of tasks willingly. You switch from complex to routine tasks when required. You adapt quickly to new technologies and products. You work effectively with a variety of personalities and work styles Quality: You demonstrate appropriate quality and thoroughness. Integrity: You act with personal integrity at all times Professional: You work within your team's process. You confront problems (even when outside your own domain), propose solutions, take ownership through to resolution or ensure a clear hand-off.
You have a positive, can-do approach to work. Migrate the metadata and business rules from existing manual templates into the Unified Platform to provide new listing experiences to internal customers. Analyze and fix inconsistencies in existing metadata and business rules. Use problem-solving and analytical skills to solve business problems and drive process improvements. 2+ years of scripting language experience. 2+ years of technical support experience. Experience with AWS, networks and operating systems. Experience programming with at least one modern language such as Java, C++, or C#, including object-oriented design. Experience documenting technical customer issues. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
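As a purely illustrative sketch of the kind of template-to-platform scripting described above, the Python below normalizes metadata rows from a manual CSV template into JSON records and flags inconsistent rows. The file name and column names are hypothetical, not an Amazon-internal schema.

```python
# Illustrative sketch only: normalize metadata rows from a manual template (CSV)
# into JSON records and flag inconsistencies. Field names are hypothetical.
import csv
import json

REQUIRED = {"attribute", "rule", "marketplace"}

def load_rules(path: str):
    rules, problems = [], []
    with open(path, newline="", encoding="utf-8") as fh:
        for i, row in enumerate(csv.DictReader(fh), start=2):
            # A row is inconsistent if any required field is absent or blank.
            missing = REQUIRED - {k for k, v in row.items() if v and v.strip()}
            if missing:
                problems.append(f"row {i}: missing {sorted(missing)}")
                continue
            rules.append({k: v.strip() for k, v in row.items()})
    return rules, problems

if __name__ == "__main__":
    rules, problems = load_rules("catalog_rules_template.csv")  # hypothetical file
    print(json.dumps(rules[:3], indent=2))
    print("\n".join(problems))
```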
Posted 2 weeks ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Note: If shortlisted, we'll contact you via WhatsApp and email. Please monitor both and respond promptly. This role is located in Hyderabad. Candidates willing to relocate are welcome to apply. Location: Hyderabad Work Mode: Work From Office Salary: ₹13,00,000 – ₹22,00,000 INR Joining Time / Notice Period: Immediate – 30 Days About The Client – A top-tier tech consulting firm specializing in data engineering, AI, and automation. With deep expertise in digital transformation and cloud solutions, the company helps businesses make smarter, data-driven decisions and optimize operations. Job Purpose Seeking an experienced and detail-oriented Data Engineer to join a growing data engineering team. This role involves building and optimizing scalable ELT pipelines using Snowflake and dbt, working on cloud data architecture, and collaborating with analysts, architects, and other engineers to deliver validated, business-ready datasets. Key Responsibilities Build and maintain ELT pipelines using dbt on Snowflake Migrate and optimize SAP Data Services (SAP DS) jobs to cloud-native platforms Design and manage layered data architectures (staging, intermediate, mart) Apply performance tuning techniques like clustering, partitioning, and query optimization Use orchestration tools such as dbt Cloud, Airflow, or Control-M Develop modular SQL, write tests, and follow Git-based CI/CD workflows Collaborate with data analysts/scientists to gather requirements and document solutions Contribute to knowledge sharing through reusable dbt components and Agile ceremonies Must-Have Skills 3–10 years of Data Engineering experience Strong hands-on experience with Snowflake, dbt, SQL, and Azure Data Lake Basic proficiency in Python for scripting and automation Experience with SAP DS for legacy system integration Understanding of data modeling (preferably dimensional/Kimball) Familiarity with RBAC, GDPR, and data privacy best practices Git-based version control and CI/CD exposure
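dbt itself is driven by SQL and YAML, so as a minimal sketch of the Python scripting side of such a pipeline, the snippet below compares row counts between a staging table and the mart built from it on Snowflake, using snowflake-connector-python. Account, credentials, and table names are placeholders.

```python
# Minimal sketch (placeholder credentials and table names): compare row counts
# between two Snowflake tables after a pipeline run.
import snowflake.connector

def row_count(cur, table: str) -> int:
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="PUBLIC",
)
try:
    cur = conn.cursor()
    staged = row_count(cur, "STAGING.ORDERS")
    marted = row_count(cur, "MART.FCT_ORDERS")
    # A mart built from staging should not silently drop rows.
    print(f"staging={staged} mart={marted} delta={staged - marted}")
finally:
    conn.close()
```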
Posted 2 weeks ago
5.0 years
0 Lacs
Calcutta
On-site
Title - Microsoft 365 Cloud Engineer Exp - 5+ years These Are Your Responsibilities Actively contribute to the design and implementation of modular, cloud-based systems Support development teams in individual projects within the company environment and for external clients Configure and maintain cloud solutions in accordance with best practices and security guidelines Identify, analyze, and resolve infrastructure vulnerabilities and application deployment issues, and provide improvement suggestions Interact with clients, provide cloud support, and offer recommendations based on customer needs Cloud Engineers are IT professionals who design, implement, and manage cloud-based systems for businesses. They develop and deploy cloud applications and migrate existing on-premise applications to the cloud. What You Bring Experience and skills in the following areas (Microsoft 365 experience is a plus): Exchange Online, SharePoint Online, OneDrive for Business, Microsoft Teams, Microsoft Power Platform Strong quality awareness, analytical and solution-oriented working style Passion for new technologies and a good sense for usability, user experience, and innovative solutions Good communication and collaboration skills Team spirit, resilience, self-initiative, and a strong interest in continuous learning Requirement: Microsoft 365 Certified: Enterprise Administrator Expert Project experience: in particular, we are looking for the following: In-depth knowledge of Microsoft 365 services (Exchange Online, SharePoint, Teams, OneDrive) Experience with Entra ID for identity and access management PowerShell scripting for automating administrative tasks Network and cloud knowledge, especially regarding hybrid environments Understanding of security and compliance policies in Microsoft 365 Experience: Administration and configuration of Microsoft 365 environments Troubleshooting and support for end users and IT teams Implementation of security measures such as multi-factor authentication (MFA) and threat protection Experience with migrations from on-premises systems (e.g., SharePoint Server, File Share) to the cloud or tenant-to-tenant (T2T) Collaboration with other IT teams to optimize infrastructure Soft Skills: Strong communication skills to explain technical concepts clearly Analytical thinking to solve problems efficiently Proactive willingness to learn
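The posting calls for scripting against Microsoft 365; day-to-day that is usually PowerShell, but as a hedged, language-neutral sketch the snippet below lists users through the Microsoft Graph REST API in Python. It assumes an OAuth 2.0 access token has already been acquired (for example via MSAL); the token value is a placeholder.

```python
# Sketch only: enumerate users via Microsoft Graph, assuming an access token
# has already been obtained (e.g. via MSAL). The token here is a placeholder.
import requests

ACCESS_TOKEN = "<token acquired via MSAL>"  # placeholder

def list_users(top: int = 10):
    resp = requests.get(
        "https://graph.microsoft.com/v1.0/users",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"$top": top, "$select": "displayName,userPrincipalName"},
        timeout=15,
    )
    resp.raise_for_status()
    for user in resp.json().get("value", []):
        print(user["displayName"], "-", user["userPrincipalName"])

if __name__ == "__main__":
    list_users()
```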
Posted 2 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
Remote
Job title: Mainframe Developer Location: Pune or Gurugram Job type: Full time We are open to both locations, Pune and Gurugram (GGN). Employees relocating to Pune or Gurugram (GGN) from other cities/states are granted a two-month transition period (i.e., remote for 2 months) to ensure a seamless move. We are looking for a mainframe developer with experience migrating Mainframe DB2 or IMS to databases like Oracle/PostgreSQL/MySQL/Cassandra etc. Export customer data from legacy databases such as Db2, IMS, ADABAS, or IDMS Migrate legacy database files to newly configured database solutions on PostgreSQL on Linux and Windows environments in the cloud
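As an illustrative sketch of the load step in such a migration (not the full DB2/IMS export, which typically happens on the mainframe side), the Python below bulk-loads a delimited extract into PostgreSQL with COPY via psycopg2. Connection details, file name, and table schema are placeholders.

```python
# Illustrative sketch: bulk-load a delimited extract exported from a legacy
# mainframe database into PostgreSQL with COPY. All names are placeholders.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="target_db",
                        user="migrator", password="***")
try:
    with conn, conn.cursor() as cur:   # commits on success, rolls back on error
        cur.execute("""
            CREATE TABLE IF NOT EXISTS customer (
                cust_id  BIGINT PRIMARY KEY,
                name     TEXT,
                balance  NUMERIC(12, 2)
            )
        """)
        with open("customer_extract.csv", encoding="utf-8") as fh:
            cur.copy_expert(
                "COPY customer (cust_id, name, balance) "
                "FROM STDIN WITH (FORMAT csv, HEADER true)",
                fh,
            )
finally:
    conn.close()
```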
Posted 3 weeks ago
0 years
0 Lacs
Greater Bengaluru Area
On-site
Job Description: We are looking for a Data Scientist with expertise in Python, Azure Cloud, NLP, Forecasting, and large-scale data processing. The role involves enhancing existing ML models, optimising embeddings, LDA models, RAG architectures, and forecasting models, and migrating data pipelines to Azure Databricks for scalability and efficiency. Key Responsibilities: Model Development & Optimisation Train and optimise models for new data providers, ensuring seamless integration. Enhance models for dynamic input handling. Improve LDA model performance to handle a higher number of clusters efficiently. Optimise RAG (Retrieval-Augmented Generation) architecture to enhance recommendation accuracy for large datasets. Upgrade Retrieval QA architecture for improved chatbot performance on large datasets. Forecasting & Time Series Modelling Develop and optimise forecasting models for marketing, demand prediction, and trend analysis. Implement time series models (e.g., ARIMA, Prophet, LSTMs) to improve business decision-making. Integrate NLP-based forecasting, leveraging customer sentiment and external data sources (e.g., news, social media). Data Pipeline & Cloud Migration Migrate the existing pipeline from Azure Synapse to Azure Databricks and retrain models accordingly - Note: this is required only for the AUB role(s) Address space and time complexity issues in embedding storage and retrieval on Azure Blob Storage. Optimise embedding storage and retrieval in Azure Blob Storage for better efficiency. MLOps & Deployment Implement MLOps best practices for model deployment on Azure ML, Azure Kubernetes Service (AKS), and Azure Functions. Automate model training, inference pipelines, and API deployments using Azure services. Experience: Experience in Data Science, Machine Learning, Deep Learning and Gen AI. Design, architect and execute end-to-end data science pipelines, including data extraction, data preprocessing, feature engineering, model building, tuning and deployment. Experience in leading a team and being responsible for project delivery. Experience in building end-to-end machine learning pipelines, with expertise in developing CI/CD pipelines using Azure Synapse pipelines, Databricks, Google Vertex AI and AWS. Experience in developing advanced natural language processing (NLP) systems, specializing in building RAG (Retrieval-Augmented Generation) models using Langchain, and deploying RAG models to production. Expertise in building machine learning pipelines and deploying various models such as forecasting models, anomaly detection models, market mix models, classification models, regression models and clustering techniques. Maintaining GitHub repositories and cloud computing resources for effective and efficient version control, development, testing and production. Developing proof-of-concept solutions and assisting in rolling these out to our clients. Required Skills & Qualifications: Hands-on experience with Azure Databricks, Azure ML, Azure Synapse, Azure Blob Storage, and Azure Kubernetes Service (AKS). Experience with forecasting models, time series analysis, and predictive analytics. Proficiency in Python (NumPy, Pandas, TensorFlow, PyTorch, Statsmodels, Scikit-learn, Hugging Face, FAISS). Experience with model deployment, API optimisation, and serverless architectures. Hands-on experience with Docker, Kubernetes, and MLflow for tracking and scaling ML models. Expertise in optimising time complexity, memory efficiency, and scalability of ML models in a cloud environment.
Experience with LangChain or equivalent, RAG, and multi-agentic generation. Location: DGS India - Bengaluru - Manyata N1 Block Brand: Merkle Time Type: Full time Contract Type: Permanent
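As a minimal sketch of the time-series side of the role, the snippet below fits an ARIMA model with statsmodels on a synthetic monthly series and forecasts six steps ahead. The data is generated here purely for illustration; it is not a client dataset, and model order would normally be selected by diagnostics.

```python
# Minimal sketch: fit an ARIMA model on a synthetic monthly series and forecast.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
idx = pd.date_range("2022-01-31", periods=36, freq="M")
# Synthetic series: linear trend plus noise.
y = pd.Series(100 + np.arange(36) * 2.0 + rng.normal(0, 3, 36), index=idx)

model = ARIMA(y, order=(1, 1, 1)).fit()
print(model.forecast(steps=6))  # six months ahead
```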
Posted 3 weeks ago
0.0 - 2.0 years
0 Lacs
Rohini, Delhi, Delhi
On-site
We are looking for a highly skilled Shopify & WordPress Developer to manage, maintain, and enhance both our Shopify and WordPress websites. This role involves development, performance optimization, troubleshooting, and close collaboration with marketing and design teams to deliver seamless user experiences. Responsibilities: - Convert Figma designs into responsive web pages using WordPress (Elementor, HTML, CSS, JS) and/or Shopify (Liquid, HTML, CSS, JS). - Customize themes, plugins, and apps for WordPress/Shopify. - Integrate third-party APIs and resolve conflicts. - Test features for functionality and design accuracy. - Migrate and manage multiple websites. - Implement security measures and provide ongoing support. - Collaborate with team members and meet deadlines. Qualifications: - 2 years of experience with WordPress and/or Shopify development. - Strong knowledge of HTML, CSS, JavaScript, AJAX, jQuery. - Experience with Elementor and Liquid (Shopify). - Proficiency in PHP and database management (MySQL/Shopify equivalents). - Familiarity with RESTful APIs and GraphQL. - Ability to translate Figma designs into responsive, pixel-perfect websites. - Experience working on international projects. Why Us?: - Global Exposure: Work with clients worldwide. - Creative Freedom: Implement innovative ideas. - Learning & Development: Access workshops and certifications. - Collaborative Culture: A creative, supportive team environment. Job Types: Full-time, Permanent Pay: ₹25,000.00 - ₹35,000.00 per month Benefits: Leave encashment Schedule: Day shift Experience: shopify & wordpress: 2 years (Required) Location: Rohini, Delhi, Delhi (Preferred) Work Location: In person
Posted 3 weeks ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Gruve Gruve is an innovative software services startup dedicated to transforming enterprises into AI powerhouses. We specialize in cybersecurity, customer experience, cloud infrastructure, and advanced technologies such as Large Language Models (LLMs). Our mission is to assist our customers in their business strategies by utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks. Position Summary We are seeking a highly skilled Cloud Administrator with expertise in cloud services, particularly AWS and Azure, to join our growing team. This role involves managing and optimizing cloud infrastructure, ensuring secure cloud operations, and providing solutions to common vulnerabilities flagged by the security team. The ideal candidate will have extensive experience in cloud migration, IAM management, and a solid understanding of both AWS and Azure services. In addition, familiarity with cloud monitoring tools and SAP will be advantageous. Key Responsibilities Cloud Infrastructure Management: Administer and manage AWS cloud services, including IAM, EC2, EBS, EFS, Lambda, CloudWatch, NLB, Log Insights, and Amazon Kinesis. Implement cloud solutions to migrate workloads from on-premises to the cloud. Design, deploy, and maintain scalable and reliable cloud environments for various business applications. Security And Compliance Work closely with the security team to identify and address common vulnerabilities and threats within the cloud environment. Ensure compliance with industry standards and best practices for cloud security. Cloud Solutions Design and Implementation: Take a holistic approach to business problems, developing end-to-end solutions that include design, procurement, implementation, and ongoing operations. Provide technical leadership and guidance on cloud adoption and migration strategies. Azure Management Administer Azure subscriptions, Azure AD, and Azure Monitor for cloud operations and security management. Leverage Azure Sentinel for security information and event management (SIEM). Cloud Monitoring and Optimization: Utilize cloud monitoring tools such as Datadog or Dynatrace to optimize performance and ensure high availability. Troubleshoot cloud-based applications, services, and infrastructure to maintain optimal uptime and performance. Collaboration Work with cross-functional teams, including developers, system engineers, and architects, to ensure smooth cloud operations and successful cloud migrations. SAP Integration (Optional): Provide basic support or integration services for SAP in the cloud environment. Required Skills & Qualifications Bachelor's degree in Computer Science, Information Technology, or related field (or equivalent work experience). Minimum of 4 years of experience in cloud administration and cloud migration, with hands-on experience in AWS services such as IAM, EC2, EBS, EFS, Lambda, and CloudWatch. Strong experience and a solid understanding of IT infrastructure fundamentals, with expertise in both Linux and Windows systems, as well as hands-on cloud experience. Solid experience with Azure cloud services, including Azure Sentinel, Azure AD, and Azure Monitor. Experience with cloud security practices, vulnerability management, and incident response. Strong understanding of cloud architecture and best practices for security and scalability. Familiarity with cloud monitoring tools like Datadog, Dynatrace, or similar.
Basic knowledge of SAP is a plus. Problem-solving and troubleshooting skills, with the ability to look at business problems holistically and provide effective solutions. Excellent communication and collaboration skills. Certifications (Preferred but not required): AWS Certified Solutions Architect – Associate or Professional Microsoft Certified: Azure Solutions Architect Expert Certified Kubernetes Administrator (CKA) or similar. Preferred Qualifications Familiarity with CloudFormation, Terraform, or other Infrastructure as Code (IaC) tools. Strong experience in cloud cost optimization strategies. Experience with hybrid cloud and multi-cloud environments. Why Gruve At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you're passionate about technology and eager to make an impact, we'd love to hear from you. Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.
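As a small sketch of the routine AWS administration described above, the snippet below lists running EC2 instances and flags any without a Name tag, using boto3. It assumes AWS credentials are already configured; the region is a placeholder.

```python
# Sketch under assumed AWS credentials: list running EC2 instances and flag
# those missing a Name tag - a typical cloud hygiene check.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")  # placeholder region
paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
):
    for reservation in page["Reservations"]:
        for inst in reservation["Instances"]:
            tags = {t["Key"]: t["Value"] for t in inst.get("Tags", [])}
            name = tags.get("Name", "<untagged>")
            print(inst["InstanceId"], name)
```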
Posted 3 weeks ago
0 years
0 Lacs
India
On-site
Reporting to the Implementation Team Lead, the Implementation / Configuration Consultant will work closely with our implementation team to provide an accurate system build as per customer specifications. You will be responsible for configuring the system and contributing to the successful implementation of our SaaS systems post sales as well as working on integration with other Cloud/SaaS technologies, as required. Responsibilities The core role of the Implementation Configuration Consultant includes but is not limited to: Ensure that a plan is in place with each Implementation Consultant to schedule work based on priority. Provide clear communication of build status, issues, risks and effort. Understand customer use cases and requirements, applying best practice. Carry out technical deployment and configuration tasks. Implement inbound and outbound data interfaces, as required. Migrate client data into the application database with high quality and integrity. Set up single sign-on with client domains. Perform load testing of client application instances with representative data. Assist with troubleshooting customer issues during implementation. Assist with mobile deployment. Undertake duties as requested by the implementation team. Skills Technology savvy, with the ability to deliver exceptional customer service. Excellent written and verbal communication skills. Numerate with an attention to detail. Excellent call handling skills. Ability to quickly build rapport with customers. Ability and awareness to work as part of a team. Willingness and flexibility to support the demands of the business. Methodical approach to problem-solving and the ability to make complex situations sound simple. Work well under pressure and on your own initiative without day-to-day supervision. Organised and efficient. Experience Familiarity with implementing SaaS solutions is advantageous. Proven experience in a client-facing environment. Project Management experience. Experience of business application deployment cycles. Familiarity with mobile platforms, including iOS, Android, WebOS is advantageous.
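For the "migrate client data into the application database" task above, the snippet below is an illustrative sketch: it loads a client CSV export with basic integrity checks before writing it to a database via pandas and SQLAlchemy. The connection string, file name, table name, and column names are all placeholders.

```python
# Illustrative sketch: load client data from a CSV export into an application
# database with basic integrity checks. All names are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://app:***@localhost/appdb")

df = pd.read_csv("client_employees.csv")
# Quality gates before migration: required columns present, no duplicate keys.
assert {"employee_id", "email"}.issubset(df.columns), "missing required columns"
assert not df["employee_id"].duplicated().any(), "duplicate employee_id values"

df.to_sql("employees", engine, if_exists="append", index=False)
print(f"loaded {len(df)} rows")
```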
Posted 3 weeks ago
0 years
0 Lacs
India
On-site
In this role, you will work closely with clients to provide an excellent implementation experience. You will be responsible for ensuring the successful implementation of SaaS systems post-sales as well as advising on integration with other Cloud/SaaS technologies. Responsibilities Manage a number of client implementations concurrently, including managing the project plan and project resources. Ensure that a plan is in place for each engagement for deployment, change and adoption management. Clear communication of implementation status, issues, risks and effort. Understand customer use cases and requirements, applying best practices. Carry out technical deployment and configuration tasks. Implement inbound and outbound data interfaces. Migrate client data into the application database with high quality and integrity. Set up single sign-on with client domains. Perform load testing of client application instances with representative data. Troubleshoot customer issues during implementations. Assist with mobile deployment. Undertake duties as requested by the Customer Success Manager. Skills & Knowledge Working knowledge of the SaaS environment. Attention to detail. Good written and verbal communication. Experience Familiarity with implementing SaaS solutions is advantageous. Proven experience in a client-facing environment. Project Management experience is desirable. Experience of business application deployment cycles. Familiarity with mobile platforms, including iOS, Android, WebOS is advantageous. Accounting experience is beneficial but not essential.
Posted 3 weeks ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
SAP SD Consultant Role & responsibilities / Preferred candidate profile: - 5+ years of experience in S/4HANA; implementation exposure is a must, along with support experience. Other Details: Experience: 5+ years Location: Gurgaon Desirable: Experience in implementation and support on S/4HANA; hands-on experience in SAP SD on S/4HANA Work Mode: Work From Office Immediate Joining (30 days Notice Period Preferred) Company Description Infocus Technologies Pvt Ltd is a Kolkata-based consulting company that provides SAP, ERP & Cloud consulting services. The company is an ISO 9001:2015 DNV certified, CMMI Level 3 certified company, and a Gold Partner of SAP in Eastern India. Infocus helps customers to migrate and host SAP infrastructure on AWS cloud. Its services in the ERP domain include implementation, version upgrades, and Enterprise Application Integration (EAI) solutions.
Posted 3 weeks ago
7.0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
Work location can also be Chennai, Gurugram, Hyderabad, Indore, Kolkata, Mumbai, Noida, or Pune (India). Project Description: We have an ambitious goal to migrate a legacy system written in HLASM (High-Level Assembler) from the mainframe to a cloud-based Java environment for one of the largest banks in the USA. Responsibilities: - Mandatory work from DXC office 5 days per week - Ensure successful delivery of projects - Achieve financial targets - Manage stakeholders' expectations - Drive presales & business development activities - People management activities - Project manager activities: 1. Pre-sale 2. Contracting 3. Project start 4. Project Planning and Execution 5. Team Management 6. Self-management Mandatory Skills Description: - 7+ years of professional experience in software development projects and management - 5+ years of experience in large scale Project/Program management - Proven track record in complex projects with global teams - People management experience for 20+ FTEs - Good understanding of the program financials and reporting - Good understanding of project manager areas of responsibility: - Project Planning - Backlog prioritization, detailing and decomposition - Understanding of SDLC and ability to build/optimize project processes - Communication with the team and client - Understanding of motivation factors - Team Management - Result-oriented - Working with feedback: providing and receiving - Verbal and written business communication skills - Presentation skills - Negotiation skills - Master's Degree in computer science or similar education Nice-to-Have Skills Description: - Business domain experience in Banking, Healthcare, Travel or Retail industries - Scrum, Kanban
Posted 3 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Develop Business Objects reports and dashboard solutions for a wide variety of business intelligence projects and write functional and technical specs Solid technical background and experience in reporting, Business Intelligence applications Understand and implement solutions in support of physical data models necessary to support business intelligence reporting initiatives Design, develop, test, and support SQL stored procedures, functions Communicate directly with other BI team members to confirm requirements and clarify business rules Define and design the universe, BI Dashboards and reports Develop, support and improve existing BO systems including both Universe and Reports Responsible for documenting modifications of pre-existing development and new development for peer reference and knowledge transfer At least basic Linux/Weblogic skills BO Java SDK beneficial but not essential (in case the developer needs to migrate the workflow tool) Strong experience in developing Business Objects reports and dashboard solutions Design, develop, test, and support SQL stored procedures, functions Define and design the universe, BI Dashboards and reports
Posted 3 weeks ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Hi, Exp: 5-8 Years Job Description: 1. Design and implement migration strategies for moving on-premises Multitenant Oracle databases to Oracle Exadata Cloud at Customer (ExaCC). 2. Plan and execute the migration of on-premises Oracle databases to Exadata on OCI. 3. Design and implement migration strategies to minimize downtime and ensure data integrity, using approaches such as Data Guard, Data Pump, and RMAN restore. 4. Assess existing on-premise infrastructure and applications to determine the best migration approach. 5. Utilize Oracle Cloud VMware Solution to migrate virtual machines (VMs) to OCI. 6. Perform on-premises data migration to ExaCC, ensuring minimal downtime and data integrity. 7. Evaluate and utilize various database migration services available in OCI. 8. Collaborate with cross-functional teams to ensure seamless migration and integration with minimal disruption to business operations. 9. Develop and maintain migration documentation, including architecture diagrams, migration plans, and runbooks covering timelines, milestones, and resource allocation. 10. Provide technical guidance and support throughout the migration process. 11. Conduct thorough risk assessments and develop mitigation strategies to ensure successful migration. 12. Ensure compliance with organizational security and data governance policies. 13. Implement CMAN to manage and monitor database migrations. Mandatory tech stack: · Oracle OCI · Oracle DBA · Data Migration · ExaCC (good to have) If interested, please share your profile with me at deepika.eaga@quesscorp.com
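As a rough sketch of how the Data Pump step in such a migration is often scripted, the Python below wraps an expdp schema export in a small driver via subprocess. Credentials, the connect string, the schema, and the directory object name are placeholders, and real runs would add parallel, logging, and error-handling options as appropriate.

```python
# Sketch only: drive an Oracle Data Pump schema export (expdp) from Python.
# Connect string, schema, and directory object are placeholders.
import subprocess

def export_schema(schema: str, dump_dir: str = "DATA_PUMP_DIR") -> None:
    cmd = [
        "expdp", "system/***@SRCDB",
        f"schemas={schema}",
        f"directory={dump_dir}",
        f"dumpfile={schema.lower()}_%U.dmp",
        f"logfile={schema.lower()}_exp.log",
    ]
    subprocess.run(cmd, check=True)  # raises if expdp exits non-zero

if __name__ == "__main__":
    export_schema("HR")
```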
Posted 3 weeks ago
0.0 - 11.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
About the Role: Grade Level (for internal use): 11 What’s in it for you: Build a career with a global company Exposure to work on the world’s leading ETF & Benchmarking solutions Good work-life balance Roles and Responsibilities: Work with a fantastic group of people in a supportive environment where training, learning and growth are embraced Design, develop, and maintain cloud-based applications using .NET technologies Migrate legacy components to modern cloud architecture, ensuring scalability, reliability, and security. Implement and manage AWS cloud native implementations Enhance application security measures and implement best practices to safeguard sensitive data. Collaborate with cross-functional teams to gather requirements, design solutions, and deliver high-quality software on time. Take ownership of projects, from concept to delivery, ensuring adherence to project timelines and objectives. Stay updated on industry trends and advancements in cloud computing, .NET frameworks, and related technologies. Translate financial requirements into technical solutions, demonstrating a strong understanding of financial terms and processes. Work as part of an agile team to identify and deliver solutions to prioritized requirements As a self-driven professional, enhance the security of the applications & platform. Familiar with various design and architectural patterns as per relevant experience. Must demonstrate strong expertise in system design, architectural patterns, and building efficient, scalable systems. Required Qualifications & Experience: Bachelor's degree in Computer Science, Engineering, or related field. 7 to 11 years of experience in software development with a focus on .NET technologies. Proficiency in .NET 6+, SQL Server, PostgreSQL, JavaScript, GIT and Angular. Sound hands on experience on AWS Cloud Technologies: S3, Lambda, SNS, SQS, RDS, Step functions and similar Strong understanding of agile methodologies and experience working in agile environments. Excellent problem-solving skills and the ability to work independently or as part of a team. Exceptional communication skills, with the ability to articulate technical concepts to non-technical stakeholders. Proven track record of delivering high-quality software solutions within deadlines. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence . What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. 
Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. 
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316259 Posted On: 2025-05-29 Location: Noida, Uttar Pradesh, India
Posted 3 weeks ago
40.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description What You'll Do: Come and join us! Analyze functional and operational issues reported by the customer with their Oracle EPM/Hyperion environment and provide the solution or workaround on time. Perform root cause analyses and recommend meaningful updates to the following functional and operational items: dimensionality and hierarchies, business rules code, metadata properties, data forms and reports, data management processes and others as needed. Provide input to help guide the development of customer solutions (pre-implementation), as well as address customer questions and concerns regarding the functionality of their Oracle EPM/Hyperion environments (post-implementation, ongoing maintenance) Collaborate with offshore team members and business customers globally. Customer Management Ability to understand customer urgency and sensitivity of the problem. Strong verbal and written communication skills Ability to speak confidently and communicate clearly with the customer. Strong adherence to process; be a process champion. Ability to work well in a demanding customer environment and delight customers. Career Level - IC3 Responsibilities Skills Required Experience with the functional and operational aspects of the Oracle EPM Cloud (SaaS) suite products: application design, development of various application artifacts such as forms and rules, testing, troubleshooting (working in conjunction with Oracle Support as needed), pre- and post-implementation activities across the EPM product suite (Cloud) Experience integrating EPM with other systems using Data Management, adaptors, etc. Experience in interacting with business users to analyze the business process and discovering requirements Extensive hands-on experience in at least two (and preferably more) of the following modules: EPBCS PCMCS FCCS TRCS ARCS EDMCS Narrative Reporting Data Management Ability to maintain optimum availability of EPM environments Ability to create, monitor, and migrate product-related application artifacts proactively Deep functional knowledge around financial systems and processes Strong problem-solving skills (from an Applications/Functional/Operational perspective) with the ability to exercise mature judgment Knowledge of EPM Automate; scripting a plus (Batch, VBScript, Python, and/or PowerShell) Ability to work in a fast-paced environment and master unfamiliar concepts quickly with little supervision. Diversity & Inclusion: An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work life in. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. In order to nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that inspires thought leadership and innovation. Oracle offers a highly competitive suite of Employee Benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as Medical, Life Insurance, access to Retirement Planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business. At Oracle, we believe that innovation starts with diversity and inclusion and to create the future we need talent from various backgrounds, perspectives, and abilities.
We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and in potential roles to perform crucial job functions. That's why we're committed to creating a workforce where all individuals can do their best work. It's when everyone's voice is heard and valued that we're inspired to go beyond what's been done before. Qualifications Career Level - IC3 About Us As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
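The posting above mentions EPM Automate scripting; as a hedged sketch, the Python below wraps a snapshot export in subprocess calls to the epmautomate CLI. It assumes the CLI is installed and that the login, exportsnapshot, downloadfile, and logout verbs behave as documented; the service URL, credentials, and snapshot name are placeholders.

```python
# Hedged sketch: export and download the application snapshot with the
# epmautomate CLI. URL, credentials, and snapshot name are placeholders.
import subprocess

def epm(*args: str) -> None:
    subprocess.run(["epmautomate", *args], check=True)

if __name__ == "__main__":
    epm("login", "svc_user", "***", "https://myepm.example.oraclecloud.com")
    try:
        epm("exportsnapshot", "Artifact Snapshot")
        epm("downloadfile", "Artifact Snapshot")
    finally:
        epm("logout")
```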
Posted 3 weeks ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your role involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data solutions for data generation, collection, and processing.
- Create data pipelines to ensure efficient data flow.
- Implement ETL processes for data migration and deployment (see the sketch below).
- Collaborate with team members to optimize data solutions.
- Conduct data quality assessments and implement improvements.

Professional & Technical Skills:
- Must-have skills: proficiency in Google BigQuery.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms.
- Solid grasp of data munging techniques for data cleaning and transformation.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- A 15 years full-time education is required.
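As a rough illustration of the ETL responsibilities above (not part of the posting itself), the following sketch uses the google-cloud-bigquery Python client to load CSV files from Cloud Storage into a staging table and then run a SQL transform into a reporting table. The project, dataset, bucket, and column names are hypothetical.

```python
from google.cloud import bigquery

# Hypothetical project, dataset, and bucket names; replace with your own.
PROJECT = "my-analytics-project"
TABLE_ID = f"{PROJECT}.staging.daily_sales"
SOURCE_URI = "gs://my-landing-bucket/sales/2024-04-01/*.csv"

client = bigquery.Client(project=PROJECT)

# Load raw CSV files from Cloud Storage into a staging table (extract/load step).
load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=load_config).result()

# Transform inside BigQuery with SQL and write the result to a reporting table.
transform_sql = f"""
CREATE OR REPLACE TABLE `{PROJECT}.reporting.daily_sales_summary` AS
SELECT store_id, DATE(sold_at) AS sale_date, SUM(amount) AS total_amount
FROM `{TABLE_ID}`
GROUP BY store_id, sale_date
"""
client.query(transform_sql).result()
print("Pipeline run complete.")
```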
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Summary

Position Summary
Oracle Analytics Cloud (OAC) Consultant

As an OAC Consultant, you will work with technical teams and projects to deliver cutting-edge reporting solutions on Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to build multi-cloud enterprise analytics platforms around Oracle and to migrate on-premise OBIEE platforms to OAC. You will work closely with clients to focus on outcomes and deliver value, and with Cloud Architects to develop high-performance cloud analytics applications. Our teams have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed.

Work You’ll Do
As an OAC developer you will have multiple responsibilities depending on the project type. One type of project may involve migrating existing applications to Oracle Cloud Infrastructure; another may involve building reporting solutions both on-premise and on Oracle Cloud. The key responsibilities may involve some or all of the areas listed below:

Engage with clients to
- understand business requirements, document user stories and focus on user experience
- build proofs of concept to showcase the value of Oracle Analytics vs. other platforms
- socialize solution design and enable knowledge transfer
- drive train-the-trainer sessions to drive adoption of OAC
- partner with clients to drive outcomes and deliver value

Collaborate with cross-functional teams to
- understand dependencies on source applications
- analyze data sets to understand functional and business context
- understand Data Warehousing data models and integration design
- understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), Project to Complete (PTC)
- communicate development status to key stakeholders

Design, build, test and deploy
- OAC repository (RPD) data models
- OAC classic dashboards and reports with a focus on user experience
- DV workbooks with a focus on exploratory analysis
- BI Publisher data models and reports
- BI Agents
- data and object level security
- a high-performance reporting platform that delivers a seamless user experience

Additionally
- Focus on designing, building and documenting re-usable code artifacts
- Track, report and optimize OAC performance to meet client SLAs (see the query-timing sketch below)
- Migrate OBIEE artifacts from on-premise to OAC and fix issues
- Identify risks and suggest mitigation plans
- Document all processes and procedures in assigned areas of responsibility

Technical Requirements
- Education: B.E./B.Tech/M.C.A./M.Sc (CS)
- Strong knowledge of SQL and PL/SQL, which is foundational to optimizing query performance
- Strong knowledge of RPD development focused on complex cross-functional data models
- Experience working with the OBIA RPD and the FAW data modeler is desirable
- Experience designing complex OAC reports and dashboards and interactive workbooks in DV
- Experience designing data flows to create business-defined data sets
- Experience designing and configuring IDCS groups and application roles for object-level and data-level security in OAC
- Experience with one or more relational Oracle databases (ADW and ExaCS preferred)
- Experience with the ODI tool to back-track integration design
- Prior experience of OBIA implementation, OBIEE-to-OAC migration or OBIEE upgrades is desirable
- Fair understanding of the agile and waterfall development processes
- Excellent understanding of the Oracle Cloud platform and services
- Any Oracle Cloud certification in OAC or ADW, or an Architect certification, is a plus

Consulting Requirements
- 4-8 years of relevant consulting, industry or technology experience
- Proven knowledge of the Oracle Analytics Cloud platform and tools
- Independent, detail-oriented, responsible team player with strong interpersonal and written communication skills
- Ability to work in a fast-paced environment and deal with ambiguity
- Coach and mentor junior resources
- Strong problem-solving and troubleshooting skills
- Willingness to travel in case of project requirements
- Ability to adapt quickly to an existing, complex environment and learn new concepts and software
- Passionate about building high-performance cloud-based applications
- Professionalism and integrity

Preferred
- Experience in Oracle BI Apps (RPD, data model and ERP source design)
- Business process understanding of RTR, PTP, OTC, ATR, and PTC
- Experience with the Fusion Analytics Warehouse (FAW) data modeler, augmentation and security design
- Experience with performance tuning in cloud-based databases such as ADW and ExaCS
- Oracle Cloud Platform Enterprise Analytics certified
- Exposure to one or more of the following: Python, R or UNIX shell
- Experience in Machine Learning

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 303088
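For the performance-tracking and SQL-tuning items above, here is a minimal, hypothetical sketch (not from the posting) that uses the python-oracledb driver to time a typical reporting query directly against ADW, which can help separate database time from OAC rendering time. The credentials, wallet location, DSN alias, and table names are assumptions.

```python
import time

import oracledb  # python-oracledb thin driver

# Hypothetical ADW credentials and wallet; in practice these come from the
# wallet or TLS connect string generated in the OCI console.
conn = oracledb.connect(
    user="analytics_ro",
    password="***",
    dsn="myadw_low",                    # assumed tnsnames.ora alias
    config_dir="/opt/oracle/wallet",
    wallet_location="/opt/oracle/wallet",
    wallet_password="***",
)

# A typical subject-area query an OAC dashboard might push down to ADW.
sql = """
SELECT t.fiscal_quarter, SUM(f.revenue) AS revenue
FROM   sales_fact f JOIN time_dim t ON t.time_key = f.time_key
GROUP  BY t.fiscal_quarter
ORDER  BY t.fiscal_quarter
"""

start = time.perf_counter()
with conn.cursor() as cur:
    cur.execute(sql)
    rows = cur.fetchall()
print(f"{len(rows)} rows in {time.perf_counter() - start:.2f}s")
conn.close()
```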
Posted 3 weeks ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data pipelines.
- Ensure data quality throughout the data lifecycle.
- Implement ETL processes for data migration and deployment.
- Collaborate with cross-functional teams to understand data requirements.
- Optimize data storage and retrieval processes.

Professional & Technical Skills:
- Must-have skills: proficiency in Google BigQuery.
- Strong understanding of data architecture principles.
- Experience with cloud-based data services.
- Knowledge of SQL and database management systems.
- Hands-on experience with data modeling and schema design (see the sketch below).

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Pune office.
- A 15 years full-time education is required.
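As an illustration of the data modeling and schema design skills listed above (not part of the posting), this sketch uses the google-cloud-bigquery client to create a day-partitioned, clustered event table, a common way to optimize storage and retrieval in BigQuery. The project, dataset, table, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

# Explicit schema for an event table; partition by day and cluster on the
# columns most queries filter by, to keep scan costs and latency down.
schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("user_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("event_type", "STRING"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("payload", "STRING"),
]

table = bigquery.Table("my-analytics-project.analytics.events", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["event_type", "user_id"]

table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id}")
```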
Posted 3 weeks ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description

This role is part of the rekindle returnship program. Note: for more details on the rekindle program, please visit https://www.amazon.jobs/en/landing_pages/rekindle

Work hard. Have fun. Make history.

The mission of the Catalog Support and Programs (CSP) team is to provide a single point of contact for item-related problems and issues related to all retail and merchant catalogs. As a member of the Amazon Selection and Catalog Systems team, you’ll play a key role in driving Amazon’s business. You will be responsible for monitoring the data flow, meeting ticket SLAs, and driving root-cause resolution of defects. The Amazon Selection and Catalog Systems team is responsible for the systems that allow our business units to provide customers with the largest, highest-quality, and most up-to-date selection in the world. You will play a key role in supporting our business teams worldwide by providing critical product support, carrying out data research, liaising with technology and other internal teams on workflow improvements, data interpretation and data improvements, and helping provide solutions that drive ongoing improvements to the quality of Amazon’s catalogs.

This role requires an individual with excellent analytical abilities and business acumen. The successful candidate will be a self-starter, comfortable with ambiguity, with strong attention to detail, and comfortable accessing and working with data from multiple sources. The candidate should also have strong communication skills, enabling them to work with key business stakeholders to understand requirements and shape analytical deliverables, and a demonstrated ability to think strategically and analytically about business, product, and technical challenges, with the ability to work cross-organizationally. A keen sense of ownership and drive is a must. The role will work with a diverse set of data and cross-functional teams, and use data to drive process improvement. An ideal engineer is one who enjoys discovering and solving ambiguous problems, can quickly learn complex systems, and enjoys building actionable insights from data. To meet these challenges we are looking for passionate, talented and super-smart support engineers: people who innovate, love solving hard problems and never take no for an answer. Our engineers are top-notch engineers who work hard, have fun and make history.

Key job responsibilities
- Big picture: solve problems at their root, stepping back to understand the broader context
- Proactive: you display energy and initiative in solving problems and follow all possible avenues to get the job done
- Adaptable: you undertake a variety of tasks willingly, switch from complex to routine tasks when required, adapt quickly to new technologies and products, and work effectively with a variety of personalities and work styles
- Quality: you demonstrate appropriate quality and thoroughness
- Integrity: you act with personal integrity at all times
- Professional: you work within your team’s process, confront problems (even when outside your own domain), propose solutions, take ownership through to resolution or ensure a clear hand-off, and bring a positive, can-do approach to work
- Migrate the metadata and business rules from existing manual templates into the Unified Platform to provide new listing experiences to internal customers (a hypothetical scripting sketch follows this posting)
- Analyze and fix inconsistencies in existing metadata and business rules
- Use problem-solving and analytical skills to solve business problems and drive process improvements

Basic Qualifications
- 2+ years of software development or 2+ years of technical support experience
- Bachelor's degree in engineering or equivalent
- Experience troubleshooting and debugging technical systems
- Experience in Unix
- Experience scripting in modern programming languages

Preferred Qualifications
- 2+ years of scripting language experience
- 2+ years of technical support experience
- Experience with AWS, networks and operating systems
- Experience programming with at least one modern language such as Java, C++, or C#, including object-oriented design
- Experience documenting technical customer issues

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI MAA 15 SEZ
Job ID: A2994248
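The Unified Platform and the manual templates mentioned above are internal Amazon systems, so the following is purely a hypothetical sketch of the kind of metadata-validation scripting such work might involve: reading a CSV template of attribute metadata and flagging rows that violate simple business rules before migration. The column names and rules are invented for illustration.

```python
import csv
import sys

REQUIRED_COLUMNS = {"attribute_name", "data_type", "is_mandatory"}   # hypothetical template columns
ALLOWED_TYPES = {"string", "integer", "decimal", "boolean", "date"}  # hypothetical business rule


def validate(path: str) -> list[str]:
    """Return human-readable problems found in one metadata template file."""
    problems = []
    with open(path, newline="", encoding="utf-8") as handle:
        reader = csv.DictReader(handle)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            return [f"missing columns: {sorted(missing)}"]
        for line_no, row in enumerate(reader, start=2):
            if not row["attribute_name"].strip():
                problems.append(f"row {line_no}: empty attribute_name")
            if row["data_type"].strip().lower() not in ALLOWED_TYPES:
                problems.append(f"row {line_no}: unknown data_type '{row['data_type']}'")
    return problems


if __name__ == "__main__":
    issues = validate(sys.argv[1])
    print("\n".join(issues) if issues else "Template looks consistent.")
```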
Posted 3 weeks ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Amgen
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

About The Role

Role Description:
Let’s do this. Let’s change the world. We are looking for a highly motivated, expert Data Engineer who can own the design and development of complex data pipelines, solutions and frameworks, with deep domain knowledge of Manufacturing and/or Process Development and/or Supply Chain in biotech, life sciences or pharma. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role calls for deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Design, develop, and maintain complex ETL/ELT data pipelines in Databricks using PySpark, Scala, and SQL to process large-scale datasets (see the sketch after this posting)
- Understand the biotech/pharma or related domains and build highly efficient data pipelines to migrate and deploy complex data across systems
- Design and implement solutions to enable unified data access, governance, and interoperability across hybrid cloud environments
- Ingest and transform structured and unstructured data from databases (PostgreSQL, MySQL, SQL Server, MongoDB, etc.), APIs, logs, event streams, images, PDFs, and third-party platforms
- Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring
- Apply expertise in data quality, data validation and verification frameworks
- Innovate, explore and implement new tools and technologies to enhance efficient data processing
- Proactively identify and implement opportunities to automate tasks and develop reusable frameworks
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value
- Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories; support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle
- Collaborate and communicate effectively with product teams and other cross-functional teams to understand business requirements and translate them into technical solutions

Must-Have Skills:
- Deep domain knowledge of Manufacturing and/or Process Development and/or Supply Chain in biotech, life sciences or pharma
- Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies
- Proficiency in workflow orchestration and performance tuning for big data processing
- Strong understanding of AWS services
- Ability to quickly learn, adapt and apply new technologies
- Strong problem-solving and analytical skills
- Excellent communication and teamwork skills
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices

Good-to-Have Skills:
- Data engineering experience in the biotechnology or pharma industry
- Experience writing APIs to make data available to consumers
- Experience with SQL/NoSQL databases and vector databases for large language models
- Experience with data modeling and performance tuning for both OLAP and OLTP databases
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps

Education and Professional Certifications
- Master’s degree and 3 to 4+ years of Computer Science, IT or related field experience OR Bachelor’s degree and 5 to 8+ years of Computer Science, IT or related field experience
- AWS Certified Data Engineer preferred
- Databricks certification preferred
- Scaled Agile SAFe certification preferred

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Ability to learn quickly, be organized and detail-oriented
- Strong presentation and public speaking skills

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
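As an illustration of the Databricks/PySpark pipeline work described above (a sketch, not Amgen's implementation), the following reads raw batch records from a landing zone, applies basic cleaning and typing, and writes a curated Delta table for downstream analytics. The paths, column names, and table names are hypothetical, and writing Delta assumes Databricks or the delta-spark package.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists as `spark`; the builder call
# simply keeps this sketch self-contained.
spark = SparkSession.builder.appName("batch-records-curation").getOrCreate()

# Ingest raw CSV batch records from a hypothetical landing zone.
raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("/mnt/landing/manufacturing/batch_records/")
)

# Basic quality checks and typing: drop duplicates, require a batch_id,
# normalize timestamps, and stamp the load date.
clean = (
    raw.dropDuplicates(["batch_id"])
    .filter(F.col("batch_id").isNotNull())
    .withColumn("released_at", F.to_timestamp("released_at"))
    .withColumn("load_date", F.current_date())
)

# Persist as a Delta table so downstream analytics and AI workloads can query it.
(
    clean.write.format("delta")
    .mode("overwrite")
    .saveAsTable("supply_chain.batch_records_curated")
)
```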
Posted 3 weeks ago
The job market for data migration professionals in India is currently thriving, with numerous opportunities available across industries. Whether you are just starting your career or looking to make a job transition, data migration roles can offer a rewarding career path with growth opportunities.
India's major IT hubs are known for their booming technology sectors and have a high demand for data migration professionals.
The average salary range for data migration professionals in India varies by experience level. Entry-level positions can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can command salaries upwards of INR 10-15 lakhs per annum.
A typical career path in the data migration field may involve starting as a Junior Developer, progressing to Senior Developer, and then moving up to a Tech Lead role. With experience and expertise, one can advance further to roles such as Solution Architect or Project Manager.
In addition to data migration skills, professionals in this field are often expected to have knowledge of related areas such as cloud computing, database management, programming languages like Java or Python, and software development methodologies.
As you explore opportunities in the data migration job market in India, remember to showcase your skills and experience confidently during interviews. Prepare thoroughly, stay updated on industry trends, and demonstrate your passion for data migration. Best of luck on your job search journey!