10.0 years
0 Lacs
Pune, Maharashtra
Remote
A Snapshot of Your Day
Siemens Energy's “Digital Products and Solutions” (DPS) team supports all SE Business Areas in growing their digital business! We consult on, design, and develop tailored solutions for the customer market, based on their specific technical and commercial constraints. Samples can be found here: Digital services (siemens-energy.com). Our second mission is to professionalize, automate, and standardize SE software development.
How You’ll Make an Impact
Lead and coordinate all aspects of software development and cloud solutions, ensuring alignment with Siemens Energy technologies and process guidelines. Develop concepts and architecture, review existing solutions, and define modernization proposals. Take responsibility for quality and timely delivery. Provide technical guidance for the agile development team. Create technical specifications and drive the refinement of new features and user stories. Support effort and budget estimations and guide different developer teams. Conduct risk assessments based on considerations for information security, data protection, and related regulations. Perform code reviews and work in a flexible, multi-functional, team-based development environment.
What You Bring
A master’s or bachelor’s degree in computer science or a related field, with 10 years of demonstrated experience in software development and software architecture. A curious and hard-working approach to problem-solving, building, and self-improvement. Accountability for outcomes and a commitment to delivering high-quality results reliably. Deep knowledge of AWS and other cloud platforms, with experience in designing solutions; experience in the energy sector is an advantage. Familiarity with container technologies such as Docker, Podman, and Kubernetes. A proven track record of collaborating with multi-functional product teams and engaging directly with clients. Strong analytical skills, with a data-driven approach and proficiency in statistical analysis and programming languages such as Java, C#, Python, JavaScript/TypeScript, and SQL. A customer-centric, open-minded attitude with a commitment to continuous skill development. Exceptional communication skills, with fluency in English and the ability to operate effectively in a global, multicultural environment. Familiarity with REST, GraphQL, gRPC, SOAP, microservices, multithreading, public cloud (AWS, Azure, Google), containers/orchestration, DDD, BFF, API gateways, load balancers, service registry/discovery, and independent deployment. Knowledge of application/solution architecture principles and reference architectures. Experience with both statically and dynamically typed languages (e.g., Java, Python, C#). Experience with DBMS, both relational and NoSQL, as well as data pipelines and identifying bottlenecks between data storage and the front end. Proficiency in DevOps CI/CD practices, e.g., Azure DevOps, Terraform, etc.
About the Team
In the central Digital Products and Solutions (DPS) organization, our Software Application and Engineering department is responsible for developing software solutions for both internal and external customers. In DPS, our software products already cover a wide range of categories, and we see many opportunities for growth: Asset Performance Management, Energy Management, Asset Monitoring, Asset Health Prediction, Customer Portal & AI-assisted Applications, Connectivity & Edge, Backend Core / Domain / Platform Services, and Professional Services.
Who is Siemens Energy?
At Siemens Energy, we are more than just an energy technology company. We meet the growing energy demand across 90+ countries while ensuring our climate is protected. With ~100,000 dedicated employees, we not only generate electricity for over 16% of the global community, but we’re also using our technology to help protect people and the environment. Our global team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation. Find out how you can make a difference at Siemens Energy: https://www.siemens-energy.com/employeevideo
Our Commitment to Diversity
Lucky for us, we are not all the same. Through diversity, we generate power. We run on inclusion, and our combined creative energy is fueled by over 130 nationalities. Siemens Energy celebrates character – no matter what ethnic background, gender, age, religion, identity, or disability. We energize society, all of society, and we do not discriminate based on our differences.
Rewards/Benefits
Employees are eligible for remote working arrangements up to 2 days per week. All employees are automatically covered under medical insurance, with a company-paid family floater cover for the employee, spouse, and 2 dependent children up to 25 years of age. Siemens Energy provides all employees the option to opt for a Meal Card, per the terms and conditions prescribed in company policy, as a tax-saving measure that forms part of CTC. Flexi Pay empowers employees with the choice to customize the amount in some of the salary components within a defined range, thereby optimizing tax benefits; accordingly, each employee is empowered to decide on the best possible net income out of the same fixed individual base pay on a monthly basis. Reference: https://jobs.siemens-energy.com/jobs
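For a sense of the stack this role names (REST, microservices, BFF, API gateways), here is a minimal sketch of a REST microservice in Python with FastAPI; the Asset model, routes, and in-memory store are invented for illustration, not taken from the posting.

```python
# Illustrative only: a tiny REST microservice in the style the posting
# describes. The Asset resource and its fields are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="asset-service")

class Asset(BaseModel):
    id: int
    name: str
    health_score: float  # hypothetical KPI in [0.0, 1.0]

# In-memory store stands in for a real database layer.
ASSETS: dict[int, Asset] = {}

@app.post("/assets", status_code=201)
def create_asset(asset: Asset):
    ASSETS[asset.id] = asset
    return asset

@app.get("/assets/{asset_id}")
def read_asset(asset_id: int):
    if asset_id not in ASSETS:
        raise HTTPException(status_code=404, detail="asset not found")
    return ASSETS[asset_id]
```

Run with `uvicorn main:app`; in a real deployment a service like this would sit behind the API gateway and load balancer the posting mentions.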
Posted 3 hours ago
3.0 - 5.0 years
0 Lacs
India
Remote
The IT Helpdesk Analyst – L1 Support acts as the initial point of contact for all internal IT support issues across SOLM. This role focuses on providing frontline support to a global user base across multiple time zones and technologies, with a special emphasis on macOS, Windows, and SaaS tools. The analyst will log and manage tickets via Salesforce Service Cloud, provide timely responses, and deliver an excellent end-user experience. The role requires outstanding communication skills, a calm and soft-spoken demeanor, and the ability to liaise effectively with regional stakeholders and internal teams in a professional, culturally sensitive manner.
End-User Support
Serve as the first point of contact for all IT-related queries, incidents, and requests via Salesforce Service Cloud (ticketing system). Support end-users across macOS and Windows 10/11 environments. Provide remote assistance using Zoom, Teams, and other collaboration tools. Assist with access issues, password resets, MFA support, and basic software troubleshooting.
Issue Troubleshooting & Escalation
Troubleshoot hardware and software issues related to: Office 365 (Outlook, Teams, OneDrive); VPN, RDP, and internet/network connectivity; printing, conference room equipment, and endpoint configurations. Escalate unresolved technical issues to L2/L3 support based on standard SLAs. Record all actions, communications, and outcomes in the ticketing system.
Stakeholder Engagement & Communication
Communicate technical solutions clearly to users with varying technical skill levels. Maintain professionalism when working with international colleagues and leadership. Exhibit excellent verbal and written English communication with a courteous and empathetic approach. Collaborate with internal teams and support global operations across the North America, EMEA, LATAM, and APAC regions.
Device and Access Management
Assist with user onboarding/offboarding (Active Directory, Azure AD, O365). Perform basic macOS user configuration and application setup. Coordinate device provisioning, handoffs, and return logistics. Track assets in alignment with company inventory procedures.
Security and Compliance
Guide users on secure password practices and endpoint protection basics. Detect and report unusual behavior or phishing incidents to the security team. Ensure compliance with IT security policies in daily support tasks. Evaluate and recommend new technologies and solutions to improve operations. Drive innovation and continuous improvement within the IT infrastructure. Plan and execute technology upgrades and modernization efforts. Implement backup solutions and ensure data integrity and availability.
Vendor Management and Collaboration
Manage relationships with vendors and service providers. Evaluate and select third-party solutions and services. Coordinate with external partners for support and services. Ensure vendor deliverables meet organizational standards and requirements.
Communication Skills
Effective communication: clearly articulate technical concepts and solutions to both technical and non-technical stakeholders, with a soft-spoken demeanor and the ability to liaise effectively with regional stakeholders and internal teams in a professional, culturally sensitive manner.
Essential
Bachelor's Degree in Computer Science, Information Technology, or equivalent. 3-5 years in an IT Helpdesk or Desktop Support role. Soft-spoken and user-friendly demeanor with an empathetic approach.
Multi-regional collaboration – ability to support and engage across global teams. Problem-solving mindset – proactive, patient, and calm under pressure. Flexibility to work across different time zones.
Technical Qualifications
Exposure to ticketing systems like Salesforce Service Cloud, Jira, or ServiceNow.
Technical Skills
Operating Systems: Windows 10/11, Windows Server, macOS (basic to intermediate level). ITSM Tools: Salesforce Service Cloud (must), Jira, Remedy. Collaboration Tools: Microsoft 365, Teams, OneDrive, Zoom, Slack. User Access: Active Directory, Azure AD, MFA tools. Endpoint Security: Awareness of antivirus, phishing alerts, and endpoint monitoring basics.
Posted 3 hours ago
8.0 years
0 Lacs
Hyderabad, Telangana
Remote
Title: Data Scientist / Data Engineer / Machine Learning Engineer
Location: Hyderabad / Vizag / Remote
Shift: UK Shift – 2:00 PM to 11:00 PM IST
Experience: 8+ years. Immediate joiners preferred.
Job Summary
We are seeking a highly skilled Data Scientist / Data Engineer / Machine Learning Engineer with strong expertise in DataRobot, Azure AI, Azure infrastructure, Python, and machine learning. This role will be responsible for solving complex business problems using advanced statistical modeling, AI/ML algorithms, and big data technologies.
Key Responsibilities
Lead data mining and extraction activities; apply algorithms to derive actionable insights. Understand data profiles and limitations, and apply findings to business solutions. Collaborate with cross-functional teams to drive innovation and improve performance. Explore and analyze data patterns using modern data science tools. Translate business requirements into effective data science deliverables. Develop and implement robust, scalable data products and AI/ML models. Apply best practices in AI/ML projects, ensuring scalability and operational efficiency.
Required Qualifications
8+ years of experience in Data Science, AI, and Machine Learning. Proven experience with DataRobot, Azure AI, and Azure infrastructure. Strong skills in Python and big data technologies. Experience with end-to-end ML deployments and migrations. Proficiency in advanced SQL across multiple syntaxes. Bachelor’s degree in Statistics, Mathematics, Computer Science, or a related field. Strong communication skills and the ability to work independently. Expertise in statistical modeling, model evaluation, tuning, and scalability.
Preferred Skills
Ability to work in fast-paced environments with minimal supervision. Knowledge of software engineering principles. Strong analytical and problem-solving mindset.
Apply Now: send your resume to [email protected] or call us at +91 96003 77933. Job Type: Full-time. Work Location: In person.
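As a rough illustration of the "model evaluation, tuning, and scalability" expertise this posting asks for, here is a minimal scikit-learn sketch on synthetic data; the dataset, features, and parameter grid are invented for the example.

```python
# Illustrative sketch of a tune-then-evaluate loop on synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Hyperparameter tuning with 5-fold cross-validation.
search = GridSearchCV(
    LogisticRegression(max_iter=1_000),
    param_grid={"C": [0.01, 0.1, 1.0, 10.0]},
    scoring="roc_auc",
    cv=5,
)
search.fit(X_train, y_train)

# Final check on held-out data, never touched during tuning.
print("best C:", search.best_params_["C"])
print("test AUC:", roc_auc_score(y_test, search.predict_proba(X_test)[:, 1]))
```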
Posted 3 hours ago
5.0 years
0 Lacs
Thiruvananthapuram, Kerala, India
On-site
Required Qualifications & Skills: 5+ years in DevOps, SRE, or Infrastructure Engineering. Strong expertise in Azure Cloud & Infrastructure-as-Code (Terraform, CloudFormation). Proficient in Docker & Kubernetes. Hands-on with CI/CD tools & scripting (Bash, Python, or Go). Strong knowledge of Linux, networking, and security best practices. Experience with monitoring & logging tools (ELK, Prometheus, Grafana). Familiarity with GitOps, Helm charts & automation. Key Responsibilities: Design & manage CI/CD pipelines (Jenkins, GitLab CI/CD, GitHub Actions). Automate infrastructure provisioning (Terraform, Ansible, Pulumi). Monitor & optimize cloud environments. Implement containerization & orchestration (Docker, Kubernetes - EKS/GKE/AKS). Maintain logging, monitoring & alerting (ELK, Prometheus, Grafana, Datadog). Ensure system security, availability & performance tuning. Manage secrets & credentials (Vault, Secrets Manager). Troubleshoot infrastructure & deployment issues. Implement blue-green & canary deployments. Collaborate with developers to enhance system reliability & productivity. Preferred Skills: Certification: Azure DevOps Engineer. Experience with multi-cloud, microservices, event-driven systems. Exposure to AI/ML pipelines & data engineering workflows.
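On the monitoring side this role covers (Prometheus, Grafana, alerting), a minimal sketch of instrumenting a Python service with the prometheus_client library follows; the metric names and simulated workload are illustrative only.

```python
# Minimal sketch: exposing custom metrics for Prometheus to scrape.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("app_requests_total", "Total requests handled")
LATENCY = Histogram("app_request_latency_seconds", "Request latency")

def handle_request() -> None:
    with LATENCY.time():               # observe duration into the histogram
        time.sleep(random.uniform(0.01, 0.1))  # simulated work
    REQUESTS.inc()

if __name__ == "__main__":
    start_http_server(8000)            # metrics served at :8000/metrics
    while True:
        handle_request()
```

A Prometheus scrape job pointed at port 8000 would then feed these series into Grafana dashboards and alert rules.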
Posted 3 hours ago
8.0 years
0 Lacs
Thiruvananthapuram, Kerala, India
On-site
Required Skills & Qualifications: 8+ years of experience in Product Management, preferably in AI, EdTech, SaaS, or related industries. Strong experience in full-stack web & mobile app development collaboration (Node.js, React.js, Flutter). Expertise in Agile methodologies (Scrum, Kanban, SAFe, etc.) and project management tools (JIRA, Git). Experience in cloud technologies (Azure) and DevOps best practices. Strong analytical mindset with proficiency in data-driven decision-making tools. Excellent communication, stakeholder management, and leadership skills. Bachelor's or Master’s degree in Computer Science, AI, Business, or a related field. Key Responsibilities: Product Strategy & Execution: Develop a scalable product roadmap that integrates AI and personalized learning models to improve user engagement. LLM & EdTech Innovation: Leverage Large Language Models (LLMs) to enhance learning experiences with AI-driven tutoring and adaptive learning. Agile Leadership: Implement Scrum/Kanban best practices to drive product execution and development efficiency. Full-Stack & Mobile Development Collaboration: Work closely with engineers on Node.js, React.js (web), Flutter (mobile), and backend architecture to ensure seamless feature integration. Cloud & Infrastructure Management: Partner with DevOps teams to optimize cloud solutions on Azure, GCP, and AWS for scalability and performance. Data-Driven Personalization: Utilize AI-driven analytics to create personalized learning experiences for students. Project Management & DevOps: Oversee product workflows using JIRA, Git, and CI/CD pipelines to ensure smooth deployment and feature releases. User Research & Feedback Loop: Analyze customer feedback, conduct A/B testing, and use data-driven insights to optimize product performance. Go-To-Market Strategy: Lead product launches and collaborate with marketing teams to drive adoption and user growth. Preferred Qualifications: Experience in AI-powered learning systems, LMS, or adaptive learning platforms. Exposure to DevOps, CI/CD pipelines, and cloud-based deployment strategies. Experience in fast-paced startup environments with a strong innovation mindset.
Posted 3 hours ago
5.0 years
0 Lacs
Kerala, India
On-site
Description: We are seeking an experienced and versatile DevOps Engineer with a strong background in both AWS and Azure environments. The ideal candidate will have hands-on experience with CI/CD pipelines, Kubernetes, Linux systems, monitoring/logging tools, and Infrastructure as Code (IaC) technologies. Key Responsibilities: • Design, implement, and manage scalable, secure cloud infrastructure across AWS and Azure • Build and maintain CI/CD pipelines using tools such as Jenkins, GitHub Actions, Azure DevOps, or similar • Deploy, monitor, and manage containerized applications using Kubernetes • Implement and maintain logging, monitoring, and alerting solutions (e.g., Prometheus, Grafana, ELK, CloudWatch) • Automate infrastructure provisioning using IaC tools like Terraform, ARM, or CloudFormation • Collaborate with development and operations teams to ensure smooth and secure deployments • Troubleshoot infrastructure and deployment issues across environments Required Skills: • 5+ years of relevant experience in DevOps, SRE, or cloud engineering roles • Strong hands-on experience in both AWS and Azure • Proficiency in at least one CI/CD technology • Solid understanding of Kubernetes and container orchestration • Skilled in Linux system administration and scripting • Experience with one or more IaC tools (Terraform preferred) • Familiarity with logging and monitoring stacks (e.g., ELK, Prometheus, Azure Monitor)
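To make the automation aspect of this role concrete, here is a hedged boto3 sketch of the kind of provisioning-hygiene check such an engineer might script: flagging untagged EC2 instances. It assumes AWS credentials are already configured in the environment; the region and usage are illustrative.

```python
# Hedged sketch: flag EC2 instances with no tags, a common IaC hygiene
# check before cost allocation or Terraform import work.
import boto3

def untagged_instances(region: str = "us-east-1") -> list[str]:
    ec2 = boto3.client("ec2", region_name=region)
    flagged = []
    # Paginate so the audit works in accounts with many instances.
    for page in ec2.get_paginator("describe_instances").paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                if not instance.get("Tags"):   # Tags key absent when untagged
                    flagged.append(instance["InstanceId"])
    return flagged

if __name__ == "__main__":
    print("untagged:", untagged_instances())
```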
Posted 3 hours ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Your contributions to organisation's growth:
Maintain & develop data platforms based on Microsoft Fabric for Business Intelligence & Databricks for real-time data analytics. Design, implement and maintain standardized production-grade data pipelines using modern data transformation processes and workflows for SAP, MS Dynamics, on-premise or cloud. Develop an enterprise-scale cloud-based Data Lake for business intelligence solutions. Translate business and customer needs into data collection, preparation and processing requirements. Optimize the performance of algorithms developed by Data Scientists. General administration and monitoring of the data platforms.
Competencies:
Working with structured & unstructured data. Experienced in various database technologies (RDBMS, OLAP, Timeseries, etc.). Solid programming skills (Python, SQL; Scala is a plus). Experience in Microsoft Fabric (incl. Warehouse, Lakehouse, Data Factory, DataFlow Gen2, Semantic Model) and/or Databricks (Spark). Proficient in Power BI. Experienced working with APIs. Proficient in security best practices. Data-centered Azure know-how is a plus (Storage, Networking, Security, Billing).
Expertise you have to bring in along with:
Bachelor or Master degree in business informatics, computer science, or equivalent. A background in software engineering (e.g., agile programming, project organization) and experience with human-centered design would be desirable. Extensive experience in handling large data sets. Experience working at least 5 years as a data engineer, preferably in an industrial company. Analytical problem-solving skills and the ability to assimilate complex information. Programming experience in modern data-oriented languages (SQL, Python). Experience with Apache Spark and DevOps. Proven ability to synthesize complex data; advanced technical skills related to data modelling, data mining, database design and performance tuning. English language proficiency.
Special requirements:
High quality mindset paired with strong customer orientation, critical thinking, and attention to detail. Understanding of data processing at scale. Influence without authority. Willingness to acquire additional system/technical knowledge as needed. Problem solver. Experience working in an international organization and in multi-cultural teams. Proactive, creative and innovative.
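For a flavour of the standardized pipelines this posting describes, a minimal PySpark sketch follows; the landing path, column names, and output location are all hypothetical.

```python
# Illustrative PySpark transformation: raw landing-zone CSV to a curated
# daily aggregate. Paths and columns are invented for the example.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("erp-ingest").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("/mnt/landing/sap/orders.csv")      # hypothetical landing zone
)

cleaned = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["order_id"])            # idempotent re-ingestion
    .filter(F.col("amount") > 0)
)

# Daily aggregate feeding a downstream BI semantic model.
daily = cleaned.groupBy(F.to_date("order_ts").alias("order_date")).agg(
    F.sum("amount").alias("revenue"), F.count("*").alias("orders")
)
daily.write.mode("overwrite").parquet("/mnt/curated/orders_daily")
```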
Posted 3 hours ago
5.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Company: A Large Global Organization. Key Skills: Azure DevOps, Kubernetes, Terraform, Azure Cloud. Roles & Responsibilities: Design and implement CI/CD pipelines using Azure DevOps. Manage and optimize cloud infrastructure on Azure. Collaborate with cross-functional teams to enhance application lifecycle management processes. Ensure the security and scalability of development and production environments. Troubleshoot and resolve issues related to cloud infrastructure and CI/CD processes. Experience Requirement: 5-10 years of experience in Azure DevOps and related tools. Proven expertise in designing and managing scalable CI/CD pipelines. Experience in managing infrastructure as code using tools such as Terraform. Strong troubleshooting skills with cloud infrastructure and deployment issues. Exposure to containerization technologies such as Kubernetes is a plus. Education: B.Tech & M.Tech (Dual), or B.Tech.
Posted 3 hours ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Project Role: Infrastructure Architect
Project Role Description: Lead the definition, design and documentation of technical environments. Deploy solution architectures, conduct analysis of alternative architectures, create architectural standards, define processes to ensure conformance with standards, institute solution-testing criteria, define a solution's cost of ownership, and promote a clear and consistent business vision through technical architectures.
Must have skills: NetApp Network Attached Storage (NAS) Administration
Good to have skills: NA
Minimum 2 year(s) of experience is required. Educational qualification: 15 years full-time education.
Summary: As an Infrastructure Architect, you will lead the definition, design, and documentation of technical environments. Your typical day will involve collaborating with various teams to deploy solution architectures, conducting analyses of alternative architectures, and creating architectural standards. You will also define processes to ensure conformance with these standards, institute solution-testing criteria, and promote a clear and consistent business vision through technical architectures. Your role will be pivotal in shaping the technical landscape of the organization, ensuring that all solutions align with business objectives and technical requirements.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Develop and maintain comprehensive documentation of architectural designs and standards.
Professional & Technical Skills:
- Daily Backup Management & Operations: Monitor and manage daily backup jobs, ensuring successful completion and addressing any issues promptly.
- Restoration & Recovery: Execute data restoration processes, ensuring timely and accurate recovery of data as needed.
- Cloud Subscription Management: Manage cloud subscriptions related to Rubrik, ensuring proper integration and cost optimization.
- Backup Configuration: Configure and manage backups for Windows virtual machines. Configure Rubrik backup solutions for Azure virtual machines. Configure and manage Rubrik backups for NAS shares. Set up Rubrik backups for Active Directory (AD) domain controllers.
- Hardware Installation: Install and configure Rubrik hardware appliances.
- Vendor Collaboration: Coordinate with the Rubrik vendor for support, troubleshooting, and escalations.
- Backup Failure Management: Identify, troubleshoot, and resolve backup failures, ensuring minimal impact on business operations.
- Reporting: Generate and review daily backup reports from the Rubrik platform.
- Secure Cloud Management: Manage Rubrik Secure Cloud (RSC) and ensure compliance with data security policies.
- Rubrik Cloud Vault Management: Oversee and maintain Rubrik Cloud Vault, ensuring data integrity and availability.
Qualifications:
- Proven experience with Rubrik backup solutions and technologies.
- Strong understanding of backup and recovery processes, including cloud and on-premises environments.
- Hands-on experience configuring backups for Windows and Azure virtual machines, NAS shares, and AD domain controllers.
- Knowledge of Rubrik hardware installation and maintenance.
- Familiarity with Rubrik Secure Cloud and Cloud Vault management.
- Excellent troubleshooting and problem-solving skills.
- Strong collaboration and communication abilities for vendor and stakeholder interaction.
Additional Information:
- The candidate should have a minimum of 2 years of experience in NetApp Network Attached Storage (NAS) Administration.
- This position is based at our Gurugram office.
- A 15 years full-time education is required.
Posted 3 hours ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Project Role: Infrastructure Architect
Project Role Description: Lead the definition, design and documentation of technical environments. Deploy solution architectures, conduct analysis of alternative architectures, create architectural standards, define processes to ensure conformance with standards, institute solution-testing criteria, define a solution's cost of ownership, and promote a clear and consistent business vision through technical architectures.
Must have skills: NetApp Network Attached Storage (NAS) Administration
Good to have skills: NA
Minimum 3 year(s) of experience is required. Educational qualification: 15 years full-time education.
Job Description: Rubrik Backup Administrator
Role Overview: We are seeking a skilled and experienced Rubrik Backup Administrator to manage and operate daily backup activities, ensure data protection and recovery, and collaborate with the Rubrik vendor. The ideal candidate will have extensive knowledge of Rubrik backup solutions, cloud subscriptions, and backup configurations across various environments.
Key Responsibilities:
- Daily Backup Management & Operations: Monitor and manage daily backup jobs, ensuring successful completion and addressing any issues promptly.
- Restoration & Recovery: Execute data restoration processes, ensuring timely and accurate recovery of data as needed.
- Cloud Subscription Management: Manage cloud subscriptions related to Rubrik, ensuring proper integration and cost optimization.
- Backup Configuration: Configure and manage backups for Windows virtual machines. Configure Rubrik backup solutions for Azure virtual machines. Configure and manage Rubrik backups for NAS shares. Set up Rubrik backups for Active Directory (AD) domain controllers.
- Hardware Installation: Install and configure Rubrik hardware appliances.
- Vendor Collaboration: Coordinate with the Rubrik vendor for support, troubleshooting, and escalations.
- Backup Failure Management: Identify, troubleshoot, and resolve backup failures, ensuring minimal impact on business operations.
- Reporting: Generate and review daily backup reports from the Rubrik platform.
- Secure Cloud Management: Manage Rubrik Secure Cloud (RSC) and ensure compliance with data security policies.
- Rubrik Cloud Vault Management: Oversee and maintain Rubrik Cloud Vault, ensuring data integrity and availability.
Qualifications: Proven experience with Rubrik backup solutions and technologies. Strong understanding of backup and recovery processes, including cloud and on-premises environments. Hands-on experience configuring backups for Windows and Azure virtual machines, NAS shares, and AD domain controllers. Knowledge of Rubrik hardware installation and maintenance. Familiarity with Rubrik Secure Cloud and Cloud Vault management. Excellent troubleshooting and problem-solving skills. Strong collaboration and communication abilities for vendor and stakeholder interaction.
Preferred Skills: Certifications related to Rubrik or data backup technologies. Experience working in enterprise-level IT environments. Knowledge of cloud platforms like Azure and AWS.
Work Environment: Hybrid work model. Collaboration with cross-functional teams and external vendors.
Additional Information:
- This position is based at our Gurugram office.
- A 15 years full-time education is required.
Posted 3 hours ago
4.0 - 6.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
Experience Required: 4-6 years Location: Gurgaon Department: Product and Engineering Working Days: Alternate Saturdays Working (1st and 3rd) 🔧 Key Responsibilities Design, implement, and maintain highly available and scalable infrastructure using AWS Cloud Services. Build and manage Kubernetes clusters (EKS, self-managed) to ensure reliable deployment and scaling of microservices. Develop Infrastructure-as-Code using Terraform, ensuring modular, reusable, and secure provisioning. Containerize applications and optimize Docker images for performance and security. Ensure CI/CD pipelines (Jenkins, GitHub Actions, etc.) are optimized for fast and secure deployments. Drive SRE principles including monitoring, alerting, SLIs/SLOs, and incident response. Set up and manage observability tools (Prometheus, Grafana, ELK, Datadog, etc.). Automate routine tasks with scripting languages (Python, Bash, etc.). Lead capacity planning, auto-scaling, and cost optimization efforts across cloud infrastructure. Collaborate closely with development teams to enable DevSecOps best practices. Participate in on-call rotations, handle outages with calm, and conduct postmortems. 🧰 Must-Have Technical Skills Kubernetes (EKS, Helm, Operators) Docker & Docker Compose Terraform (modular, state management, remote backends) AWS (EC2, VPC, S3, RDS, IAM, CloudWatch, ECS/EKS) Linux system administration CI/CD pipelines (Jenkins, GitLab CI, GitHub Actions) Logging & monitoring tools: ELK, Prometheus, Grafana, CloudWatch Site Reliability Engineering practices Load balancing, autoscaling, and HA architectures 💡 Good-To-Have GCP or Azure exposure Service Mesh (Istio, Linkerd) Secrets management (Vault, AWS Secrets Manager) Security hardening of containers and infrastructure Chaos engineering exposure Knowledge of networking (DNS, firewalls, VPNs) 👤 Soft Skills Strong problem-solving attitude; calm under pressure Good documentation and communication skills Ownership mindset with a drive to automate everything Collaborative and proactive with cross-functional teams
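Since the posting calls out SLIs/SLOs, here is a small worked sketch of the error-budget arithmetic behind burn-rate alerting; the traffic numbers are invented, and the 14.4x fast-burn threshold follows the convention popularized by the Google SRE Workbook rather than any company standard.

```python
# Sketch of multiwindow burn-rate math for a 99.9% availability SLO.
SLO = 0.999
ERROR_BUDGET = 1 - SLO           # 0.1% of requests may fail

def burn_rate(errors: int, requests: int) -> float:
    """How fast the error budget is being consumed (1.0 = exactly on budget)."""
    if requests == 0:
        return 0.0
    return (errors / requests) / ERROR_BUDGET

# Example: 42 failures out of 18,000 requests in the last hour.
rate = burn_rate(errors=42, requests=18_000)
if rate > 14.4:                  # fast-burn paging threshold (SRE Workbook)
    print(f"PAGE: burn rate {rate:.1f}x")
elif rate > 1.0:
    print(f"WARN: burn rate {rate:.1f}x")   # prints WARN at ~2.3x here
```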
Posted 3 hours ago
20.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Country/Region: IN. Requisition ID: 28533. Location: INDIA - BENGALURU - BIRLASOFT OFFICE.
Title: Enterprise Architect
About Birlasoft: Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities.
Job Summary: We are looking for an Enterprise Architect with 20+ years of experience in architecting, deploying, and optimizing scalable and reliable applications in the Azure cloud following the Well-Architected Framework, who can lead end-to-end application modernization initiatives, transforming monolithic Java/J2EE applications into scalable microservices-based architectures using domain-driven design.
Job Title: Enterprise Architect. Location: Noida/Mumbai/Pune/Bangalore.
Required Skills & Experience: 20+ years of experience in software architecture and enterprise-level solution design. Proven expertise in Java, J2EE, Struts, JSP, and modern Java frameworks. Strong experience in microservices architecture, containerization (Docker, Kubernetes), and API management. Hands-on experience with Azure cloud services, including compute, storage, networking, and database services. Deep understanding of DevSecOps practices and tools integration (Azure DevOps, Jenkins, Sonar, and other related tools). Experience with CAST and vFunction for application analysis and modernization. Strong background in database migration, especially from Oracle ExaCC to PostgreSQL or other cloud-native databases. Excellent communication and stakeholder management skills. Ability to lead and influence technical teams and business stakeholders.
Preferred Qualifications: Azure Solutions Architect Expert certification. Experience in the BFSI industry is a plus. Familiarity with Agile and SAFe methodologies.
Posted 3 hours ago
12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Us: RocketFrog.ai is an AI Studio for Business, delivering cutting-edge AI solutions in Healthcare, Pharma, BFSI, Hi-Tech, and Consumer Services industries. We specialize in Machine Learning, Deep Learning, and AI-driven Product Development to create real business impact. 🚀 Ready to take a Rocket Leap with Science? Role Overview: We are seeking a Solution Architect with deep experience in Agentic AI systems , a strong foundation in Machine Learning , and a proven track record of owning and delivering production-grade AI solutions. This role requires strategic architectural thinking, delivery leadership, and the ability to engage in both pre-sales solutioning and end-to-end execution for client projects. Key Responsibilities: Lead end-to-end ownership of AI solution delivery for a client project (typically involving 2 to 3 scrum teams) — from requirement gathering and architecture to deployment and monitoring Architect and deliver enterprise-grade Agentic AI systems across reasoning, orchestration, automation, and backend integration layers Drive solution architecture during pre-sales , working closely with clients and sales teams to define compelling and feasible technical proposals Make critical decisions on technology stacks, frameworks, design patterns, and system architecture strategies Connect the dots across multiple experimental threads and initiatives — simplifying complexity and aligning all stakeholders to the architectural vision Define architectural boundaries , APIs, messaging layers, and data schemas using modern design principles Ensure production readiness with observability, performance tuning, explainability, and fault tolerance built-in Mentor engineering teams , guiding implementation, reviewing code, and ensuring architectural integrity Required Skills & Expertise: Deep understanding of Agentic AI concepts : cognitive agents, task planning, multi-agent orchestration (MCP), agent-to-agent (A2A) communication Working knowledge of Amazon Bedrock (including Bedrock Flows, AgentCore) or corresponding equivalents from Azure such as Azure Logic Apps and Azure AI Foundry Strong knowledge of LLM frameworks and ecosystems : LangChain, HuggingFace, OpenAI APIs, RAG pipelines Strong machine learning fundamentals : model lifecycle, integration, feature engineering, and evaluation Hands-on experience with CI/CD pipelines , containerization (Docker), orchestration (Kubernetes), and infrastructure as code Experience with system architecture : microservices, REST/gRPC APIs, messaging queues, event-driven design Strong programming skills in Python , with exposure to FastAPI, Celery, Redis, Kafka Excellent communication and leadership skills to work across engineering, product, and business functions Preferred Qualifications: Full-stack development experience with modern frontend/backend frameworks Familiarity with enterprise ERP/CRM platforms like Salesforce, Workday, or Microsoft Dynamics Exposure to Responsible AI , explainability (SHAP/LIME), and model observability (Prometheus, WhyLabs) Domain experience in Healthcare, Pharma, BFSI, or Consumer Services Required Background: Bachelor’s or Master’s in Computer Science, AI, or a related technical discipline 7–12 years of total experience, including 2–3+ years in AI/ML solution architecture or delivery leadership Proven delivery of at least one end-to-end production-grade AI or intelligent automation solution Why Join RocketFrog.ai?: Architect next-gen Agentic AI solutions for real-world impact Collaborate with elite AI talent and 
visionary clients. Lead innovation that blends research, engineering, and enterprise value. 👉 If you're ready to shape the future of intelligent systems — let’s connect!
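To ground the agentic pattern this role architects, here is a toy Python sketch of a plan/act loop with tool dispatch; call_llm is a stub standing in for any hosted model API (Bedrock, OpenAI, and so on), and every name in it is hypothetical.

```python
# Toy agent loop: plan a step, dispatch a tool, feed the observation back.
from typing import Callable

def call_llm(prompt: str) -> str:
    # Stub: a real system would call a hosted model here.
    if "documents found" in prompt:
        return "DONE"
    return "TOOL:search|data pipelines"

TOOLS: dict[str, Callable[[str], str]] = {
    "search": lambda q: f"3 documents found for '{q}'",
}

def run_agent(goal: str, max_steps: int = 5) -> None:
    context = goal
    for _ in range(max_steps):
        decision = call_llm(f"plan next step for: {context}")
        if decision == "DONE":
            print("agent finished")
            return
        tool_name, arg = decision.removeprefix("TOOL:").split("|", 1)
        observation = TOOLS[tool_name](arg)   # act, then observe
        context += "\n" + observation
        print("observed:", observation)
    print("step budget exhausted")            # fault-tolerance guard

run_agent("summarize our data pipelines")
```

Production systems layer orchestration frameworks, retries, and observability on top of exactly this loop; the sketch only shows the control flow.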
Posted 3 hours ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Project Role: Infrastructure Engineer
Project Role Description: Assist in defining requirements, designing and building data center technology components, and testing efforts.
Must have skills: Linux, Microsoft Windows Server Administration, SUSE Linux Administration, Red Hat OS Administration
Good to have skills: NA
Minimum 5 year(s) of experience is required. Educational qualification: 15 years full-time education.
Summary: As an Infrastructure Engineer, you will assist in defining requirements, designing and building data center technology components, and testing efforts. Your typical day will involve collaborating with team members, contributing to key decisions, and providing solutions to problems across multiple teams.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Implement new technologies to enhance infrastructure efficiency.
- Conduct regular system audits to ensure compliance and security.
- Troubleshoot and resolve infrastructure issues in a timely manner.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Linux, Microsoft Windows Server Administration, SUSE Linux Administration, Red Hat OS Administration.
- Strong understanding of network protocols and configurations.
- Experience in virtualization technologies such as VMware or Hyper-V.
- Knowledge of cloud computing platforms like AWS or Azure.
- Ability to automate tasks using scripting languages like Python or PowerShell.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Linux.
- This position is based at our Noida office.
- A 15 years full-time education is required.
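As a small example of the scripted automation this role mentions, here is a stdlib-only Python sketch that audits disk usage and flags filesystems above a threshold; the mount points and threshold are illustrative.

```python
# Disk-usage audit: flag filesystems above a utilization threshold.
import shutil

def check_disk(paths: list[str], threshold: float = 0.85) -> None:
    for path in paths:
        usage = shutil.disk_usage(path)        # total, used, free in bytes
        fraction = usage.used / usage.total
        status = "ALERT" if fraction > threshold else "ok"
        print(f"{status:5} {path} {fraction:.0%} used")

if __name__ == "__main__":
    check_disk(["/", "/var"])  # typical mount points on a Linux host
```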
Posted 3 hours ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Administration of the One Identity tool and management of integrated identities and services. Engineering support of the One Identity Manager environment. Management of cloud and on-prem infrastructures hosting IAM. Onboarding of company organizations to the IAM infrastructure. Understanding of the whole IAM environment, Active Directory multi-forest environments at an enterprise level, Windows OS, IIS, and MS SQL Server. Monitoring, reporting, and analysis of bugs during and after IAM release versions. Performance management of IAM tools, databases, and infrastructure. Administration of identities and services integrated with the One IDM tool. Support for organization integration with the IAM infrastructure. Collaborate and work with onshore development and project teams to provide solutions and assist during project releases, testing, and operational support. Responsible for management of incident, problem, and change within the IAM infrastructure. Responsible for documentation and updates of IAM processes and operating procedures. Work with software development tools (e.g., JIRA) and handle various IAM-related tasks.
Experience: 5 or more years in enterprise IT with a core focus on IAM technologies like One Identity or similar IAM tools.
Qualifications: Graduate degree in computer science, information technology, or a similar field. Certification in the security domain is a plus. Cloud certifications and knowledge in Azure and AWS are an advantage. Microsoft certifications on designing infrastructure solutions and administering and managing server and cloud infrastructure are an advantage. Understanding of Agile and similar industry standards.
Technical
• Experience in One Identity tool (preferred) operations or similar IAM tools.
• Knowledge of Windows server technologies.
• Knowledge of Microsoft Active Directory.
• Knowledge of DNS, TCP/IP, and network technologies.
• Knowledge of MS-SQL (single and cluster configuration) database technologies.
• Knowledge of incident, problem, and change process handling.
Functional / Domain
Experience in IAM solutions with strong knowledge of IAM concepts and an understanding of security, risks, and governance.
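For context on the directory work above, here is a hedged Python sketch using the ldap3 library to list disabled Active Directory accounts, a routine identity-hygiene check in IAM operations; the host, base DN, and credentials are placeholders.

```python
# Hedged sketch: find disabled AD accounts via LDAP (ldap3 library).
from ldap3 import Server, Connection, SUBTREE

server = Server("ldaps://dc01.example.com")   # placeholder domain controller
conn = Connection(server, user="EXAMPLE\\svc_iam", password="***",
                  auto_bind=True)

# userAccountControl bit 2 marks an account as disabled; the OID below is
# the standard LDAP bitwise-AND matching rule.
conn.search(
    search_base="DC=example,DC=com",
    search_filter="(&(objectClass=user)"
                  "(userAccountControl:1.2.840.113556.1.4.803:=2))",
    search_scope=SUBTREE,
    attributes=["sAMAccountName"],
)
for entry in conn.entries:
    print(entry.sAMAccountName)
```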
Posted 4 hours ago
0 years
0 Lacs
Bengaluru, Karnataka
On-site
KEY RESPONSIBILITIES Conduit between user needs, technical capabilities and engineering requirements, developing these into roadmaps and requirements for the Agile backlog of a development team. Makes tough decisions on trade-offs between features, quality, and dates in order to provide the best solution for stakeholders and customers. Owns all communications, issue resolutions, priority decisions and roadmap discussions with key stakeholders. Partners with the business teams and customers during the creation of their product strategies, product visions and product goals. Creates prioritized technical roadmaps and requirements that aligns with the product vision and goals. Supports the measurement of product benefits to determine if product goals are achieved. • Primary liaison between business stakeholders and technical teams. • Communicates the product vision and goals to technical teams to ensure clear understanding of goals. Ensures that products being built bring value and meet the product goals. Collaborates with stakeholders on priorities and requirements. Owns all aspects of the product backlog: Priorities, backlog refinement, iteration planning, breaks down epics and features into manageable user stories, creates acceptance criteria and manages story acceptance, holds technology demos, etc. Leads team through technology transformation efforts which includes the creation of micro services and APIs for Cloud migration initiatives. Owns the communication of build progress and status to stakeholders and leaders. Escalates risks as needed. Provides leadership with the information needed for them to be able to help resolve team issues and risks. Attends customer meetings as required in order to understand customer needs and create products that drive value and deliver benefits. Provides consultative sessions to customers and vendors in partnership with Product Managers as a recognized Subject Matter Expert in regard to their assigned product portfolio and direction. • Takes an informal mentorship role with less experienced Product Owners as needed. • Energizes team to accomplish goals. PRIMARY RESULTS ACCOUNTABLE FOR ACHIEVING Maintains an accurate and prioritized product roadmap and backlog. Ensures technology engineering teams are working on the most important product features. Aligns product development efforts with the larger product strategy and vision. Creates value-based user stories that explain what users are doing and why to ensure proper business context drives appropriate product designs. Validates that the enhancements that have been rolled out have met the needs of our customers and vendors per feedback from the Product Manager. Maintains software/tools used by product managers, product owners and technical teams to efficiently manage work. Works directly with internal stakeholders, providing clearly guided AGILE management, influencing SDLC process, maintaining product management software, ensuring well groomed/prioritized backlog management, high quality product delivery, sharing of knowledge and a willingness to always undertake any team initiative. Possesses a willingness to empower as well as encourages commitments and positive attitudes in the team. JOB REQUIREMENTS Education & Certifications: Education or job knowledge equivalent to college or university undergraduate education. 
Preferred Certifications: SAFe Certifications Product Manager/Product Owner Certifications - Pragmatic Certified, Certified Product Owner (CPO or CSPO), Professional Scrum Product Owner (PSPO), etc. Cloud Certifications (Microsoft Azure preferred)
Posted 4 hours ago
25.0 years
30 - 50 Lacs
Pune, Maharashtra, India
On-site
Experience: 15–25 Years Locations: Hyderabad, Pune, Bangalore Work Mode: Hybrid Role Overview We are hiring a Director / Associate Director – Data Engineering to lead the architecture, delivery, and team management across multiple data engineering engagements. The ideal candidate will bring deep expertise in designing modern data platforms, building scalable data pipelines, and leading teams across technologies and cloud environments (AWS, Azure, or GCP). This is a client-facing leadership role focused on delivering enterprise-scale data engineering solutions and growing high-performing teams. Key Responsibilities Lead end-to-end data engineering strategy and solution delivery for multiple projects. Collaborate with enterprise clients to define data architectures, platform modernization strategies, and transformation roadmaps. Drive the design and implementation of cloud-native data pipelines, data lakes, and data warehouses. Manage project budgets, resourcing, timelines, and stakeholder communications. Build, lead, and mentor high-performing data engineering teams across cloud platforms. Define and enforce engineering standards, governance, and best practices. Ensure quality, scalability, and performance across all data solutions. Required Skills & Qualifications Experience & Leadership: 10–15 years in data engineering or data platform development, including at least 3+ years in director-level or senior leadership roles. Proven success in delivering large-scale, cloud-based data solutions. Strong people leadership, client management, and delivery ownership experience. Data Engineering & Tools Strong experience building ETL/ELT pipelines using tools like Spark, PySpark, Python, SQL, Airflow, dbt, etc. Solid knowledge of data lake/lakehouse/warehouse architectures. Experience with batch and real-time streaming data processing. Strong data modeling (star/snowflake schema), data partitioning, and performance tuning. Cloud & Platform Expertise Experience with any major cloud provider – AWS, Azure, or Google Cloud Platform (GCP). Deep understanding of cloud-native data architecture, including storage, compute, networking, and security considerations. Hands-on experience with services like: AWS: Redshift, Glue, S3, Lambda, EMR, Kinesis Azure: Data Lake, Synapse, ADF, Databricks, Event Hub GCP: BigQuery, Dataflow, Pub/Sub, Composer Governance & Quality Knowledge of data cataloging, data quality, lineage, and governance frameworks. Familiarity with security and compliance requirements (e.g., GDPR, HIPAA) in data platforms. Preferred Qualifications Certifications in cloud platforms (AWS, Azure, or GCP). Experience with CI/CD pipelines, DevOps for data, and infrastructure as code (Terraform/CloudFormation). Exposure to analytics and BI tools (Power BI, Tableau, Looker). Familiarity with data mesh, data fabric, or modern data architecture patterns. Skills: azure,spark,data quality,data warehouses,terraform,data governance,airflow,python,ci/cd,elt,architecture,dbt,leadership,data engineering,data modeling,data architecture,cloudformation,pyspark,cloud platforms,data lakes,sql,gcp,etl,pipelines,devops,cloud,aws
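Among the tools this posting lists (Spark, Airflow, dbt), a minimal Airflow DAG skeleton gives a feel for the batch pipelines in question; the task bodies, DAG id, and schedule are placeholders, and the `schedule` argument assumes Airflow 2.4+.

```python
# Illustrative Airflow DAG skeleton: extract -> transform -> load.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    print("pull from source systems")     # placeholder task body

def transform() -> None:
    print("clean and model the data")

def load() -> None:
    print("write to the warehouse")

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3                        # explicit task dependencies
```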
Posted 4 hours ago
15.0 years
25 - 50 Lacs
Pune, Maharashtra, India
On-site
Role: Director / Associate Director - Cloud Engineering (Azure Stack)
Experience: 10–15 Years. Locations: Hyderabad, Pune, Bangalore. Work Mode: Hybrid.
Role Overview
We are seeking a Director / Associate Director – Cloud Engineering (Azure Stack) with a strong specialization in Databricks to lead multiple client engagements, drive end-to-end project delivery, and manage high-performing engineering teams. This strategic leadership role requires hands-on technical expertise, excellent project and people management skills, and a track record of delivering large-scale data engineering solutions in cloud-native environments.
Experience & Leadership
14+ years in data engineering, including 3+ years in leadership/director-level roles. Proven experience with Databricks, Delta Lake, and cloud data architecture. Strong track record of project delivery, team management, and client success. Excellent communication and leadership skills in fast-paced environments.
Responsibilities
Oversee and deliver multiple Databricks-based data engineering projects. Manage project budgets, costing, staffing, and client expectations. Lead and mentor engineering teams across engagements. Collaborate with clients on architecture, strategy, governance, and reporting. Ensure high-quality delivery aligned with best practices and business value.
Required Skills
Strong hands-on experience with Databricks. Deep hands-on experience with Azure data services. Strong background in building scalable ETL/ELT pipelines. Proven experience in designing data warehouses or lakehouses. Hands-on experience implementing Unity Catalog. Ability to work with business teams to build dashboards.
Qualifications
Databricks – Full-platform expertise for scalable data solutions: Strong hands-on experience with Databricks for building and managing ETL pipelines, Delta Lake, notebooks, and job orchestration. Skilled in cluster optimization, workspace management, and integrating Databricks with Azure services.
Cloud – Azure (preferred) or similar cloud environments: Deep hands-on experience with Azure data services such as Azure Data Lake, Azure Synapse, Azure Data Factory, and integration with Databricks. Ability to design and deploy cloud-native data architectures.
Data Engineering – Spark, PySpark, and Python for scalable data processing: Strong background in building scalable, high-performance ETL/ELT pipelines using Spark and PySpark. Ability to write optimized, production-grade Python code for data transformation, orchestration, and automation in distributed environments.
Data Warehousing & SQL – Designing and querying enterprise data models: Proven experience in designing data warehouses or lakehouses, dimensional modeling, and writing complex SQL queries for analytics and reporting.
Governance – Implementation and management of Unity Catalog: Hands-on experience implementing Unity Catalog for managing metadata, access control, and data lineage in Databricks.
Reporting Tools – Power BI or similar (Tableau, Looker, etc.): Ability to work with business teams to build insightful dashboards and visualizations using Power BI.
Skills: cloud engineering,azure,delta lake,data engineering,sql,python,cloud architecture,elt,unity catalog,power bi,etl,spark,directors,databricks,pyspark,azure data services,data warehouse,leadership,reporting tools
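As a concrete touchpoint for the Databricks and Unity Catalog items above, here is a hedged PySpark sketch of a Delta write against Unity Catalog's three-level namespace (catalog.schema.table); it assumes it runs inside a Databricks notebook or job, and all table names are invented.

```python
# Hedged sketch: bronze-to-silver Delta write under Unity Catalog.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Inside Databricks this returns the preconfigured cluster session.
spark = SparkSession.builder.getOrCreate()

orders = spark.read.table("main.bronze.raw_orders")   # hypothetical source

silver = (
    orders.dropDuplicates(["order_id"])
    .withColumn("ingested_at", F.current_timestamp())
)

# Delta table managed by Unity Catalog; access control and lineage are
# then governed at the catalog level rather than per job.
silver.write.format("delta").mode("overwrite").saveAsTable("main.silver.orders")
```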
Posted 4 hours ago
4.0 years
0 Lacs
Greater Ahmedabad Area
On-site
Position: Data Engineer Location: Ahmedabad (On-site at office) Experience: 4+ years of relevant experience Job Description: We are looking for a skilled Data Engineer responsible for building ETLs, data pipelines, and setting automation schedules for data collection. The ideal candidate will have expertise in Python for applying transformation techniques, and proficiency in database management systems such as SQL, MySQL, and MongoDB. Experience with GCP cloud services, including BigQuery and Cloud Functions, is essential. Responsibilities: Assemble large, complex datasets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes. Build the necessary infrastructure for optimal extraction, transformation, and loading (ETL) of data from various sources using GCP/Azure and SQL technologies. Develop analytical tools to utilize the data pipeline, providing actionable insights into key business performance metrics such as operational efficiency and customer acquisition. Collaborate with stakeholders (data, design, product, and executive teams) to address data-related technical issues. Understand visualization requirements and translate them into actionable data pipelines. Qualifications and Skills: Strong expertise in SQL and Python. Knowledge of both relational (e.g., SQL, BigQuery) and non-relational (e.g., MongoDB) database management systems. Experience with GCP cloud services, including BigQuery and Cloud Functions.
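For the GCP side of this posting, a minimal sketch using the official google-cloud-bigquery client shows the shape of a scheduled collection step; the project, dataset, and query are placeholders.

```python
# Minimal BigQuery read using the official client library.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")        # hypothetical project

query = """
    SELECT DATE(created_at) AS day, COUNT(*) AS signups
    FROM `my-project.analytics.users`
    GROUP BY day
    ORDER BY day DESC
    LIMIT 7
"""

for row in client.query(query).result():              # blocks until done
    print(row.day, row.signups)
```

In practice a Cloud Function or scheduler would wrap a step like this and write the result onward into the pipeline.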
Posted 4 hours ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
Data Engineering – Snowflake – Staff Position
EY’s GDS Tax Technology team’s mission is to develop, implement and integrate technology solutions that better serve our clients and engagement teams. As a member of EY’s core Tax practice, you’ll develop a deep tax technical knowledge and outstanding database, data analytics and programming skills. Ever-increasing regulations require tax departments to gather, organize and analyse more data than ever before. Often the data necessary to satisfy these ever-increasing and complex regulations must be collected from a variety of systems and departments throughout an organization. Effectively and efficiently handling the variety and volume of data is often extremely challenging and time consuming for a company. EY's GDS Tax Technology team members work side-by-side with the firm's partners, clients and tax technical subject matter experts to develop and incorporate technology solutions that enhance value-add, improve efficiencies and enable our clients with disruptive and market leading tools supporting Tax. GDS Tax Technology works closely with clients and professionals in the following areas: Federal Business Tax Services, Partnership Compliance, Corporate Compliance, Indirect Tax Services, Human Capital, and Internal Tax Services. GDS Tax Technology provides solution architecture, application development, testing and maintenance support to the global TAX service line both on a pro-active basis and in response to specific requests. EY is currently seeking a Data Engineer - Staff to join our Tax Technology practice in India.
Key Skillset (Must Have)
Hands-on expertise in developing and implementing data solutions and data warehouses using Snowflake. Implement data integration solutions using Snowflake, including data ingestion from various sources. Experience with Snowpipe, Azure Data Factory, or any other ETL tool. Strong database programming/backend development experience, writing complex queries, stored procedures, views, triggers, cursors, and UDFs. Develops, maintains, and optimizes all data layer components for new and existing systems, including databases, stored procedures, ETL packages, and SQL queries. Experience with Azure data platform offerings. Interacts with application developers from various core systems to address changes in those systems and their impact on existing SQL databases. Defines data solutions for business problems by working directly with project managers, seniors and clients. Ability to effectively communicate with other senior leaders of IT on program strategies and plans and negotiate quality solutions.
Secondary Skillset (Good To Have)
Azure SQL, Azure Data Factory. Databricks. Python or PySpark. Any NoSQL database.
Qualification & Experience Required
Strong verbal and written communications skills. Ability to work as an individual contributor. 2+ years of combined experience using Azure SQL or any other databases building database applications. Experience with Snowpipe, Azure Data Factory, SSIS, or any other ETL tools. Exposure to a visual analytics tool (Power BI) would be good to have but is not mandatory.
EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
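To illustrate the Snowflake work this posting centers on, here is a hedged sketch using the official Python connector; the account, credentials, stage, and table names are placeholders, and the COPY step shown is what Snowpipe would automate in production.

```python
# Hedged sketch: stage-to-table load into Snowflake via the Python connector.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",   # placeholder account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="TAX_DB",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Bulk load from an external stage; Snowpipe automates this COPY
    # for continuous ingestion.
    cur.execute("COPY INTO staging_returns FROM @tax_stage/returns/")
    cur.execute("SELECT COUNT(*) FROM staging_returns")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```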
Posted 4 hours ago
7.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Profile: MDM Developer
Key Responsibilities: Design, implement, and maintain MDM solutions (Informatica MDM SaaS, IDMC, IICS). Support business and data stewards in daily tool usage and enforce MDM rules/standards. Integrate MDM with SAP, non-SAP, and legacy systems. Design and manage data flows, APIs, and system integrations. Monitor, troubleshoot, and improve MDM performance. Lead data quality initiatives and correction efforts. Develop technical documentation, architecture diagrams, and best practices. Mentor junior team members and promote knowledge sharing.
Required Skills & Experience: 7+ years of experience with Informatica MDM, including: 3+ years in MDM SaaS / IDMC / IICS; 3+ years with Business 360, Reference 360, Data Quality, Multidomain MDM, Data Integration; 3+ years with CAI, CDQ, CDI; 3+ years designing UI/UX for Multidomain MDM; 3+ years working with APIs (hosting & consuming). Strong Java & SQL skills for custom scripts and queries. Good Linux skills for on-prem configuration management. Hands-on integration experience with SAP, Oracle, Azure, and other platforms. Understanding of Informatica DaaS offerings. Please share your resume at monika.soni@celebaltech.com.
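For readers new to the domain, here is a generic sketch (deliberately not Informatica-specific) of the survivorship step at the heart of MDM: collapsing duplicate records into a golden record; the data and the most-recent-wins rule are invented for the example.

```python
# Generic MDM survivorship sketch: most-recent-wins golden record per key.
import pandas as pd

records = pd.DataFrame(
    {
        "customer_key": ["C1", "C1", "C2"],
        "name": ["ACME GmbH", "Acme GmbH", "Globex"],
        "updated_at": pd.to_datetime(["2024-01-02", "2024-03-15", "2024-02-01"]),
        "source": ["SAP", "CRM", "SAP"],
    }
)

# Sort by recency, then keep the last (newest) record per customer key.
golden = (
    records.sort_values("updated_at")
    .groupby("customer_key", as_index=False)
    .last()
)
print(golden)
```

Real MDM tools apply far richer rules (source trust scores, per-attribute survivorship, fuzzy matching), but the collapse-to-golden-record step has this shape.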
Posted 4 hours ago
2.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Data Engineering – Snowflake – Staff. EY’s GDS Tax Technology team is hiring a Data Engineer – Staff for this location as well; the role description, key skillset, and qualifications are identical to the EY Snowflake – Staff listing above.
Posted 4 hours ago
3.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Sr. Executive / Assistant Manager – Data Engineer
Godrej Agrovet, Mumbai, Maharashtra, India

Job Title: Sr. Executive / Assistant Manager – Data Engineer
Job Type: Permanent, Full-time
Function: Digital
Business: Godrej Agrovet
Location: Mumbai, Maharashtra, India

About Godrej Agrovet:
Godrej Agrovet Limited (GAVL) is a diversified, Research & Development focused agri-business company dedicated to improving the productivity of Indian farmers by innovating products and services that sustainably increase crop and livestock yields. GAVL holds leading market positions in the different businesses it operates: Animal Feed, Crop Protection, Oil Palm, Dairy, Poultry and Processed Foods. GAVL has a pan-India presence, with annual sales of over a million tons of high-quality animal feed and cutting-edge nutrition products for cattle, poultry, aqua feed and specialty feed. Our teams have worked closely with Indian farmers to develop large oil palm plantations, helping to bridge the demand-supply gap of edible oil in India. In the crop protection segment, the company meets the niche requirements of farmers through innovative agrochemical offerings. Through its subsidiary Astec Life Sciences Limited, GAVL is also a business-to-business (B2B) focused bulk manufacturer of fungicides and herbicides. In Dairy and Poultry and Processed Foods, the company operates through its subsidiaries Creamline Dairy Products Limited and Godrej Tyson Foods Limited. GAVL also has a joint venture with the ACI group of Bangladesh for the animal feed business in Bangladesh. For more information on the company, please log on to www.godrejagrovet.com

Roles & Responsibilities:
Data Pipeline Development: Design, develop, and optimize data pipelines to ingest, process, and transform data from various sources (e.g., APIs, databases, files) into the data warehouse (a hedged pipeline sketch follows this listing).
Data Integration: Integrate data from various structured and unstructured sources into the Databricks Lakehouse environment, ensuring data accuracy and reliability.
Data Lakehouse & Storage Management: Design and maintain data warehouse solutions using medallion architecture practices, optimizing storage, cloud utilization, costs and query performance.
Collaboration with Data Teams: Work closely with data scientists and analysts to understand requirements, translate them into technical solutions, and implement data solutions.
Data Quality and Monitoring: Cleanse, transform, and enrich data. Implement data quality checks and establish monitoring processes to ensure data integrity and accuracy. Implement monitoring for data pipelines and troubleshoot any issues or failures promptly to ensure data reliability.
Optimization and Performance Tuning: Optimize data processing workflows for performance, reliability, and scalability, including tuning Spark jobs, caching, and partitioning data appropriately.
Data Security and Privacy: Manage and organize data lakes using Unity Catalog, ensuring proper governance, security, role-based access and compliance with data management policies.

Key Skills:
Technical Skills: Proficiency with the Databricks Lakehouse platform, Delta Lake, Genie and MLflow; certification (e.g., Databricks Certified Data Engineer Associate) is a plus.
SQL and NoSQL: Experience working with both SQL and NoSQL data sources (e.g., MySQL, PostgreSQL, MongoDB).
Strong knowledge of Spark, especially PySpark or Scala, for data transformation.
Proficiency in Python, R and other programming languages used in data processing.
Experience with cloud platforms such as Azure and AWS, particularly Azure storage and services.
Knowledge of ML pipelines and data streaming platforms (e.g., Apache Kafka, AWS Kinesis).
Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker).

Educational Qualification:
Bachelor’s degree in Computer Science, Engineering/MCA, or a related field (Master’s preferred).

Experience:
3+ years of experience as a Data Engineer, with hands-on experience in Databricks. We are seeking a skilled Data Engineer with expertise in Databricks on Azure (ADF, ADLS) to join our data team. You will work with both structured and unstructured data and will be responsible for designing, building, and maintaining scalable and reliable data pipelines that support business intelligence, data analytics, and machine learning efforts. You will collaborate closely with data scientists, analysts, and cross-functional teams to ensure data is available, accurate, and optimized for processing and storage.

What’s in it for you?
Be an equal parent:
Maternity support, including paid leave ahead of statutory guidelines, and flexible work options on return
Paternity support, including paid leave
New mothers can bring a caregiver and children under a year old on work travel
Adoption support: gender-neutral and based on the primary caregiver, with paid leave options

No place for discrimination at Godrej:
Gender-neutral anti-harassment policy
Same-sex partner benefits at par with married spouses
Gender transition support

We are selfish about your wellness:
Comprehensive health insurance plans, as well as accident coverage for you and your family, with top-up options
Uncapped sick leave
Mental wellness and self-care programmes, resources and counselling

Celebrating wins, the Godrej Way:
Structured recognition platforms for individual, team and business-level achievements
Performance-based earning opportunities
https://www.godrejcareers.com/benefits/

An inclusive Godrej:
Before you go, there is something important we want to highlight. There is no place for discrimination at Godrej. Diversity is the philosophy of who we are as a company, and has been for over a century. It’s not just in our DNA or a nice-to-do. Being more diverse, especially having our team members reflect the diversity of our businesses and communities, helps us innovate better and grow faster. We hope this resonates with you. We take pride in being an equal opportunities employer. We recognise merit and encourage diversity. We do not tolerate any form of discrimination on the basis of nationality, race, colour, religion, caste, gender identity or expression, sexual orientation, disability, age, or marital status, and we ensure equal opportunities for all our team members.

If this sounds like a role for you, apply now! We look forward to meeting you.
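As a hedged illustration of the pipeline and medallion-architecture duties described above, the sketch below moves data from a bronze (raw) to a silver (cleansed) Delta table with PySpark, as it might run on a Databricks cluster. The paths, table names, and columns are hypothetical placeholders, not part of this listing.

from pyspark.sql import SparkSession, functions as F

# On Databricks the session is provided by the runtime; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Bronze: ingest raw JSON as-is, preserving source fidelity (hypothetical path).
raw = spark.read.json("/mnt/landing/sales/")
raw.write.format("delta").mode("append").saveAsTable("bronze.sales_raw")

# Silver: deduplicate, enforce types, and apply a basic data-quality filter
# before exposing the table to analysts and downstream ML workloads.
silver = (
    spark.table("bronze.sales_raw")
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.sales")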
Posted 4 hours ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Data Engineering – Snowflake – Staff. EY’s GDS Tax Technology team is hiring a Data Engineer – Staff for this location as well; the role description, key skillset, and qualifications are identical to the EY Snowflake – Staff listing above.
Posted 4 hours ago
5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Mandatory Skills
5+ years of proven experience as a Cloud Architect, with a specific focus on IAM in cloud environments.
In-depth knowledge of IAM principles, including authentication, authorization, SSO, and privilege management.
Proficiency in Azure Active Directory (AAD), Privileged Identity Management (PIM), and Just-in-Time (JiT) access (a hedged JiT activation sketch follows this listing).
Strong experience in Azure Infrastructure as Code (IaC) using Bicep, Terraform, or similar technologies.

Optional Skills:
Familiarity with DevOps practices and CI/CD pipelines in cloud environments.
YAML process flows (Azure DevOps pipelines, GitHub Actions, or similar).
Experience with best practices for Agile development in the SAFe framework or similar.
Experience with GxP validation processes.
Experience with Pharma and an understanding of the Clinical Data Domain.
Other related experience within software or cloud development.

Roles & Responsibilities:
Design, develop, and implement Azure AD IAM solutions to ensure secure and efficient access management across cloud environments.
Collaborate with cross-functional teams to integrate IAM best practices into cloud architecture designs and deployments.
Assess and enhance existing IAM frameworks, policies, and procedures to align with industry standards and compliance requirements.
Develop Azure Infrastructure as Code (Bicep / Terraform).
Develop and maintain AAD PIM Just-in-Time IAM.
Troubleshoot and resolve any Azure AD IAM-related issues.
Monitor and optimize Azure AD IAM performance.
Provide guidance and best practices to other team members.
Stay updated with emerging IAM technologies and industry trends, implementing innovative solutions where applicable.

About 7N:
Over decades, 7N has been part of several waves of digitalization. Today, our consultants work across industries and geographical borders to deliver the projects that define the new digital realities. We offer a highly specialized portfolio of IT services and solutions delivered by the top 3% of IT professionals. Our expertise spans many industries, providing digital transformation across all phases of the IT project life cycle. By engaging early with 7N, you benefit from our experience and expertise when defining the project scope and strategic needs, and you gain the flexibility to accommodate changing demands while maintaining control and ownership of IT development.

What is in it for you?
An excellent opportunity to work on the latest technologies and be among the top 3% of technical consultants in your domain.
Excellent health benefits.
Best-in-industry salary structure, without any hidden deductions.
An opportunity to experience a work culture that provides flexibility, sensitivity, growth and respect.
An opportunity to get associated with a value-driven organization.
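As a hedged illustration of AAD PIM Just-in-Time access, the sketch below self-activates an eligible directory role through the Microsoft Graph PIM endpoint, using the azure-identity package for token acquisition. The principal and role IDs are hypothetical placeholders, and the request shape should be verified against the current Microsoft Graph reference before use.

import requests
from azure.identity import DefaultAzureCredential

# Acquire a Microsoft Graph token via whatever credential the environment provides.
token = DefaultAzureCredential().get_token("https://graph.microsoft.com/.default")

payload = {
    "action": "selfActivate",
    "principalId": "00000000-0000-0000-0000-000000000000",      # hypothetical user object id
    "roleDefinitionId": "11111111-1111-1111-1111-111111111111",  # hypothetical role id
    "directoryScopeId": "/",                                     # tenant-wide scope
    "justification": "Time-boxed access for change CHG-1234",    # hypothetical ticket
    "scheduleInfo": {
        "startDateTime": "2024-01-01T09:00:00Z",
        "expiration": {"type": "afterDuration", "duration": "PT2H"},  # 2-hour JiT window
    },
}

resp = requests.post(
    "https://graph.microsoft.com/v1.0/roleManagement/directory/roleAssignmentScheduleRequests",
    headers={"Authorization": f"Bearer {token.token}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json().get("status"))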
Posted 4 hours ago