Jobs
Interviews

10729 Apache Jobs - Page 17

Setup a job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

4.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

About Media.net : Media.net is a leading, global ad tech company that focuses on creating the most transparent and efficient path for advertiser budgets to become publisher revenue. Our proprietary contextual technology is at the forefront of enhancing Programmatic buying, the latest industry standard in ad buying for digital platforms. The Media.net platform powers major global publishers and ad-tech businesses at scale across ad formats like display, video, mobile, native, as well as search. Media.net’s U.S. HQ is based in New York, and the Global HQ is in Dubai. With office locations and consultant partners across the world, Media.net takes pride in the value-add it offers to its 50+ demand and 21K+ publisher partners, in terms of both products and services. Responsibilities (What You’ll Do) Infrastructure Management: Oversee and maintain the infrastructure that supports the ad exchange applications. This includes load balancers, data stores, CI/CD pipelines, and monitoring stacks. Continuously improve infrastructure resilience, scalability, and efficiency to meet the demands of massive request volume and stringent latency requirements. Developing policies and procedures that improve overall platform stability and participate in shared On-call schedule Collaboration with Developers: Work closely with developers to establish and uphold quality and performance benchmarks, ensuring that applications meet necessary criteria before they are deployed to production. Participate in design reviews and provide feedback on infrastructure-related aspects to improve system performance and reliability. Building Tools for Infra Management: Develop tools to simplify and enhance infrastructure management, automate processes, and improve operational efficiency. These tools may address areas such as monitoring, alerting, deployment automation, and failure detection and recovery, which are critical in minimizing latency and maintaining uptime. Performance Optimization: Focus on reducing latency and maximizing efficiency across all components, from request handling in load balancers to database optimization. Implement best practices and tools for performance monitoring, including real-time analysis and response mechanisms. Who Should Apply B.Tech/M.Tech or equivalent in Computer Science, Information Technology, or a related field. 2–4 years of experience managing services in large-scale distributed systems. Strong understanding of networking concepts (e.g., TCP/IP, routing, SDN) and modern software architectures. Proficiency in programming and scripting languages such as Python, Go, or Ruby, with a focus on automation. Experience with container orchestration tools like Kubernetes and virtualization platforms (preferably GCP). Ability to independently own problem statements, manage priorities, and drive solutions. Preferred Skills & Tools Expertise: Infrastructure as Code: Experience with Terraform. Configuration management tools like Nix, Ansible. Monitoring and Logging Tools: Expertise with Prometheus, Grafana, or ELK stack. OLAP databases : Clickhouse and Apache druid. CI/CD Pipelines: Hands-on experience with Jenkins, or ArgoCD. Databases: Proficiency in MySQL (relational) or Redis (NoSQL). Load Balancers Servers: Familiarity with haproxy or Nginx. Strong knowledge of operating systems and networking fundamentals. Experience with version control systems such as Git.

Posted 4 days ago

Apply

0 years

0 Lacs

India

On-site

Job Description: We are seeking a highly skilled 4+ Azure Data Engineer to design, develop, and optimize data pipelines and data integration solutions in a cloud-based environment. The ideal candidate will have strong technical expertise in Azure, Data Engineering tools, and advanced ETL design along with excellent communication and problem-solving skills. Key Responsibilities: Design and develop advanced ETL pipelines for data ingestion and egress for batch data. Build scalable data solutions using Azure Data Factory (ADF) , Databricks , Spark (PySpark & Scala Spark) , and other Azure services. Troubleshoot data jobs, identify issues, and implement effective root cause solutions. Collaborate with stakeholders to gather requirements and propose efficient solution designs. Ensure data quality, reliability, and adherence to best practices in data engineering. Maintain detailed documentation of problem definitions, solutions, and architecture. Work independently with minimal supervision while ensuring project deadlines are met. Required Skills & Qualifications: Microsoft Certified: Azure Fundamentals (preferred). Microsoft Certified: Azure Data Engineer Associate (preferred). Proficiency in SQL , Python , and Scala . Strong knowledge of Azure Cloud services , ADF , and Databricks . Hands-on experience with Apache Spark (PySpark & Scala Spark). Expertise in designing and implementing complex ETL pipelines for batch data. Strong troubleshooting skills with the ability to perform root cause analysis. Soft Skills: Excellent verbal and written communication skills. Strong documentation skills for drafting problem definitions and solutions. Ability to effectively gather requirements and propose solution designs. Self-driven with the ability to work independently with minimal supervision.

Posted 4 days ago

Apply

1.0 - 2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

SLSQ326R415 Databricks is at the forefront of the Unified Data Analytics field, where innovation is key to providing our clients with a competitive edge in today's fast-paced business landscape. We are looking for a Business Development Representative to help drive revenue growth within the India Market. If you're a results-oriented sales professional with a track record in similar roles, aiming to contribute to the expansion of a transformative enterprise software company and propel your career, this role is for you. Reporting to the Manager of the India Sales Development team, you'll play a pivotal role in this journey. The Impact You Will Have Cultivate expertise in value-based selling, big data, and AI. Evaluate and prioritize the inbound leads from Marketing initiatives. Craft outbound strategies encompassing personalized emails, cold calls, and social selling to qualify opportunities. Devise compelling outreach campaigns targeting diverse buyer levels, including senior executives, to unlock opportunities in critical target accounts. Identify and uncover client requirements, progressing discussions into sales prospects by demonstrating how Databricks can address their data-related challenges. What We Look For Preferably a minimum of 1-2 years of prior experience in inbound and outbound sales and inquiries. Proficiency in comprehending technical concepts, coupled with genuine enthusiasm for technology. Determination and courage to excel and contribute to the growth of the next top-tier enterprise software company. Demonstrated a history of consistent, quantifiable achievements in previous roles. Curiosity and eagerness to continually learn and stay abreast of developments in the big data/AI sector. A strong sense of ownership and accountability. About Databricks Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks. Our Commitment to Diversity and Inclusion At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics. Compliance If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.

Posted 4 days ago

Apply

4.0 years

0 Lacs

Gwalior, Madhya Pradesh, India

On-site

*Lead and manage a team of developers, providing guidance, code reviews, and mentorship. *Architect, design, develop, and maintain web applications using PHP and relevant frameworks. *Collaborate with project managers, designers, and QA to deliver high-quality products on time. *Manage server configurations, deployments, backups, and ensure application security and performance. *Set coding standards and best practices, ensuring code quality and re-usability. *Troubleshoot and debug existing applications and identify areas for improvement. *Handle version control using Git and manage CI/CD pipelines.Monitor server health and ensure 24/7 uptime for critical web applications. Required Skills & Qualifications *Bachelor’s degree in Computer Science or related field (or equivalent experience). *4+ years of experience in PHP development and at least 2 years in a team lead or senior role. *Proficient in PHP, MySQL, and modern frameworks like Laravel, CodeIgniter, Symfony, Opencart etc. *Strong experience with REST APIs, AJAX, and third-party integrations.*Good knowledge of front-end technologies like HTML5, CSS3, JavaScript, jQuery, and Bootstrap. *Experience in server management (Linux, Apache/Nginx, VPS, cPanel, SSL, firewalls, etc.). *Knowledge about Domain Book, Renew and DNS Activities.*Familiarity with cloud platforms like AWS, Digital Ocean, or similar is a plus. *Proficient in Git, version control systems, and deployment tools. *Strong problem-solving skills and ability to work independently or in a team environment.

Posted 4 days ago

Apply

8.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Role Description We are looking for a seasoned Senior Backend Engineer with 8+ years of experience in Java-based backend development. The ideal candidate will have deep expertise in Java 11+, Spring Boot (v2.7+), and strong database fundamentals using Oracle DB. You will be instrumental in building scalable and maintainable backend systems and RESTful APIs. Candidates with additional experience in messaging systems like RabbitMQ, Apache ActiveMQ, or Azure Service Bus, and knowledge of integration patterns are preferred. Key Responsibilities Design, develop, and maintain robust backend services using Java 11+, Spring Boot 2.7+, and Oracle DB. Work with JDBC, Hibernate, HQL, and SQL to manage database transactions and data persistence. Build and consume RESTful APIs using JSON, XML, and YAML formats. Collaborate with cross-functional teams including QA, DevOps, and Product to deliver high-quality software. Utilize tools such as Git, Bitbucket, SourceTree, and Jenkins for source control and CI/CD workflows. Write clean, testable code using modern development practices and tools like IntelliJ or Eclipse IDE. Participate in Agile ceremonies, contribute to sprint planning, and document knowledge in JIRA and Confluence. Leverage Maven and Gradle for build automation and Tomcat for application deployment. Perform unit, integration, and API testing using tools such as Postman. Required Skills & Qualifications 8+ years of backend development experience using Java (version 11 or above). Strong hands-on experience with Spring Boot (v2.7 or higher). Proficiency with Oracle Database fundamentals, JDBC, Hibernate, and SQL/HQL. Familiarity with JSON, XML, and YAML data formats. Solid understanding of REST API design and development. Experience with Git, Bitbucket, Git Bash, and SourceTree. Practical knowledge of CI/CD tools including Jenkins, Maven, and Gradle. Experience working with IDEs such as IntelliJ IDEA or Eclipse. Familiarity with application servers like Apache Tomcat. Strong problem-solving skills and the ability to work in a fast-paced Agile environment. Salary: 90K-100K per month

Posted 4 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Hi Connections, Urgent - Hiring for below role About the Role: We are seeking a seasoned and highly skilled MLOps Engineer to join our growing team. The ideal candidate will have extensive hands-on experience with deploying, monitoring, and retraining machine learning models in production environments. You will be responsible for building and maintaining robust and scalable MLOps pipelines using tools like MLflow, Apache Airflow, Kubernetes, and Databricks or Azure ML. A strong understanding of infrastructure-as-code using Terraform is essential. You will play a key role in operationalizing AI/ML systems and ensuring high performance, availability, and automation across the ML lifecycle. --- Key Responsibilities: · Design and implement scalable MLOps pipelines for model training, validation, deployment, and monitoring. · Operationalize machine learning models using MLflow, Airflow, and containerized deployments via Kubernetes. · Automate and manage ML workflows across cloud platforms such as Azure ML or Databricks. · Develop infrastructure using Terraform for consistent and repeatable deployments. · Trace API calls to LLMs, Azure OCR and Paradigm · Implement performance monitoring, alerting, and logging for deployed models using custom and third-party tools. · Automate model retraining and continuous deployment pipelines based on data drift and model performance metrics. · Ensure traceability, reproducibility, and auditability of ML experiments and deployments. · Collaborate with Data Scientists, ML Engineers, and DevOps teams to streamline ML workflows. · Apply CI/CD practices and version control to the entire ML lifecycle. · Ensure secure, reliable, and compliant deployment of models in production environments. --- Required Qualifications: · 5+ years of experience in MLOps, DevOps, or ML engineering roles, with a focus on production ML systems. · Proven experience deploying machine learning models using MLflow and workflow orchestration with Apache Airflow. · Hands-on experience with Kubernetes for container orchestration in ML deployments. · Proficiency with Databricks and/or Azure ML, including model training and deployment capabilities. · Solid understanding and practical experience with Terraform for infrastructure-as-code. · Experience automating model monitoring and retraining processes based on data and model drift. · Knowledge of CI/CD tools and principles applied to ML systems. · Familiarity with monitoring tools and observability stacks (e.g., Prometheus, Grafana, Azure Monitor). · Strong scripting skills in Python · Deep understanding of ML lifecycle challenges including model versioning, rollback, and scaling. · Excellent communication skills and ability to collaborate across technical and non-technical teams. --- Nice to Have: · Experience with Azure DevOps or GitHub Actions for ML CI/CD. · Exposure to model performance optimization and A/B testing in production environments. · Familiarity with feature stores and online inference frameworks. · Knowledge of data governance and ML compliance frameworks. · Experience with ML libraries like scikit-learn, PyTorch, or TensorFlow. --- Education: · Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.

Posted 4 days ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary: A career within PWC Responsibilities Job Title: Cloud Engineer (Java 17+, Spring Boot, Microservices, AWS) Job Type: Full-Time Job Overview: As a Cloud Engineer, you will be responsible for developing, deploying, and managing cloud-based applications and services on AWS. You will use your expertise in Java 17+, Spring Boot, and Microservices to build robust and scalable cloud solutions. This role will involve working closely with development teams to ensure seamless cloud integration, optimizing cloud resources, and leveraging AWS tools to ensure high availability, security, and performance. Key Responsibilities: Cloud Infrastructure: Design, build, and deploy cloud-native applications on AWS, utilizing services such as EC2, S3, Lambda, RDS, EKS, API Gateway, and CloudFormation. Backend Development: Develop and maintain backend services and microservices using Java 17+ and Spring Boot, ensuring they are optimized for the cloud environment. Microservices Architecture: Architect and implement microservices-based solutions that are scalable, secure, and resilient, ensuring they align with AWS best practices. CI/CD Pipelines: Set up and manage automated CI/CD pipelines using tools like Jenkins, GitLab CI, or AWS CodePipeline for continuous integration and deployment. AWS Services Integration: Integrate AWS services such as DynamoDB, SQS, SNS, CloudWatch, and Elastic Load Balancing into microservices to improve performance and scalability. Performance Optimization: Monitor and optimize the performance of cloud infrastructure and services, ensuring efficient resource utilization and cost management in AWS. Security: Implement security best practices in cloud applications and services, including IAM roles, VPC configuration, encryption, and authentication mechanisms. Troubleshooting & Support: Provide ongoing support and troubleshooting for cloud-based applications, ensuring uptime, availability, and optimal performance. Collaboration: Work closely with cross-functional teams, including frontend developers, system administrators, and DevOps engineers, to ensure end-to-end solution delivery. Documentation: Document the architecture, implementation, and operations of cloud infrastructure and applications to ensure knowledge sharing and compliance. Required Skills & Qualifications: Strong experience with Java 17+ (latest version) and Spring Boot for backend development. Hands-on experience with AWS Cloud services such as EC2, S3, Lambda, RDS, EKS, API Gateway, DynamoDB, SQS, SNS, and CloudWatch. Proven experience in designing and implementing microservices architectures. Solid understanding of cloud security practices, including IAM, VPC, encryption, and secure cloud-native application development. Experience with CI/CD tools and practices (e.g., Jenkins, GitLab CI, AWS CodePipeline). Familiarity with containerization technologies like Docker, and orchestration tools like Kubernetes. Ability to optimize cloud applications for performance, scalability, and cost-efficiency. Experience with monitoring and logging tools like CloudWatch, ELK Stack, or other AWS-native tools. Knowledge of RESTful APIs and API Gateway for exposing microservices. Solid understanding of version control systems like Git and familiarity with Agile methodologies. Strong problem-solving and troubleshooting skills, with the ability to work in a fast-paced environment. Preferred Skills: AWS certifications, such as AWS Certified Solutions Architect or AWS Certified Developer. Experience with Terraform or AWS CloudFormation for infrastructure as code. Familiarity with Kubernetes and EKS for container orchestration in the cloud. Experience with serverless architectures using AWS Lambda. Knowledge of message queues (e.g., SQS, Kafka) and event-driven architectures. Education & Experience: Bachelor’s degree in Computer Science, Engineering, or related field, or equivalent practical experience. 7-11 years of experience in software development with a focus on AWS cloud and microservices. Mandatory Skill Sets Cloud Engineer (Java+Springboot+ AWS) Preferred Skill Sets Cloud Engineer (Java+Springboot+ AWS) Years Of Experience Required 7-11 years Education Qualification BE/BTECH, ME/MTECH, MBA, MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering, Master of Engineering, Master of Business Administration Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Cloud Engineering Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 33 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 4 days ago

Apply

8.0 years

0 Lacs

Delhi, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Operations Management Level Associate Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As a business application consulting generalist at PwC, you will provide consulting services for a wide range of business applications. You will leverage a broad understanding of various software solutions to assist clients in optimising operational efficiency through analysis, implementation, training, and support. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Job Description & Summary: We are looking for a seasoned JAVA Backend Developer Responsibilities Must Have: - Bachelor’s degree or higher in computer science or related field. - Must have 8+ years of industry experience in related technologies - Strong Computer Science foundation (data structures, algorithms, databases, distributed systems). - Expertise in Java software development is a must have. Minimum Java 8 & Java 11 is preferred. - Strong in spring boot - Ability to develop REST APIs. - General understanding of SQL is needed - General understanding of MongoDB is needed - Experience with AWS - Understanding of container technologies (e.g., Docker, Kubernetes, Cloud Foundry, or Hashicorp Nomad/Consul/Vault). - Practice of modern software engineering including agile methodologies, coding standards, code reviews, source control management, build processes, test automation, and CI/CD pipelines. - Knowledge of moving code from Dev/ Test to Staging and Production. Troubleshoot issues along the CI/CD pipeline. - Working knowledge in Solid project & client - Must have excellent client communication skills Mandatory Skill Sets Should have: 2. - Should have experience in Kafka 3. - Should have experience in Elastic Search 4. - Expertise with one or more programming languages (e.g., Golang, Python or the like), 5. understanding of the concepts, as well as the willingness to share and grow this 6. knowledge is welcomed. 7. - Should have understanding in framework design and modeling, understand the impact 8. of object model design in a large-scale multitenant OnDemand environment. 9. - Proficiency in working with Linux or macOS environments. 10. - Candidate should know basics of react, need not have project experience 11. - Should be able to do minimal bug fixes in the UIExperience in custom plugin creation and maintenance in private npm proxy server. 12. Good to have knowledge of RESTful APIs and Graph QL 13. Good to have knowledge for Api development with Node JS or Spring Boot framework and any relational database management system. 14. Good to have knowledge of Native Mobile Platform (Android/iOS). Preferred Skill Sets JAVA Backend Years Of Experience Required 4+ Education Qualification BE/B.Tech/MBA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering, Bachelor of Technology Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Apache Kafka, ElasticSearch, Python (Programming Language) Optional Skills Java Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 4 days ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description & Summary: We are looking for a skilled Azure Cloud Data Engineer with strong expertise in Python programming , Databricks , and advanced SQL to join our team in Noida . The candidate will be responsible for designing, developing, and optimizing scalable data solutions on the Azure cloud platform. You will play a critical role in building data pipelines and transforming complex data into actionable insights by leveraging cloud-native tools and technologies. Level: Senior Consultant / Manager Location: Noida LOS: Competency: Data & Analytics Skill: Azure Data Engineering Job Position Title: Azure Cloud Data Engineer with Python Programming – Senior Consultant/Manager (6+ Years) Responsibilities: · Design, develop, and manage scalable and secure data pipelines using Azure Databricks and Azure Data Factory. · Write clean, efficient, and reusable code primarily in Python for cloud automation, data processing, and orchestration. · Architect and implement cloud-based data solutions, integrating structured and unstructured data sources. · Build and optimize ETL workflows and ensure seamless data integration across platforms. · Develop data models using normalization and denormalization techniques to support OLTP and OLAP systems. · Manage Azure-based storage solutions including Azure Data Lake and Blob Storage. · Troubleshoot performance bottlenecks in data flows and ETL processes. · Integrate advanced analytics and support BI use cases within the Azure ecosystem. · Lead code reviews and ensure adherence to version control practices (e.g., Git). · Contribute to the design and deployment of enterprise-level data warehousing solutions. · Stay current with Azure cloud technologies and Python ecosystem updates to adopt best practices and emerging tools. Mandatory skill sets: · Strong Python programming skills (Must-Have) – advanced scripting, automation, and cloud SDK experience · Strong SQL skills (Must-Have) · Azure Databricks (Must-Have) · Azure Data Factory · Azure Blob Storage / Azure Data Lake Storage · Apache Spark (hands-on experience) · Data modeling (Normalization & Denormalization) · Data warehousing and BI tools integration · Git (Version Control) · Building scalable ETL pipelines Preferred skill sets (Good to Have): · Understanding of OLTP and OLAP environments · Experience with Kafka and Hadoop · Azure Synapse Analytics · Azure DevOps for CI/CD integration · Agile delivery methodologies Years of experience required: · 6+ years of overall experience in cloud engineering or data engineering roles, with at least 2-3 years of hands-on experience with Azure cloud services. · Proven track record of strong Python development with at least 2-3 years of hands-on experience. Education qualification: BE/B.Tech/MBA/MCA

Posted 4 days ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote

Solution Architect (India) Work Mode: Remote/ Hybrid Required exp: 10+ years Shift timing: Minimum 4 hours overlap required with US time Role Summary: The Solution Architect is responsible for designing robust, scalable, and high- performance AI and data-driven systems that align with enterprise goals. This role serves as a critical technical leader—bridging AI/ML, data engineering, ETL, cloud architecture, and application development. The ideal candidate will have deep experience across traditional and generative AI, including Retrieval- Augmented Generation (RAG) and agentic AI systems, along with strong fundamentals in data science, modern cloud platforms, and full-stack integration. Key Responsibilities:  Design and own the end-to-end architecture of intelligent systems including data ingestion (ETL/ELT), transformation, storage, modeling, inferencing, and reporting.  Architect GenAI-powered applications using LLMs, vector databases, and RAG pipelines; Agentic Workflow, integrate with enterprise knowledge graphs and document repositories.  Lead the design and deployment of agentic AI systems that can plan, reason, and interact autonomously within business workflows.  Collaborate with cross-functional teams including data scientists, data engineers, MLOps, and frontend/backend developers to deliver scalable and maintainable solutions.  Define patterns and best practices for traditional ML and GenAI projects, covering model governance, explainability, reusability, and lifecycle management.  Ensure seamless integration of ML/AI systems via RESTful APIs with frontend interfaces (e.g., dashboards, portals) and backend systems (e.g., CRMs, ERPs).  Architect multi-cloud or hybrid cloud AI solutions, leveraging services from AWS, Azure, or GCP for scalable compute, storage, orchestration, and deployment.  Provide technical oversight for data pipelines (batch and real-time), data lakes, and ETL frameworks ensuring secure and governed data movement.  Conduct architecture reviews, mentor engineering teams, and drive design standards for AI/ML, data engineering, and software integration. Qualifications :  Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.  10+ years of experience in software architecture, including at least 4 years in AI/ML-focused roles. Required Skills:  Expertise in machine learning (regression, classification, clustering), deep learning (CNNs, RNNs, transformers), and NLP.  Experience with Generative AI frameworks and services (e.g., OpenAI, LangChain, Azure OpenAI, Amazon Bedrock).  Strong hands-on Python skills, with experience in libraries such as Scikit-learn, Pandas, NumPy, TensorFlow, or PyTorch.  Proficiency in RESTful API development and integration with frontend components (React, Angular, or similar is a plus).  Deep experience in ETL/ELT processes using tools like Apache Airflow, Azure Data Factory, or AWS Glue.  Strong knowledge of cloud-native architecture and AI/ML services on either one of the cloud AWS, Azure, or GCP.  Experience with vector databases (e.g., Pinecone, FAISS, Weaviate) and semantic search patterns. Experience in deploying and managing ML models with MLOps frameworks (MLflow, Kubeflow).  Understanding of microservices architecture, API gateways, and container orchestration (Docker, Kubernetes).  Having forntend exp is good to have.

Posted 4 days ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Responsibilities Job Description: Analyses current business practices, processes, and procedures as well as identifying future business opportunities for leveraging Microsoft Azure Data & Analytics Services. Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics. Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications. Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL. Maintain best practice standards for the development or cloud-based data warehouse solutioning including naming standards. Designing and implementing highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks Integrating the end-to-end data pipeline to take data from source systems to target data repositories ensuring the quality and consistency of data is always maintained Working with other members of the project team to support delivery of additional project components (API interfaces) Evaluating the performance and applicability of multiple tools against customer requirements Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints. Integrate Databricks with other technologies (Ingestion tools, Visualization tools). Proven experience working as a data engineer Highly proficient in using the spark framework (python and/or Scala) Extensive knowledge of Data Warehousing concepts, strategies, methodologies. Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks). Hands on experience designing and delivering solutions using Azure including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, Azure Stream Analytics Experience in designing and hands-on development in cloud-based analytics solutions. Expert level understanding on Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required. Designing and building of data pipelines using API ingestion and Streaming ingestion methods. Knowledge of Dev-Ops processes (including CI/CD) and Infrastructure as code is essential. Thorough understanding of Azure Cloud Infrastructure offerings. Strong experience in common data warehouse modeling principles including Kimball. Working knowledge of Python is desirable Experience developing security models. Databricks & Azure Big Data Architecture Certification would be plus Mandatory Skill Sets ADE, ADB, ADF Preferred Skill Sets ADE, ADB, ADF Years Of Experience Required 4-8 Years Education Qualification BE, B.Tech, MCA, M.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Microsoft Azure Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 4 days ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As a business application consulting generalist at PwC, you will provide consulting services for a wide range of business applications. You will leverage a broad understanding of various software solutions to assist clients in optimising operational efficiency through analysis, implementation, training, and support. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary: A career within…. Responsibilities Design, develop, and maintain scalable Java applications using Spring Boot and related technologies. - Integrate various analytics services (e.g., Google Analytics, Power BI, Tableau, etc.) into platforms and applications. - Collaborate with cross-functional teams to gather requirements and deliver technical solutions that align with business goals. - Build and enhance products and platforms that support analytics capabilities, ensuring high performance and scalability. - Write efficient, clean, and well-documented code that adheres to best practices. - Develop and integrate RESTful APIs and microservices to support real-time data processing and analytics. - Ensure continuous improvement by actively participating in code reviews and following best practices in development. - Troubleshoot, debug, and resolve application issues and bugs. - Collaborate with DevOps teams to ensure proper deployment and performance of analytics platforms in production environments. - Stay updated with the latest industry trends and advancements in Java, Spring Boot, and analytics tools. ### **Required Qualifications: ** - Experience in Java development, with a strong emphasis on Spring Boot. - Proven experience integrating analytics services (e.g., Google Analytics, Power BI, Tableau) into applications and platforms. - Hands-on experience in building and optimizing products or platforms for analytics and data processing. - Strong understanding of microservices architecture, RESTful APIs, and cloud-based deployment (e.g., AWS,Azure). Proficiency with relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases. - Solid understanding of object-oriented programming, design patterns, and software architecture principles. - Experience with version control tools like Git. - Excellent problem-solving and debugging skills. - Strong communication skills, with the ability to work in a collaborative, fast-paced environment. ### **Preferred Qualifications:** - Experience with front-end technologies like JavaScript, React, or Angular is a plus. - Knowledge of DevOps practices, CI/CD pipelines, and containerization tools (e.g., Docker, Kubernetes). - Familiarity with big data tools and technologies such as Apache Kafka, Hadoop, or Spark. - Experience working in an agile Mandatory skill sets: Java, Spring boot, Kotlin Preferred skill sets: Java, Spring boot, Kotlin Years of experience required: 4-8 Years Education Qualification BE, B.Tech, MCA, M.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Go Programming Language Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment, Performance Management Software {+ 16 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 4 days ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities 3+ years of experience in implementing analytical solutions using Palantir Foundry. preferably in PySpark and hyperscaler platforms (cloud services like AWS, GCP and Azure) with focus on building data transformation pipelines at scale. Team management: Must have experience in mentoring and managing large teams (20 to 30 people) for complex engineering programs. Candidate should have experience in hiring and nurturing talent in Palantir Foundry. Training: candidate should have experience in creating training programs in Foundry and delivering the same in a hands-on format either offline or virtually. At least 3 years of hands-on experience of building and managing Ontologies on Palantir Foundry. At least 3 years of experience with Foundry services: Data Engineering with Contour and Fusion Dashboarding, and report development using Quiver (or Reports) Application development using Workshop. Exposure to Map and Vertex is a plus Palantir AIP experience will be a plus Hands-on experience in data engineering and building data pipelines (Code/No Code) for ELT/ETL data migration, data refinement and data quality checks on Palantir Foundry. Hands-on experience of managing data life cycle on at least one hyperscaler platform (AWS, GCP, Azure) using managed services or containerized deployments for data pipelines is necessary. Hands-on experience in working & building on Ontology (esp. demonstrable experience in building Semantic relationships). Proficiency in SQL, Python and PySpark. Demonstrable ability to write & optimize SQL and spark jobs. Some experience in Apache Kafka and Airflow is a prerequisite as well. Hands-on experience on DevOps on hyperscaler platforms and Palantir Foundry is necessary. Experience in MLOps is a plus. Experience in developing and managing scalable architecture & working experience in managing large data sets. Opensource contributions (or own repositories highlighting work) on GitHub or Kaggle is a plus. Experience with Graph data and graph analysis libraries (like Spark GraphX, Python NetworkX etc.) is a plus. A Palantir Foundry Certification (Solution Architect, Data Engineer) is a plus. Certificate should be valid at the time of Interview. Experience in developing GenAI application is a plus Mandatory Skill Sets At least 3 years of hands-on experience of building and managing Ontologies on Palantir Foundry. At least 3 years of experience with Foundry services Preferred Skill Sets Palantir Foundry Years Of Experience Required Experience 4 to 7 years ( 3 + years relevant) Education Qualification Bachelor's degree in computer science, data science or any other Engineering discipline. Master’s degree is a plus. Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Science Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Palantir (Software) Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 4 days ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Microsoft Management Level Senior Associate Job Description & Summary At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Responsibilities Relevant experience as AEM Ops engineer involved in implementation and support of AEM multiple domain sites. Ability in finding the root cause of the issues reported in a complex environment. Installation / configuration / maintenance of AEM Infrastructure with load balanced, replicated and fail-over capabilities. AEM Ops experience for both Cloud (managed services) and on premise. Experience with AEM administration, including user permissions, synchronization, sling, auditing, reporting, and workflows. AEM DEV or DevOps certification is desirable. Exposure to Enterprise Search like Elastic, Apache Solr, Google., Experience with CdN like Akamai. Exposure to Monitoring & Response using tools like AppDynamics, Datadog, DynaTrace, SCOM and Splunk. Experienced in troubleshooting and working closely with Development teams. Dispatcher module configs. Understand and participate in change control and change management processes. Should be able to work independently or with minimum guidance. Mandatory Skill Sets AEM Developer/Operations Preferred Skill Sets Exposure to Monitoring & Response using tools like AppDynamics, Datadog, DynaTrace, SCOM and Splunk Years Of Experience Required 4-7 Years Education Qualification B.Tech/B.E. Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Adobe Experience Manager (AEM) Optional Skills Acceptance Test Driven Development (ATDD), Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity {+ 46 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date

Posted 4 days ago

Apply

10.0 years

0 Lacs

Chandigarh, India

On-site

Job Description: 7–10 years of industry experience, with at least 5 years in machine learning roles. Advanced proficiency in Python and common ML libraries: TensorFlow, PyTorch, Scikit-learn. Experience with distributed training, model optimization (quantization, pruning), and inference at scale. Hands-on experience with cloud ML platforms: AWS (SageMaker), GCP (Vertex AI), or Azure ML. Familiarity with MLOps tooling: MLflow, TFX, Airflow, or Kubeflow; and data engineering frameworks like Spark, dbt, or Apache Beam. Strong grasp of CI/CD for ML, model governance, and post-deployment monitoring (e.g., data drift, model decay). Excellent problem-solving, communication, and documentation skills.

Posted 4 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Responsibilities Job Description: Job Summary: We are seeking a talented Data Engineer with strong expertise in Databricks, specifically in Unity Catalog, PySpark, and SQL, to join our data team. You’ll play a key role in building secure, scalable data pipelines and implementing robust data governance strategies using Unity Catalog. Key Responsibilities: Design and implement ETL/ELT pipelines using Databricks and PySpark. Work with Unity Catalog to manage data governance, access controls, lineage, and auditing across data assets. Develop high-performance SQL queries and optimize Spark jobs. Collaborate with data scientists, analysts, and business stakeholders to understand data needs. Ensure data quality and compliance across all stages of the data lifecycle. Implement best practices for data security and lineage within the Databricks ecosystem. Participate in CI/CD, version control, and testing practices for data pipelines. Required Skills: Proven experience with Databricks and Unity Catalog (data permissions, lineage, audits). Strong hands-on skills with PySpark and Spark SQL. Solid experience writing and optimizing complex SQL queries. Familiarity with Delta Lake, data lakehouse architecture, and data partitioning. Experience with cloud platforms like Azure or AWS. Understanding of data governance, RBAC, and data security standards. Preferred Qualifications: Databricks Certified Data Engineer Associate or Professional. Experience with tools like Airflow, Git, Azure Data Factory, or dbt. Exposure to streaming data and real-time processing. Knowledge of DevOps practices for data engineering. Mandatory Skill Sets Databricks Preferred Skill Sets Databricks Years Of Experience Required 7-14 years Education Qualification BE/BTECH, ME/MTECH, MBA, MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Bachelor of Technology, Master of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Databricks Platform Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 33 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date August 11, 2025

Posted 4 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Teamwork makes the stream work. Roku is changing how the world watches TV Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines. About the Team: The Data Foundations team plays a critical role in supporting Roku Ads business intelligence and analytics . The team is responsible for developing and managing foundational datasets designed to serve the operational and analytical needs of the broader organization. The team's mission is carried out through three focus areas: acting as the interface between data producers and consumers, simplifying data architecture, and creating tools in a standardized way . About the Role: We are seeking a talented and experienced Senior Software Engineer with a strong background in big data technologies, including Apache Spark and Apache Airflow. This hybrid role bridges software and data engineering, requiring expertise in designing, building, and maintaining scalable systems for both application development and data processing. You will collaborate with cross-functional teams to design and manage robust, production-grade, large-scale data systems. The ideal candidate is a proactive self-starter with a deep understanding of high-scale data services and a commitment to excellence. What you’ll be doing Software Development: Write clean, maintainable, and efficient code, ensuring adherence to best practices through code reviews. Big Data Engineering: Design, develop, and maintain data pipelines and ETL workflows using Apache Spark, Apache Airflow. Optimize data storage, retrieval, and processing systems to ensure reliability, scalability, and performance. Develop and fine-tune complex queries and data processing jobs for large-scale datasets. Monitor, troubleshoot, and improve data systems for minimal downtime and maximum efficiency. Collaboration & Mentorship: Partner with data scientists, software engineers, and other teams to deliver integrated, high-quality solutions. Provide technical guidance and mentorship to junior engineers, promoting best practices in data engineering. We’re excited if you have Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). 5+ years of experience in software and/or data engineering with expertise in big data technologies such as Apache Spark, Apache Airflow and Trino. Strong understanding of SOLID principles and distributed systems architecture. Proven experience in distributed data processing, data warehousing, and real-time data pipelines. Advanced SQL skills, with expertise in query optimization for large datasets. Exceptional problem-solving abilities and the capacity to work independently or collaboratively. Excellent verbal and written communication skills. Experience with cloud platforms such as AWS, GCP, or Azure, and containerization tools like Docker and Kubernetes. (preferred) Familiarity with additional big data technologies, including Hadoop, Kafka, and Presto. (preferred) Strong programming skills in Python, Java, or Scala. (preferred) Knowledge of CI/CD pipelines, DevOps practices, and infrastructure-as-code tools (e.g., Terraform). (preferred) Expertise in data modeling, schema design, and data visualization tools. (preferred) AI literacy and curiosity.You have either tried Gen AI in your previous work or outside of work or are curious about Gen AI and have explored it. Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter. The Roku Culture Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a fewer number of very talented folks can do more for less cost than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet. By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.

Posted 4 days ago

Apply

0.0 - 2.0 years

0 - 0 Lacs

Ahmedabad, Gujarat

On-site

About the Role: We are looking for a skilled and detail-oriented QA Engineer with over 2 years of experience in manual, automation, performance, and security testing . You will work closely with developers, product managers, and DevOps teams to ensure high-quality, secure, and scalable software products. This role is ideal for someone who is passionate about software quality and eager to take ownership of test planning and execution across functional and non-functional requirements. Key Responsibilities: Design and execute test cases for functional, regression, and integration testing. Develop and maintain automated test scripts using tools such as Selenium/TestNG. Conduct performance testing using tools like JMeter, LoadRunner, or similar. Perform basic security testing (e.g., input validation, authentication/authorization checks, session handling). Validate REST APIs and backend logic using tools such as Postman or Swagger. Document defects clearly and follow up with the development team until resolution. Analyze test results, identify patterns, and suggest improvements for stability and performance. Required Skills & Qualifications: Bachelor’s degree in Computer Science, Information Technology, or equivalent. 2+ years of experience in Quality Assurance, with exposure to both manual and automated testing. Hands-on experience in performance testing tools such as Apache JMeter, BlazeMeter, or LoadRunner. Familiarity with security testing concepts , OWASP Top 10, and tools like Burp Suite (basic level). Proficient in bug tracking tools (e.g., Jira). Understanding of API testing using Postman or similar tools. Basic understanding of SQL and database testing. Strong problem-solving, documentation, and communication skills. Job Type: Full-time Pay: ₹30,000.00 - ₹40,000.00 per month Benefits: Leave encashment Paid sick time Paid time off Schedule: Monday to Friday Experience: Performance testing: 2 years (Required) Location: Ahmedabad, Gujarat (Required) Work Location: In person Speak with the employer +91 8160197141

Posted 4 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Position : We are seeking a seasoned Senior Data Architect with deep expertise in Databricks and Microsoft Fabric. This role involves leading the design and implementation of scalable data solutions for BFSI and HLS clients. Role: Senior Data Architect (Databricks & Microsoft Fabric) Location: All PSL Locations Experience: 10-18 Years Job Type: Full Time Employment What You'll Do: Architect and implement scalable, secure, and high-performance data solutions on the Databricks and Azure Fabric. Lead discovery workshops to understand business challenges, data requirements, and current technology ecosystems. Design end-to-end data pipelines, ensuring seamless integration with enterprise systems leveraging Databricks and Microsoft Fabric Optimize databricks and fabric workloads for performance and cost efficiency Provide solutions considering various architectural concerns e.g. Data Governance, Master Data Management, Meta-Data management, Data Quality Management, data security and privacy policies and procedures Optimize solutions for cost efficiency, performance, and reliability. Lead technical engagements, collaborating with client stakeholders and internal teams. Establish and enforce governance, security, and compliance standards within Databricks and Fabric Guide teams in implementing best practices on Databricks and Microsoft Azure. Keeping abreast with latest developments in the industry; evaluating and recommending new and emerging data architectures/patterns, technologies, and standards Act as a subject matter expert (SME) for Databricks and Azure Fabric within the organization and for clients. Delivering and directing pre-sales engagements to prove functional capabilities (POC’s or POV’s) Develop and deliver workshops, webinars, and technical presentations on Databricks and Fabric capabilities. Create white papers, case studies and reusable artifacts to showcase our company’s Databricks value proposition. Build strong relationships with Databricks partnership teams including their product managers and solution architects, contributing to co-marketing and joint go-to-market strategies. Business Development Support: Collaborate with sales and pre-sales teams to provide technical guidance during RFP responses and solutioning. Identify upsell and cross-sell opportunities within existing accounts by showcasing Databricks’s & BI potential for extended use cases. Expertise You'll Bring: Minimum of 10+ years of experience in data architecture, engineering, or analytics roles, with at least 5 years of hands-on experience with Databricks and 1 year of Azure Fabric Proven track record of designing and implementing large-scale data solutions across industries. Experience working in consulting or client-facing roles, particularly with enterprise customers. Deep understanding of modern data architecture principles, including cloud platforms (AWS, Azure, GCP). Deep expertise in modern data architectures, lakehouse principles, and AI-driven analytics. Strong hands-on experience with Databricks core components, including Delta Lake, Apache Spark, MLflow, Unity Catalog, and Databricks Workflows. Understanding of cloud-native services for data ingestion, transformation, and orchestration (e.g., AWS Glue, Azure Data Factory, GCP Dataflow). Exceptional communication and presentation skills, capable of explaining technical concepts to non-technical stakeholders. Strong interpersonal skills to foster collaboration with diverse teams. A self-starter with a growth mindset and the ability to adapt in a fast-paced environment. Databricks Advanced Certification Databricks Certified Data Engineer Professional Certifications in cloud platforms such as: AWS Certified Data Analytics: Specialty Microsoft Certified: Azure Data Engineer Benefits: Competitive salary and benefits package Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications Opportunity to work with cutting-edge technologies Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards Annual health check-ups Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive. Our company fosters a values-driven and people-centric work environment that enables our employees to: Accelerate growth, both professionally and personally Impact the world in powerful, positive ways, using the latest technologies Enjoy collaborative innovation, with diversity and work-life wellbeing at the core Unlock global opportunities to work and learn with the industry’s best Let’s unleash your full potential at Persistent “Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind.”

Posted 4 days ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary Experience in applying machine learning techniques, Natural Language Processing or Computer Vision using TensorFlow, Pytorch Strong analytical and problem-solving skills Solid software engineering skills across multiple languages including but not limited to Java or Python, C/C++ Build and deploy end to end ML models and leverage metrics to support predictions, recommendations, search, and growth strategies Deep understanding of ML techniques such as: classification, clustering, deep learning, optimization methods, supervised and unsupervised techniques Proven ability to apply, debug, and develop machine learning models Establish scalable, efficient, automated processes for data analyses, model development, validation and implementation, Choose suitable DL algorithms, software, hardware and suggest integration methods. Ensure AI ML solutions are developed, and validations are performed in accordance with Responsible AI guidelines & Standards To closely monitor the Model Performance and ensure Model Improvements are done post Project Delivery Coach and mentor our team as we build scalable machine learning solutions Strong communication skills and an easy-going attitude Oversee development and implementation of assigned programs and guide teammates Carry out testing procedures to ensure systems are running smoothly Ensure that systems satisfy quality standards and procedures Build and manage strong relationships with stakeholders and various teams internally and externally, Provide direction and structure to assigned projects activities, establishing clear, precise goals, objectives and timeframes, run Project Governance calls with senior Stakeholders Take care of entire prompt life cycle like prompt design, prompt template creation, prompt tuning/optimization for various Gen-AI base models Design and develop prompts suiting project needs Lead and manage team of prompt engineers Stakeholder management across business and domains as required for the projects Evaluating base models and benchmarking performance Implement prompt gaurdrails to prevent attacks like prompt injection, jail braking and prompt leaking Develop, deploy and maintain auto prompt solutions Design and implement minimum design standards for every use case involving prompt engineering Key Responsibilities Strategy As the ML Engineer of AI ML Delivery team, the candidate is expected to solutionise, Develop Models and Integrate pipeline for Delivery of AIML Use cases. Business Understand the Business requirement and execute the ML solutioning and ensue the delivery commitments are delivered on time and schedule. Processes Design and Delivery of AI ML Use cases RAI, Security & Governance Model Validation & Improvements Stakeholder Management People & Talent Manage terms of project assignments and deadlines Work with the team dedicated for models related unstructured and structured data. Risk Management Ownership of the delivery, highlighting various risks on a timely manner to the stakeholders. Identifying proper remediation plan for the risks with proper risk roadmap. Governance Awareness and understanding of the regulatory framework, in which the Group operates, and the regulatory requirements and expectations relevant to the role. Regulatory & Business Conduct Display exemplary conduct and live by the Group’s Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Lead the [country / business unit / function/XXX [team] to achieve the outcomes set out in the Bank’s Conduct Principles: [Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.] * Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters. [Insert local regulator e.g. PRA/FCA prescribed responsibilities and Rationale for allocation]. [Where relevant - Additionally, for subsidiaries or relevant non -subsidiaries] Serve as a Director of the Board of [insert name of entities] Exercise authorities delegated by the Board of Directors and act in accordance with Articles of Association (or equivalent) Key stakeholders Business Stakeholders AIML Engineering Team AIML Product Team Product Enablement Team SCB Infrastructure Team Interfacing Program Team Skills And Experience Use NLP, Vision and ML techniques to bring order to unstructured data Experience in extracting signal from noise in large unstructured datasets a plus Work within the Engineering Team to design, code, train, test, deploy and iterate on enterprise scale machine learning systems Work alongside an excellent, cross-functional team across Engineering, Product and Design create solutions and try various algorithms to solve the problem. Stakeholder Managemen Must Have Hands on experience in Kubernetes and Docker Hands on in Azure Cloud services (VMSS, Blob, AKS, Azure LB) Azure Devops tools CI/CD Hands on in Terraform Good To Have Azure OpenAI Grafana and monitoring Qualifications Masters with specialisation in Technology 8- 12 years relevant of Hands-on Experience Strong proficiency with Python, DJANGO framework and REGEX Good understanding of Machine learning framework Pytorch and Tensorflow Knowledge of Generative AI and RAG Pipeline Good in microservice design pattern and developing scalable application. Ability to build and consume REST API Fine tune and perform code optimization for better performance. Strong understanding on OOP and design thinking Understanding the nature of asynchronous programming and its quirks and workarounds Good understanding of server-side templating languages Understanding accessibility and security compliance, user authentication and authorization between multiple systems, servers, and environments Integration of APIs, multiple data sources and databases into one system Good knowledge in API Gateways and proxies, such as WSO2, KONG, nginx, Apache HTTP Server. Understanding fundamental design principles behind a scalable and distributed application Creating and managing database schemas that represent and support business processes, Hands-on experience in any SQL queries and Database server wrt managing deployment. Implementing automated testing platforms, unit tests, and CICD Pipeline About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together We Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.

Posted 4 days ago

Apply

2.0 - 5.0 years

0 Lacs

Mohali district, India

Remote

Job Description: SDE-II – Python Developer Job Title SDE-II – Python Developer Department Operations Location In-Office Employment Type Full-Time Job Summary We are looking for an experienced Python Developer to join our dynamic development team. The ideal candidate will have 2 to 5 years of experience in building scalable backend applications and APIs using modern Python frameworks. This role requires a strong foundation in object-oriented programming, web technologies, and collaborative software development. You will work closely with the design, frontend, and DevOps teams to deliver robust and high-performance solutions. Key Responsibilities • Develop, test, and maintain backend applications using Django, Flask, or FastAPI. • Build RESTful APIs and integrate third-party services to enhance platform capabilities. • Utilize data handling libraries like Pandas and NumPy for efficient data processing. • Write clean, maintainable, and well-documented code that adheres to industry best practices. • Participate in code reviews and mentor junior developers. • Collaborate in Agile teams using Scrum or Kanban workflows. • Troubleshoot and debug production issues with a proactive and analytical approach. Required Qualifications • 2 to 5 years of experience in backend development with Python. • Proficiency in core and advanced Python concepts, including OOP and asynchronous programming. • Strong command over at least one Python framework (Django, Flask, or FastAPI). • Experience with data libraries like Pandas and NumPy. • Understanding of authentication/authorization mechanisms, middleware, and dependency injection. • Familiarity with version control systems like Git. • Comfortable working in Linux environments. Must-Have Skills • Expertise in backend Python development and web frameworks. • Strong debugging, problem-solving, and optimization skills. • Experience with API development and microservices architecture. • Deep understanding of software design principles and security best practices. Good-to-Have Skills • Experience with Generative AI frameworks (e.g., LangChain, Transformers, OpenAI APIs). • Exposure to Machine Learning libraries (e.g., Scikit-learn, TensorFlow, PyTorch). • Knowledge of containerization tools (Docker, Kubernetes). • Familiarity with web servers (e.g., Apache, Nginx) and deployment architectures. • Understanding of asynchronous programming and task queues (e.g., Celery, AsyncIO). • Familiarity with Agile practices and tools like Jira or Trello. • Exposure to CI/CD pipelines and cloud platforms (AWS, GCP, Azure). Company Overview We specialize in delivering cutting-edge solutions in custom software, web, and AI development. Our work culture is a unique blend of in-office and remote collaboration, prioritizing our employees above everything else. At our company, you’ll find an environment where continuous learning, leadership opportunities, and mutual respect thrive. We are proud to foster a culture where individuals are valued, encouraged to evolve, and supported in achieving their fullest potential. Benefits and Perks • Competitive Salary: Earn up to ₹6 –10 LPA based on skills and experience. • Generous Time Off: Benefit from 18 annual holidays to maintain a healthy work-life balance. • Continuous Learning: Access extensive learning opportunities while working on cutting-edge projects. • Client Exposure: Gain valuable experience in client-facing roles to enhance your professional growth.

Posted 4 days ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Responsibilities Design and build data pipelines & Data lakes to automate ingestion of structured and unstructured data that provide fast, optimized, and robust end-to-end solutions Knowledge about the concepts of data lake and dat warehouse Experience working with AWS big data technologies Improve the data quality and reliability of data pipelines through monitoring, validation and failure detection. Deploy and configure components to production environments Technology: Redshift, S3, AWS Glue, Lambda, SQL, PySpark, SQL Mandatory Skill Sets AWS Data Engineer Preferred Skill Sets AWS Data Engineer Years Of Experience Required 4-8 Education Qualification B.tech/MBA/MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills AWS Development, Data Engineering Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date

Posted 4 days ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism SAP Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Responsibilities Utilizing expertise in Power Apps, Power Pages, Power Automate, and Power Virtual Agent development. Designing and creating custom business apps, such as Canvas Apps, SharePoint Form Apps, Model Driven Apps, and Portals/Power Pages Portal. Implementing various Power Automate Flows, including Automated, Instant, Business Process Flow, and UI Flows. Collaborating with backend teams to integrate Power Platform solutions with SQL Server and SPO. Demonstrating strong knowledge of Dataverse, including security and permission levels. Developing and utilizing custom connectors in Power Platform solutions. Creating and consuming functions/API's to retrieve/update data from the database. Managing managed solutions to ensure seamless deployment and version control. Experience in Azure DevOps CI/CD deployment Pipelines. Monitoring and troubleshooting any performance bottlenecks. Having any coding/programming experience is a plus. Excellent communication skills. Requirements 6-9 years of relevant experience. Strong hands-on work experience with Power Pages and Model Driven Apps with Dataverse. Experience in Azure DevOps CI/CD deployment Pipelines. Good communication skills. Mandatory Skill Sets Strong hands-on work experience with Power Pages and Model Driven Apps with Dataverse. Preferred Skill Sets Experience in Azure DevOps CI/CD deployment Pipelines. Years Of Experience Required 5 years to 9 years Education Qualification Bachelor's degree in Computer Science, Engineering, or a related field. Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Bachelor of Technology Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Microsoft Power Apps Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 4 days ago

Apply

8.0 - 12.0 years

0 Lacs

Goregaon, Maharashtra, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities: Job Description: Key Responsibilities: Designs, implements and maintains reliable and scalable data infrastructure Writes, deploys and maintains software to build, integrate, manage, maintain, and quality-assure data Designs, develops, and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud Mentors and shares knowledge with the team to provide design reviews, discussions and prototypes Works with customers to deploy, manage, and audit standard processes for cloud products Adheres to and advocates for software & data engineering standard processes (e.g. technical design and review, unit testing, monitoring, alerting, source control, code review & documentation) Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains and improves CI / CD pipeline Service reliability and following site-reliability engineering standard processes: on-call rotations for services they maintain, responsible for defining and maintaining SLAs. Designs, builds, deploys and maintains infrastructure as code. Containerizes server deployments. Part of a cross-disciplinary team working closely with other data engineers, software engineers, data scientists, data managers and business partners in a Scrum/Agile setup Job Requirements: Education : Bachelor or higher degree in computer science, Engineering, Information Systems or other quantitative fields Experience : Years of experience: 8 to 12 years relevant experience Deep and hands-on experience designing, planning, productionizing, maintaining and documenting reliable and scalable data infrastructure and data products in complex environments Hands on experience with: Spark for data processing (batch and/or real-time) Configuring Delta Lake on Azure Databricks Languages: SQL, pyspark, python Cloud platforms: Azure Azure Data Factory (must) , Azure Data Lake (must), Azure SQL DB (must), Synapse (must), SQL Pools (must), Databricks (good to have) Designing data solutions in Azure incl. data distributions and partitions, scalability, cost-management, disaster recovery and high availability Azure Devops (or similar tools) for source control & building CI/CD pipelines Experience designing and implementing large-scale distributed systems Customer management and front-ending and ability to lead large organizations through influence Desirable Criteria : Strong customer management- own the delivery for Data track with customer stakeholders Continuous learning and improvement attitude Key Behaviors : Empathetic: Cares about our people, our community and our planet Curious: Seeks to explore and excel Creative: Imagines the extraordinary Inclusive: Brings out the best in each other Mandatory Skill Sets: ‘Must have’ knowledge, skills and experiences Synapse, ADF, spark, SQL, pyspark, spark-SQL Preferred Skill Sets: ‘Good to have’ knowledge, skills and experiences Cosmos DB, Data modeling, Databricks, PowerBI, experience of having built analytics solution with SAP as data source for ingestion pipelines. Depth: Candidate should have in-depth hands-on experience w.r.t end to end solution designing in Azure data lake, ADF pipeline development and debugging, various file formats, Synapse and Databricks with excellent coding skills in PySpark and SQL with logic building capabilities. He/she should have sound knowledge of optimizing workloads. Years Of Experience Required: 8 to 12 years relevant experience Education Qualification: BE, B.Tech, ME, M,Tech, MBA, MCA (60% above) Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Bachelor of Technology, Master of Business Administration, Master of Engineering Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills Apache Synapse Optional Skills Microsoft Power Business Intelligence (BI) Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Energy Exemplar In an era where the world is rapidly advancing towards a cleaner future through decarbonization, Energy Exemplar’s mission lies in ‘Empowering Transformative Energy Decisions’. Founded in 1999 in Adelaide, Australia, our award-winning software portfolio encompassing the modeling and simulation platform PLEXOS®, Aurora, and Adapt2, is trusted by innovative organizations across the globe. Through our technology and people, we strive to enable stakeholders from across the entire energy value chain to revolutionize the energy ecosystem and to collaboratively plan and execute for a sustainable energy future with unprecedented clarity, speed, and innovation. Our impact is global and is being recognized across the industry. Some of our recent accolades include: SaaS Company of the Year (2025) – Global Business Tech Awards. Environmental Impact Award (2025) – E+E Leaders Awards. IPPAI (Independent Power Producers Association of India) Power Awards (2025) - Winners Finalist: Platts Global Energy Awards (2024) – Grid Edge category Finalist: Reuters Global Energy Transition Awards (2024) – Technologies of Change Top 50 Marketing Team (2024) – Voted by the public at the ICON Awards. How We Work Energy Exemplar is growing fast around 30% year on year and, that growth is driven by how we work. We trust our team to deliver great results from wherever they work best, whether that’s at home, in the office, or on the move. We’re a global team that values ownership, integrity, and innovation. You’ll be supported to balance work and life in a way that works for you, and empowered to take initiative, solve problems, and make an impact, regardless of your background, location, or role. Our four core values, Customer Success, One Global Team, Integrity & Ownership, and Innovation Excellence aren’t just words. They show up in how we collaborate, how we solve, and how we grow together. About the Role Reporting to the Software Engineering Manager as a member of the Development team at IDC, the Principal Software Engineer is responsible for delivering quality and performant software and design to handle the vast array of use cases that our customers have today. This role is responsible for Developing Software Solutions by learning information needs, discussing with managers, studying systems flow, data usage, finding problem areas and coming up with solutions & following the software development lifecycle. Responsibilities • Develop, test and maintain architectures, such as databases and large-scale processing systems using high-performance data pipeline. • Recommend and implement ways to improve data reliability, efficiency, and quality. • Identify performant features and make them universally accessible to our teams across EE. • Work together with data analysts and data scientists to wrangle the data and provide quality datasets and insights to business-critical decisions. • Take end-to-end responsibility for the development, quality, testing, and production readiness of the services you build. • Define and evangelize Data Engineering best standards and practices to ensure engineering excellence at every stage of the development cycle. • Act as a resident expert for data engineering, feature engineering, exploratory data analysis. Qualifications, Skills & Experiences 5-8 years of experience in Python programming, Data structure & algorithm ETL Data pipeline, SQL Experience in developing data pipelines for large-scale, complex datasets from varieties of data sources. Data Engineering expertise with strong experience working with Python, Beautiful Soup, and Web Scraping. Knowledge with cloud-based data technologies such as Azure Data Lake, Azure Data Factory, and Azure Data Bricks is optionally desirable. Moderate coding skills. SQL or similar is required. C# or other languages are strongly preferred. Knowledge of BigData, Data Warehouse , and related technologies. Good understanding of Apache Spark, implementation knowledge with Java or Python. Outstanding communication and collaboration skills. You can learn from and teach others. Strong drive for results. You have a proven record of shepherding experiments to create successful shipping products and services. Energy Exemplar is proud to be an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all team members. We welcome applications from people of all backgrounds, experiences, identities, and abilities. Please let us know if you require accommodations at any stage of the recruitment process—we're here to support you in showcasing your full potential. Energy Exemplar respects your privacy and is committed to protecting the personal data you share during the recruitment process. This Candidate Privacy Notice explains how we collect, use, and protect your personal information when you apply for a role with us.

Posted 4 days ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies