
10691 Apache Jobs - Page 15

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Junior Full Stack Developer (Java) at Barclays, responsible for supporting the successful delivery of location strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Junior Full Stack Developer (Java) you should have: proficiency in Java with 3+ years of programming experience, including reading, writing and debugging multi-threaded code, and REST services; a proven ability to work in a team environment, with experience of the full software development lifecycle; a demonstrable understanding of Java, J2EE, the Spring Framework and JDBC; working knowledge of REST services/microservices; working knowledge of CI and unit-test frameworks; working knowledge of ORM technologies such as Hibernate and Spring Data/JPA; working knowledge of tools such as Java profilers and of analyzing memory dumps; working knowledge of messaging platforms such as MQ and Solace, and of the related design patterns for producing and consuming messages; working knowledge of XML/JSON and related technologies; working knowledge of SQL and database technologies such as MS SQL Server, Oracle and MongoDB; and experience working in an Agile or Scrum SDLC model. Other highly valued skills may include knowledge of Apache Kafka, Docker, Kubernetes, NoSQL (MongoDB), React and Angular; familiarity with DevOps fundamentals and practices; and proven experience of quality assurance techniques relevant to application development. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role: to design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities: development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance; cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives; collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing; staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth; adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions; and implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Analyst Expectations: meet the needs of stakeholders and customers through specialist advice and support, and perform prescribed activities in a timely manner and to a high standard, impacting both the role itself and surrounding roles. The role is likely to have responsibility for specific processes within a team, and may lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources.
If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. For an individual contributor, the expectation is to manage their own workload, take responsibility for implementing systems and processes within their own work area, and participate in projects broader than the direct team.

Further expectations: execute work requirements as identified in processes and procedures, collaborating with and impacting the work of closely related teams; check the work of colleagues within the team to meet internal and stakeholder requirements; provide specialist advice and support pertaining to your own work area; take ownership for managing risk and strengthening controls in relation to the work you own or contribute to; deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct; maintain and continually build an understanding of how all teams in the area contribute to the objectives of the broader sub-function, delivering impact on the work of collaborating teams; continually develop awareness of the underlying principles and concepts on which the work within the area of responsibility is based, building on administrative and operational expertise; make judgements based on practice and previous experience; assess the validity and applicability of previous or similar experiences, and evaluate options under circumstances not covered by procedures; communicate sensitive or difficult information to customers in areas related specifically to customer advice or day-to-day administrative requirements; and build relationships with stakeholders and customers to identify and address their needs.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship (our moral compass, helping us do what we believe is right), as well as the Barclays Mindset of Empower, Challenge and Drive (the operating manual for how we behave).
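This role is Java-centric, but the produce/consume messaging pattern it calls out is language-agnostic. A minimal sketch of the pattern, shown here in Python with the kafka-python client; the broker address and topic name are illustrative assumptions:

```python
# Minimal produce/consume sketch using kafka-python (pip install kafka-python).
# Broker address and topic name are illustrative assumptions.
import json
from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"
TOPIC = "orders"  # hypothetical topic

# Producer: serialize each record as JSON and publish it to the topic.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"order_id": 42, "status": "NEW"})
producer.flush()  # block until the message is acknowledged

# Consumer: join a consumer group and process messages as they arrive.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    group_id="order-processors",   # consumers in one group share partitions
    auto_offset_reset="earliest",  # start from the beginning if no offset yet
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.partition, message.offset, message.value)
    break  # process one message and exit in this sketch
```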

Posted 3 days ago


3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Enterprise Data & Analytics - Jr Java Developer. Requirements: 3+ years of Java development; 1+ years of Apache Camel; good knowledge of SQL; knowledge of UNIX; good understanding of agile development methodologies; good communication skills. Additional experience in the following areas would be great to have: Kafka, Python, OCP containers, AWS EC2, Snowflake.

EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 days ago


4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction: In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities: As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. You should be experienced in building data pipelines to ingest, process, and transform data from files, streams and databases, and in processing that data with Spark, Python, PySpark and Hive, with HBase or other NoSQL databases, on the Azure Cloud Data Platform or HDFS. You should also be experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies, in developing streaming pipelines, and in working with Hadoop and Azure ecosystem components (Apache Spark, Kafka, cloud computing services, etc.) to implement scalable solutions that meet ever-increasing data volumes.

Preferred Education: Master's Degree.

Required Technical and Professional Expertise: minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala; minimum 3 years of experience on cloud data platforms on Azure; experience in Databricks, Azure HDInsight, Azure Data Factory, Synapse and SQL Server; good to excellent SQL skills.

Preferred Technical and Professional Experience: certification in Azure and Databricks, or Cloudera Spark certified developer; experience in Databricks, Azure HDInsight, Azure Data Factory, Synapse and SQL Server; knowledge or experience of Snowflake will be an added advantage.
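To make the ingest-process-transform pipeline described above concrete, here is a minimal PySpark batch sketch; the paths, column names and partition column are illustrative assumptions:

```python
# Minimal PySpark batch pipeline: ingest CSV files, transform, write Parquet.
# Paths, column names and partitioning are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-ingest").getOrCreate()

# Ingest: read raw CSV files from the landing zone (HDFS or ADLS path).
raw = spark.read.option("header", True).csv("/data/landing/events/*.csv")

# Transform: cast types, drop malformed rows, derive a partition column.
clean = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .dropna(subset=["event_id", "event_ts"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet that downstream Hive/Spark jobs can query.
(clean.write.mode("overwrite")
      .partitionBy("event_date")
      .parquet("/data/curated/events"))

spark.stop()
```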

Posted 3 days ago


10.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager. As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as banking, insurance, manufacturing, healthcare, retail and auto, supply chain, and finance.

The opportunity: We're looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities: proven experience driving analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM and insurance; activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data; work with clients to convert business problems and challenges into technical solutions, considering security, performance, scalability, etc. (10-15 years of experience); understand current and future-state enterprise architecture; contribute to various technical streams during project implementation; provide product- and design-level technical best practices; interact with senior client technology leaders to understand their business goals and to create, architect, propose, develop and deliver technology solutions; define and develop client-specific best practices around data management within a Hadoop or cloud environment; recommend design alternatives for data ingestion, processing and provisioning layers; design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark; develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills and Attributes for Success: experience architecting highly scalable solutions on Azure, AWS and GCP; strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components; strong understanding of the underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms; hands-on programming experience in Apache Spark using Python or Scala, and Spark Streaming; hands-on experience with major components such as cloud ETL tools, Spark and Databricks; experience working with NoSQL in at least one of the data stores HBase, Cassandra or MongoDB; knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions; solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems such as Cloudera and Databricks; strong understanding of the underlying Hadoop architectural concepts and distributed computing paradigms; good knowledge of Apache Kafka and Apache Flume; experience with enterprise-grade solution implementations.
Additional requirements include experience in performance benchmarking of enterprise applications, experience in data security (in motion and at rest), and strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have: a flexible and proactive, self-motivated working style with strong personal ownership of problem resolution; excellent communication skills (written and verbal, formal and informal); the ability to multi-task under pressure and work independently with minimal supervision; a team-player attitude and enjoyment of a cooperative, collaborative team environment; adaptability to new technologies and standards; participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment and support; responsibility for evaluating technical risks and mapping out mitigation strategies; working knowledge of at least one cloud platform (AWS, Azure or GCP); excellent business communication, consulting and quality-process skills; excellence in leading solution architecture, design, build and execution for leading clients in the banking, wealth and asset management, or insurance domains; a minimum of 7 years of hands-on experience in one or more of the above areas; and a minimum of 10 years of industry experience.

Ideally, you'll also have strong project management skills, client management skills and solutioning skills.

What We Look For: people with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working at EY Offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world. EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
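A minimal sketch of the real-time ingestion pattern named above (Kafka into Spark Structured Streaming), assuming the spark-sql-kafka connector is available on the classpath; the broker, topic, schema and paths are illustrative:

```python
# Minimal PySpark Structured Streaming sketch: consume a Kafka topic and
# land the events as Parquet. Broker, topic, schema and paths are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, LongType

spark = SparkSession.builder.appName("kafka-stream-ingest").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("ts", LongType()),
])

# One streaming query reads from all partitions of the topic; Spark tracks
# per-partition offsets in the checkpoint directory.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "live-events")
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

query = (
    events.writeStream.format("parquet")
          .option("path", "/data/stream/events")
          .option("checkpointLocation", "/chk/events")
          .trigger(processingTime="1 minute")
          .start()
)
query.awaitTermination()
```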

Posted 3 days ago


15.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction: Joining the IBM Technology Expert Labs team means you'll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you'll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating time to value confidently and ensuring speed and insight while our clients focus on what they do best: running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for positive impact while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators, always willing to help and be helped, as you apply passion to work that will positively impact the world around us.

Your Role and Responsibilities: As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients' goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging your consultative skills, such as problem-solving with issue- and hypothesis-based methodologies, communication, and service orientation. As a member of IBM Technology Expert Labs, a team that is client-focused, courageous, pragmatic, and technical, you'll collaborate with clients to optimize and trailblaze new solutions that address real business challenges. If you are passionate about success, in both your career and solving clients' business challenges, this role is for you. A 'day in the life' of this opportunity may include, but is not limited to:

Solving Client Challenges Effectively: understanding clients' main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration and product support with a sense of urgency. Agile Planning and Execution: creating and executing agile plans where you are responsible for installing and provisioning, testing, migrating to production, and day-two operations. Technical Solution Workshops: conducting and participating in technical solution workshops. Building Effective Relationships: developing successful relationships at all levels, from engineers to CxOs, with experience navigating challenging debate to reach healthy resolutions. Self-Motivated Problem Solving: demonstrating a natural bias toward self-motivation, curiosity and initiative, in addition to navigating data and people to find answers and present solutions. Collaboration and Communication: strong collaboration and communication skills as you work across the client, partner, and IBM teams.

Preferred Education: Bachelor's Degree.

Required Technical and Professional Expertise: in-depth knowledge of the IBM Data & AI portfolio.
Also required: 15+ years of experience in software services; 10+ years of experience in the planning, design, and delivery of one or more products from the IBM Data Integration and IBM Data Intelligence product platforms; experience in designing and implementing solutions on IBM Cloud Pak for Data, IBM DataStage NextGen, and Orchestration Pipelines; 10+ years of experience with ETL and database technologies; experience in architectural planning and implementation for the upgrade/migration of these specific products; experience in designing and implementing data quality solutions; experience with installation and administration of these products; excellent understanding of cloud concepts and infrastructure; excellent verbal and written communication skills.

Preferred Technical and Professional Experience: experience with any of the DataStage, Informatica, SAS or Talend products; experience with any of IKC, IGC or Axon; experience with programming languages such as Java or Python; experience with the AWS, Azure, Google or IBM cloud platforms; experience with Red Hat OpenShift; good-to-have knowledge: Apache Spark, shell scripting, GitHub, JIRA.

Posted 3 days ago


8.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY-Consulting - Data and Analytics – Manager - Data Integration Architect – Medidata Platform Integration. EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Consulting practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

The opportunity: We're looking for an experienced Data Integration Architect with 8+ years in clinical or life sciences domains to lead the integration of Medidata platforms into enterprise clinical trial systems. This role offers the chance to design scalable, compliant data integration solutions, collaborate across global R&D systems, and contribute to data-driven innovation in the healthcare and life sciences space. You will play a key role in aligning integration efforts with organizational architecture and compliance standards while engaging with stakeholders to ensure successful project delivery.

Your Key Responsibilities: design and implement scalable integration solutions for large-scale clinical trial systems involving Medidata platforms; ensure integration solutions comply with regulatory standards such as GxP and CSV; establish and maintain seamless system-to-system data exchange using middleware platforms (e.g., Apache Kafka, Informatica) or direct API interactions; collaborate with cross-functional business and IT teams to gather integration requirements and translate them into technical specifications; align integration strategies with enterprise architecture and data governance frameworks; support program management through data analysis, integration status reporting, and risk assessment contributions; interface with global stakeholders to ensure smooth integration delivery and resolve technical challenges; mentor junior team members and contribute to knowledge sharing and internal learning initiatives; participate in architectural reviews and recommend continuous improvement and innovation in integration approaches; support business development efforts by contributing to solution proposals, proofs of concept (POCs), and client presentations.

Skills and Attributes for Success: a solution-driven approach to designing and implementing compliant integration strategies for clinical data platforms like Medidata; strong communication, stakeholder engagement, and documentation skills, with experience presenting complex integration concepts clearly.
Also essential is a proven ability to manage system-to-system data flows using APIs or middleware, ensuring alignment with enterprise architecture and regulatory standards.

To qualify for the role, you must have: Experience: minimum 8 years in data integration or architecture roles, preferably in clinical research or life sciences domains. Education: must be a graduate, preferably BE/B.Tech/BCA/B.Sc IT. Technical Skills: hands-on expertise in one or more integration platforms such as Apache Kafka, Informatica, or similar middleware technologies, and experience implementing API-based integrations. Domain Knowledge: in-depth understanding of clinical trial data workflows, integration strategies, and regulatory frameworks including GxP and CSV compliance. Soft Skills: strong analytical thinking, effective communication, and stakeholder management skills, with the ability to collaborate across business and technical teams. Additional Attributes: ability to work independently in a fast-paced environment, lead integration initiatives, and contribute to solution design and architecture discussions.

Ideally, you'll also have: hands-on experience with ETL tools and clinical data pipeline orchestration frameworks; familiarity with broader clinical R&D platforms such as Oracle Clinical, RAVE, or other EDC systems; and prior experience leading small integration teams and working directly with cross-functional stakeholders in regulated environments.

What We Look For: a team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment; an opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide; opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What working at EY offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.

About EY: As a global leader in assurance, tax, transaction and consulting services, we're using the finance products, expertise and systems we've developed to build a better working world. That starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. Whenever you join, however long you stay, the exceptional EY experience lasts a lifetime. And with a commitment to hiring and developing the most passionate people, we'll make our ambition to be the best employer by 2020 a reality. If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible. Join us in building a better working world.
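To make the middleware-based exchange concrete, here is a minimal Python sketch that publishes clinical-trial records to a Kafka topic for a downstream consumer; the broker, topic, record fields and the kafka-python client are all assumptions, not part of the role description:

```python
# Hypothetical sketch: publish clinical-trial subject records to Kafka so a
# downstream system (e.g., a warehouse loader) can consume them.
# Broker, topic and field names are illustrative assumptions.
import json
from datetime import datetime, timezone
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    acks="all",  # wait for full acknowledgement: no silent data loss
)

record = {
    "study_id": "STUDY-001",
    "subject_id": "SUBJ-0042",
    "visit": "SCREENING",
    "extracted_at": datetime.now(timezone.utc).isoformat(),
}

# Key by subject so all events for one subject stay on one partition,
# preserving per-subject ordering for the consumer.
producer.send("clinical-subject-events",
              key=record["subject_id"].encode("utf-8"),
              value=record)
producer.flush()
```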
EY | Building a better working world. EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.

Posted 3 days ago


12.0 years

0 Lacs

Kochi, Kerala, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

Job Description. Title: Data Architect. Type of Employment: Permanent. Overall Years of Experience: 12-15 years. Relevant Years of Experience: 10+. The Data Architect is responsible for designing and implementing data architecture for multiple projects, and for building strategies for data governance.

Position Summary: 12-15 years of experience in a similar profile with a strong service delivery background; experience as a Data Architect with a focus on Spark and data lake technologies; experience in Azure Synapse Analytics; proficiency in Apache Spark for large-scale data processing; expertise in Databricks, Delta Lake, Azure Data Factory, and other cloud-based data services; strong understanding of data modeling, ETL processes, and data warehousing principles; ability to implement a data governance framework with Unity Catalog; knowledge of designing scalable streaming data pipelines using Azure Event Hubs, Azure Stream Analytics, and Spark Streaming; experience with SQL and NoSQL databases, as well as familiarity with big data file formats like Parquet and Avro; hands-on experience in Python and relevant libraries such as PySpark and NumPy; knowledge of machine learning pipelines, GenAI and LLMs is a plus; excellent analytical, problem-solving, and technical leadership skills; experience in integration with business intelligence tools such as Power BI; effective communication and collaboration abilities; excellent interpersonal skills and a collaborative management style; ability to own and delegate responsibilities effectively; ability to analyse problems and suggest solutions; strong command of verbal and written English.

Essential Roles and Responsibilities: work as a Data Architect, designing and implementing data architecture for projects with complex data such as big data and data lakes; work with customers to define strategies for data architecture and data governance; guide the team in implementing data engineering solutions; proactively identify risks, communicate them to stakeholders, and develop mitigation strategies; build best practices to enable faster service delivery; build reusable components to reduce cost; build scalable and cost-effective architecture.

EY | Building a better working world. EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
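As one illustration of the Delta Lake and governance tooling listed above, a minimal PySpark sketch of an upsert into a Unity Catalog-style table; the catalog, table and column names are hypothetical, and the Delta Lake library (as bundled on Databricks) is assumed:

```python
# Minimal Delta Lake sketch (e.g., on Databricks): upsert a batch of records
# into a Delta table. Table and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

updates = spark.createDataFrame(
    [("c-001", "alice@example.com"), ("c-002", "bob@example.com")],
    ["customer_id", "email"],
)

# MERGE keeps the table consistent under concurrent writers: matched rows
# are updated in place, new rows are inserted.
target = DeltaTable.forName(spark, "main.crm.customers")  # hypothetical table
(target.alias("t")
       .merge(updates.alias("u"), "t.customer_id = u.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```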

Posted 3 days ago


4.0 years

0 Lacs

Andhra Pradesh, India

On-site

Job Title: Data Engineer (4+ Years Experience). Location: Pan India. Job Type: Full-Time. Experience: 4+ Years. Notice Period: Immediate to 30 days preferred.

Job Summary: We are looking for a skilled and motivated Data Engineer with 4+ years of experience in building and maintaining scalable data pipelines. The ideal candidate will have strong expertise in AWS Redshift and Python/PySpark, with exposure to AWS Glue, Lambda, and ETL tools being a plus. You will play a key role in designing robust data solutions to support analytical and operational needs across the organization.

Key Responsibilities: design, develop, and optimize large-scale ETL/ELT data pipelines using PySpark or Python; implement and manage data models and workflows in AWS Redshift; work closely with analysts, data scientists, and stakeholders to understand data requirements and deliver reliable solutions; perform data validation, cleansing, and transformation to ensure high data quality; build and maintain automation scripts and jobs using Lambda and Glue (if applicable); ingest, transform, and manage data from various sources into cloud-based data lakes (e.g., S3); participate in data architecture and platform design discussions; monitor pipeline performance, troubleshoot issues, and ensure data reliability; document data workflows, processes, and infrastructure components.

Required Skills: 4+ years of hands-on experience as a Data Engineer; strong proficiency in AWS Redshift, including schema design, performance tuning, and SQL development; expertise in Python and PySpark for data manipulation and pipeline development; experience working with structured and semi-structured data (JSON, Parquet, etc.); deep knowledge of data warehouse design principles, including star/snowflake schemas and dimensional modeling.

Good to Have: working knowledge of AWS Glue and building serverless ETL pipelines; experience with AWS Lambda for lightweight processing and orchestration; exposure to ETL tools like Informatica, Talend, or Apache NiFi; familiarity with workflow orchestrators (e.g., Airflow, Step Functions); knowledge of DevOps practices, version control (Git), and CI/CD pipelines.

Preferred Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field; AWS certifications (e.g., AWS Certified Data Analytics, Developer Associate) are a plus.
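A minimal sketch of the staged-load pattern this role describes (PySpark writes Parquet to S3, then Redshift ingests it with COPY); the bucket names, cluster endpoint, IAM role and table are illustrative assumptions:

```python
# Minimal sketch of the staged-load pattern: write a PySpark DataFrame to S3
# as Parquet, then COPY it into Redshift. Bucket, role ARN, connection
# details and table names are illustrative assumptions.
import psycopg2  # pip install psycopg2-binary
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("redshift-load").getOrCreate()

orders = spark.read.json("s3://my-raw-bucket/orders/")           # hypothetical
orders.write.mode("overwrite").parquet("s3://my-stage-bucket/orders/")

# Redshift COPY pulls the staged Parquet in parallel across slices, which is
# far faster than row-by-row INSERTs.
conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",   # hypothetical
    port=5439, dbname="analytics", user="loader", password="...",
)
with conn, conn.cursor() as cur:
    cur.execute("""
        COPY analytics.orders
        FROM 's3://my-stage-bucket/orders/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
        FORMAT AS PARQUET;
    """)
conn.close()
spark.stop()
```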

Posted 3 days ago


8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY-Consulting - Data and Analytics – Manager - Data Integration Architect – Medidata Platform Integration. EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Consulting practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

The opportunity: We're looking for an experienced Data Integration Architect with 8+ years in clinical or life sciences domains to lead the integration of Medidata platforms into enterprise clinical trial systems. This role offers the chance to design scalable, compliant data integration solutions, collaborate across global R&D systems, and contribute to data-driven innovation in the healthcare and life sciences space. You will play a key role in aligning integration efforts with organizational architecture and compliance standards while engaging with stakeholders to ensure successful project delivery.

Your Key Responsibilities: design and implement scalable integration solutions for large-scale clinical trial systems involving Medidata platforms; ensure integration solutions comply with regulatory standards such as GxP and CSV; establish and maintain seamless system-to-system data exchange using middleware platforms (e.g., Apache Kafka, Informatica) or direct API interactions; collaborate with cross-functional business and IT teams to gather integration requirements and translate them into technical specifications; align integration strategies with enterprise architecture and data governance frameworks; support program management through data analysis, integration status reporting, and risk assessment contributions; interface with global stakeholders to ensure smooth integration delivery and resolve technical challenges; mentor junior team members and contribute to knowledge sharing and internal learning initiatives; participate in architectural reviews and recommend continuous improvement and innovation in integration approaches; support business development efforts by contributing to solution proposals, proofs of concept (POCs), and client presentations.

Skills and Attributes for Success: a solution-driven approach to designing and implementing compliant integration strategies for clinical data platforms like Medidata; strong communication, stakeholder engagement, and documentation skills, with experience presenting complex integration concepts clearly.
Also essential is a proven ability to manage system-to-system data flows using APIs or middleware, ensuring alignment with enterprise architecture and regulatory standards.

To qualify for the role, you must have: Experience: minimum 8 years in data integration or architecture roles, preferably in clinical research or life sciences domains. Education: must be a graduate, preferably BE/B.Tech/BCA/B.Sc IT. Technical Skills: hands-on expertise in one or more integration platforms such as Apache Kafka, Informatica, or similar middleware technologies, and experience implementing API-based integrations. Domain Knowledge: in-depth understanding of clinical trial data workflows, integration strategies, and regulatory frameworks including GxP and CSV compliance. Soft Skills: strong analytical thinking, effective communication, and stakeholder management skills, with the ability to collaborate across business and technical teams. Additional Attributes: ability to work independently in a fast-paced environment, lead integration initiatives, and contribute to solution design and architecture discussions.

Ideally, you'll also have: hands-on experience with ETL tools and clinical data pipeline orchestration frameworks; familiarity with broader clinical R&D platforms such as Oracle Clinical, RAVE, or other EDC systems; and prior experience leading small integration teams and working directly with cross-functional stakeholders in regulated environments.

What We Look For: a team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment; an opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide; opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What working at EY offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.

About EY: As a global leader in assurance, tax, transaction and consulting services, we're using the finance products, expertise and systems we've developed to build a better working world. That starts with a culture that believes in giving you the training, opportunities and creative freedom to make things better. Whenever you join, however long you stay, the exceptional EY experience lasts a lifetime. And with a commitment to hiring and developing the most passionate people, we'll make our ambition to be the best employer by 2020 a reality. If you can confidently demonstrate that you meet the criteria above, please contact us as soon as possible. Join us in building a better working world.
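As a complement to the middleware route described above, a direct API interaction might look like the following Python sketch, which pages through a hypothetical REST endpoint with retries; the URL, credential and response shape are assumptions:

```python
# Hypothetical sketch of a direct API-based integration: pull records from a
# source system's REST endpoint with retries, then hand them to a loader.
# URL, auth token and response shape are illustrative assumptions.
import time
import requests

BASE_URL = "https://clinical.example.com/api/v1"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>"}      # placeholder credential

def fetch_page(page: int, retries: int = 3) -> dict:
    """Fetch one page of records, retrying transient HTTP errors."""
    for attempt in range(1, retries + 1):
        resp = requests.get(f"{BASE_URL}/subjects",
                            params={"page": page}, headers=HEADERS, timeout=30)
        if resp.status_code < 500:
            resp.raise_for_status()  # 4xx errors are permanent: fail fast
            return resp.json()
        time.sleep(2 ** attempt)     # exponential backoff on 5xx
    raise RuntimeError(f"page {page} failed after {retries} retries")

page = 1
while True:
    body = fetch_page(page)
    for rec in body.get("items", []):
        print(rec["subject_id"])     # placeholder for the real load step
    if not body.get("has_more"):
        break
    page += 1
```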
EY | Building a better working world. EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.

Posted 3 days ago


12.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

Job Description. Title: Data Architect. Type of Employment: Permanent. Overall Years of Experience: 12-15 years. Relevant Years of Experience: 10+. The Data Architect is responsible for designing and implementing data architecture for multiple projects, and for building strategies for data governance.

Position Summary: 12-15 years of experience in a similar profile with a strong service delivery background; experience as a Data Architect with a focus on Spark and data lake technologies; experience in Azure Synapse Analytics; proficiency in Apache Spark for large-scale data processing; expertise in Databricks, Delta Lake, Azure Data Factory, and other cloud-based data services; strong understanding of data modeling, ETL processes, and data warehousing principles; ability to implement a data governance framework with Unity Catalog; knowledge of designing scalable streaming data pipelines using Azure Event Hubs, Azure Stream Analytics, and Spark Streaming; experience with SQL and NoSQL databases, as well as familiarity with big data file formats like Parquet and Avro; hands-on experience in Python and relevant libraries such as PySpark and NumPy; knowledge of machine learning pipelines, GenAI and LLMs is a plus; excellent analytical, problem-solving, and technical leadership skills; experience in integration with business intelligence tools such as Power BI; effective communication and collaboration abilities; excellent interpersonal skills and a collaborative management style; ability to own and delegate responsibilities effectively; ability to analyse problems and suggest solutions; strong command of verbal and written English.

Essential Roles and Responsibilities: work as a Data Architect, designing and implementing data architecture for projects with complex data such as big data and data lakes; work with customers to define strategies for data architecture and data governance; guide the team in implementing data engineering solutions; proactively identify risks, communicate them to stakeholders, and develop mitigation strategies; build best practices to enable faster service delivery; build reusable components to reduce cost; build scalable and cost-effective architecture.

EY | Building a better working world. EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
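As a sketch of the scalable streaming pipeline design mentioned above, here is a minimal PySpark read from Azure Event Hubs via its Kafka-compatible endpoint into a Delta table; the namespace, event hub, connection string and paths are illustrative, and the Spark Kafka connector is assumed to be available:

```python
# Minimal streaming sketch: read events from Azure Event Hubs through its
# Kafka-compatible endpoint and append them to a Delta table.
# Namespace, event hub name, secret and paths are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("eventhub-stream").getOrCreate()

# Event Hubs speaks the Kafka protocol on port 9093 with SASL_SSL/PLAIN auth.
conn = "Endpoint=sb://mynamespace.servicebus.windows.net/;..."  # placeholder
jaas = (
    'org.apache.kafka.common.security.plain.PlainLoginModule required '
    f'username="$ConnectionString" password="{conn}";'
)

stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers",
                 "mynamespace.servicebus.windows.net:9093")
         .option("subscribe", "telemetry")  # event hub name acts as the topic
         .option("kafka.security.protocol", "SASL_SSL")
         .option("kafka.sasl.mechanism", "PLAIN")
         .option("kafka.sasl.jaas.config", jaas)
         .load()
         .select(F.col("value").cast("string").alias("raw"),
                 F.col("timestamp"))
)

(stream.writeStream.format("delta")
       .option("checkpointLocation", "/chk/telemetry")
       .start("/data/bronze/telemetry"))
```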

Posted 3 days ago


10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager. As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as banking, insurance, manufacturing, healthcare, retail and auto, supply chain, and finance.

The opportunity: We're looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities: proven experience driving analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM and insurance; activities will include pipeline building, RFP responses, creating new solutions and offerings, and conducting workshops, as well as managing in-flight projects focused on cloud and big data; work with clients to convert business problems and challenges into technical solutions, considering security, performance, scalability, etc. (10-15 years of experience); understand current and future-state enterprise architecture; contribute to various technical streams during project implementation; provide product- and design-level technical best practices; interact with senior client technology leaders to understand their business goals and to create, architect, propose, develop and deliver technology solutions; define and develop client-specific best practices around data management within a Hadoop or cloud environment; recommend design alternatives for data ingestion, processing and provisioning layers; design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop and Spark; develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.

Skills and Attributes for Success: experience architecting highly scalable solutions on Azure, AWS and GCP; strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components; strong understanding of the underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms; hands-on programming experience in Apache Spark using Python or Scala, and Spark Streaming; hands-on experience with major components such as cloud ETL tools, Spark and Databricks; experience working with NoSQL in at least one of the data stores HBase, Cassandra or MongoDB; knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions; solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems such as Cloudera and Databricks; strong understanding of the underlying Hadoop architectural concepts and distributed computing paradigms; good knowledge of Apache Kafka and Apache Flume; experience with enterprise-grade solution implementations.
Additional requirements include experience in performance benchmarking of enterprise applications, experience in data security (in motion and at rest), and strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have: a flexible and proactive, self-motivated working style with strong personal ownership of problem resolution; excellent communication skills (written and verbal, formal and informal); the ability to multi-task under pressure and work independently with minimal supervision; a team-player attitude and enjoyment of a cooperative, collaborative team environment; adaptability to new technologies and standards; participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment and support; responsibility for evaluating technical risks and mapping out mitigation strategies; working knowledge of at least one cloud platform (AWS, Azure or GCP); excellent business communication, consulting and quality-process skills; excellence in leading solution architecture, design, build and execution for leading clients in the banking, wealth and asset management, or insurance domains; a minimum of 7 years of hands-on experience in one or more of the above areas; and a minimum of 10 years of industry experience.

Ideally, you'll also have strong project management skills, client management skills and solutioning skills.

What We Look For: people with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working at EY Offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world. EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
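To illustrate the partition-consumption point above, a minimal Python consumer-group sketch with kafka-python: several copies of this process share a topic's partitions, which is the same scaling idea as multiple Spark jobs consuming one topic. Broker and topic names are assumptions:

```python
# Minimal consumer-group sketch with kafka-python: run this script in several
# processes and Kafka spreads the topic's partitions across them.
# Broker and topic names are illustrative assumptions.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "live-events",
    bootstrap_servers="broker:9092",
    group_id="analytics-loaders",  # all members of this group share partitions
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for msg in consumer:
    # msg.partition shows which partition this instance currently owns.
    print(f"partition={msg.partition} offset={msg.offset} value={msg.value}")
```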

Posted 3 days ago


0.0 - 15.0 years

83 - 104 Lacs

Delhi, Delhi

On-site

Job Title: Data Architect (Leadership Role). Company: Wingify. Location: Delhi (outstation candidates allowed). Experience Required: 10-15 years. Working Days: 5 days/week. Budget: 83 Lakh to 1.04 Cr.

About Us: We are a fast-growing product-based tech company known for our flagship product VWO, a widely adopted A/B testing platform used by over 4,000 businesses globally, including Target, Disney, Sears, and Tinkoff Bank. The team is self-organizing, highly creative, and passionate about data, tech, and continuous innovation. Company Size: mid-sized. Industry: consumer internet, technology, consulting.

Role & Responsibilities: lead and mentor a team of data engineers, ensuring performance and career development; architect scalable and reliable data infrastructure with high availability; define and implement data governance frameworks, compliance, and best practices; collaborate cross-functionally to execute the organization's data roadmap; optimize data processing workflows for scalability and cost efficiency; ensure data quality, privacy, and security across platforms; drive innovation and technical excellence across the data engineering function.

Ideal Candidate Must-Haves: Experience: 10+ years in software/data engineering roles, including at least 2-3+ years in a leadership role managing teams of 5+ data engineers; proven hands-on experience setting up data engineering systems from scratch (0 → 1 stage) in high-growth B2B product companies. Technical Expertise: strong in Java (preferred), or Python, Node.js, GoLang; expertise in big data tools such as Apache Spark, Kafka, Hadoop, Hive, Airflow, Presto, and HDFS; strong design experience in high-level design (HLD) and low-level design (LLD); backend frameworks like Spring Boot and Google Guice; cloud data platforms: AWS, GCP, Azure; familiarity with data warehousing: Snowflake, Redshift, BigQuery; databases: Redis, Cassandra, MongoDB, TiDB; DevOps tools: Jenkins, Docker, Kubernetes, Ansible, Chef, Grafana, ELK. Other Skills: strong understanding of data governance, security, and compliance (GDPR, SOC 2, etc.); proven strategic thinking with the ability to align technical architecture to business objectives; excellent communication, leadership, and stakeholder management.

Preferred Qualifications: exposure to machine learning infrastructure/MLOps; experience with real-time data analytics; strong foundation in algorithms, data structures, and scalable systems; previous work in SaaS or high-growth startups.

Screening Questions: Do you have team leadership experience? How many engineers have you led? Have you built a data engineering platform from scratch? Describe the setup. What's the largest data scale you've worked with, and where? Are you open to continuing hands-on coding in this role?

Interested candidates can apply at deepak.visko@gmail.com or 9238142824.

Job Types: Full-time, Permanent. Pay: ₹8,300,000.00 - ₹10,400,000.00 per year. Work Location: In person.
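As a small illustration of the orchestration layer in the stack above, a minimal Airflow DAG sketch (Airflow 2.4+ assumed); the DAG id, schedule and job paths are hypothetical:

```python
# Minimal Airflow DAG sketch: run a nightly Spark batch job followed by a
# data-quality check. DAG id, schedule and paths are illustrative assumptions.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # 02:00 every night
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="spark_ingest",
        bash_command="spark-submit /opt/jobs/ingest_events.py",  # hypothetical
    )
    quality_check = BashOperator(
        task_id="row_count_check",
        bash_command="python /opt/jobs/check_counts.py",         # hypothetical
    )
    ingest >> quality_check  # run the check only after ingest succeeds
```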

Posted 3 days ago


4.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

About Media.net: Media.net is a leading global ad tech company that focuses on creating the most transparent and efficient path for advertiser budgets to become publisher revenue. Our proprietary contextual technology is at the forefront of enhancing programmatic buying, the latest industry standard in ad buying for digital platforms. The Media.net platform powers major global publishers and ad-tech businesses at scale across ad formats like display, video, mobile, native, and search. Media.net's U.S. HQ is based in New York, and the global HQ is in Dubai. With office locations and consultant partners across the world, Media.net takes pride in the value it adds for its 50+ demand and 21K+ publisher partners, in terms of both products and services.

Responsibilities (What You'll Do): Infrastructure Management: oversee and maintain the infrastructure that supports the ad exchange applications, including load balancers, data stores, CI/CD pipelines, and monitoring stacks; continuously improve infrastructure resilience, scalability, and efficiency to meet the demands of massive request volume and stringent latency requirements; develop policies and procedures that improve overall platform stability, and participate in the shared on-call schedule. Collaboration with Developers: work closely with developers to establish and uphold quality and performance benchmarks, ensuring that applications meet the necessary criteria before they are deployed to production; participate in design reviews and provide feedback on infrastructure-related aspects to improve system performance and reliability. Building Tools for Infrastructure Management: develop tools to simplify and enhance infrastructure management, automate processes, and improve operational efficiency; these tools may address monitoring, alerting, deployment automation, and failure detection and recovery, which are critical to minimizing latency and maintaining uptime. Performance Optimization: focus on reducing latency and maximizing efficiency across all components, from request handling in load balancers to database optimization; implement best practices and tools for performance monitoring, including real-time analysis and response mechanisms.

Who Should Apply: B.Tech/M.Tech or equivalent in Computer Science, Information Technology, or a related field; 2-4 years of experience managing services in large-scale distributed systems; strong understanding of networking concepts (e.g., TCP/IP, routing, SDN) and modern software architectures; proficiency in programming and scripting languages such as Python, Go, or Ruby, with a focus on automation; experience with container orchestration tools like Kubernetes and virtualization platforms (preferably GCP); ability to independently own problem statements, manage priorities, and drive solutions.

Preferred Skills & Tools Expertise: Infrastructure as Code: experience with Terraform; configuration management tools like Nix and Ansible. Monitoring and Logging Tools: expertise with Prometheus, Grafana, or the ELK stack. OLAP databases: ClickHouse and Apache Druid. CI/CD Pipelines: hands-on experience with Jenkins or ArgoCD. Databases: proficiency in MySQL (relational) or Redis (NoSQL). Load balancers/servers: familiarity with HAProxy or Nginx. Strong knowledge of operating systems and networking fundamentals. Experience with version control systems such as Git.
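In the spirit of the monitoring automation this role describes, a small Python sketch that queries Prometheus's HTTP API for p99 request latency and flags budget breaches; the Prometheus URL, metric and threshold are illustrative assumptions:

```python
# Small automation sketch: query Prometheus's HTTP API for p99 request
# latency and alert (here: print) when it breaches a threshold.
# The Prometheus URL, metric name and threshold are illustrative assumptions.
import requests

PROM_URL = "http://prometheus.internal:9090"  # hypothetical server
QUERY = (
    'histogram_quantile(0.99, '
    'sum(rate(http_request_duration_seconds_bucket[5m])) by (le))'
)
THRESHOLD_S = 0.250  # 250 ms latency budget

resp = requests.get(f"{PROM_URL}/api/v1/query",
                    params={"query": QUERY}, timeout=10)
resp.raise_for_status()
result = resp.json()["data"]["result"]

if result:
    p99 = float(result[0]["value"][1])  # value is [timestamp, value-string]
    status = "BREACH" if p99 > THRESHOLD_S else "ok"
    print(f"p99 latency = {p99 * 1000:.1f} ms ({status})")
else:
    print("no data returned for query")
```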

Posted 3 days ago

Apply

0 years

0 Lacs

India

On-site

Job Description: We are seeking a highly skilled Azure Data Engineer (4+ years) to design, develop, and optimize data pipelines and data integration solutions in a cloud-based environment. The ideal candidate will have strong technical expertise in Azure, data engineering tools, and advanced ETL design, along with excellent communication and problem-solving skills.

Key Responsibilities:
Design and develop advanced ETL pipelines for data ingestion and egress for batch data.
Build scalable data solutions using Azure Data Factory (ADF), Databricks, Spark (PySpark & Scala Spark), and other Azure services.
Troubleshoot data jobs, identify issues, and implement effective root cause solutions.
Collaborate with stakeholders to gather requirements and propose efficient solution designs.
Ensure data quality, reliability, and adherence to best practices in data engineering.
Maintain detailed documentation of problem definitions, solutions, and architecture.
Work independently with minimal supervision while ensuring project deadlines are met.

Required Skills & Qualifications:
Microsoft Certified: Azure Fundamentals (preferred).
Microsoft Certified: Azure Data Engineer Associate (preferred).
Proficiency in SQL, Python, and Scala.
Strong knowledge of Azure Cloud services, ADF, and Databricks.
Hands-on experience with Apache Spark (PySpark & Scala Spark).
Expertise in designing and implementing complex ETL pipelines for batch data.
Strong troubleshooting skills with the ability to perform root cause analysis.

Soft Skills:
Excellent verbal and written communication skills.
Strong documentation skills for drafting problem definitions and solutions.
Ability to effectively gather requirements and propose solution designs.
Self-driven with the ability to work independently with minimal supervision.
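For the batch ingestion-and-egress pipelines this role describes, a minimal PySpark sketch; the storage paths and column names are illustrative placeholders, and on Databricks the SparkSession is provided for you:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_batch_etl").getOrCreate()

# Ingest a raw batch drop from the data lake (illustrative ABFS path)
raw = spark.read.json("abfss://raw@account.dfs.core.windows.net/orders/2024-06-01/")

# Transform: drop incomplete rows, derive typed and computed columns
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("order_date", F.to_date("created_at"))
       .withColumn("revenue", F.col("amount") * F.col("quantity"))
)

# Egress: write partitioned Parquet to the curated zone
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("abfss://curated@account.dfs.core.windows.net/orders/"))
```

In practice a pipeline like this would be parameterized by run date and triggered from ADF or a scheduler.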

Posted 3 days ago

Apply

1.0 - 2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

SLSQ326R415

Databricks is at the forefront of the Unified Data Analytics field, where innovation is key to providing our clients with a competitive edge in today's fast-paced business landscape. We are looking for a Business Development Representative to help drive revenue growth within the India market. If you're a results-oriented sales professional with a track record in similar roles, aiming to contribute to the expansion of a transformative enterprise software company and propel your career, this role is for you. Reporting to the Manager of the India Sales Development team, you'll play a pivotal role in this journey.

The Impact You Will Have
Cultivate expertise in value-based selling, big data, and AI.
Evaluate and prioritize inbound leads from Marketing initiatives.
Craft outbound strategies encompassing personalized emails, cold calls, and social selling to qualify opportunities.
Devise compelling outreach campaigns targeting diverse buyer levels, including senior executives, to unlock opportunities in critical target accounts.
Identify and uncover client requirements, progressing discussions into sales prospects by demonstrating how Databricks can address their data-related challenges.

What We Look For
Preferably a minimum of 1–2 years of prior experience in inbound and outbound sales and inquiries.
Proficiency in comprehending technical concepts, coupled with genuine enthusiasm for technology.
Determination and courage to excel and contribute to the growth of the next top-tier enterprise software company.
A demonstrated history of consistent, quantifiable achievements in previous roles.
Curiosity and eagerness to continually learn and stay abreast of developments in the big data/AI sector.
A strong sense of ownership and accountability.

About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.

Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.

Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.

Posted 3 days ago

Apply

4.0 years

0 Lacs

Gwalior, Madhya Pradesh, India

On-site

*Lead and manage a team of developers, providing guidance, code reviews, and mentorship.
*Architect, design, develop, and maintain web applications using PHP and relevant frameworks.
*Collaborate with project managers, designers, and QA to deliver high-quality products on time.
*Manage server configurations, deployments, and backups, and ensure application security and performance.
*Set coding standards and best practices, ensuring code quality and reusability.
*Troubleshoot and debug existing applications and identify areas for improvement.
*Handle version control using Git and manage CI/CD pipelines.
*Monitor server health and ensure 24/7 uptime for critical web applications.

Required Skills & Qualifications
*Bachelor’s degree in Computer Science or a related field (or equivalent experience).
*4+ years of experience in PHP development and at least 2 years in a team lead or senior role.
*Proficient in PHP, MySQL, and modern frameworks like Laravel, CodeIgniter, Symfony, OpenCart, etc.
*Strong experience with REST APIs, AJAX, and third-party integrations.
*Good knowledge of front-end technologies like HTML5, CSS3, JavaScript, jQuery, and Bootstrap.
*Experience in server management (Linux, Apache/Nginx, VPS, cPanel, SSL, firewalls, etc.).
*Knowledge of domain booking, renewal, and DNS activities.
*Familiarity with cloud platforms like AWS, DigitalOcean, or similar is a plus.
*Proficient in Git, version control systems, and deployment tools.
*Strong problem-solving skills and ability to work independently or in a team environment.

Posted 3 days ago

Apply

8.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Role Description
We are looking for a seasoned Senior Backend Engineer with 8+ years of experience in Java-based backend development. The ideal candidate will have deep expertise in Java 11+, Spring Boot (v2.7+), and strong database fundamentals using Oracle DB. You will be instrumental in building scalable and maintainable backend systems and RESTful APIs. Candidates with additional experience in messaging systems like RabbitMQ, Apache ActiveMQ, or Azure Service Bus, and knowledge of integration patterns, are preferred.

Key Responsibilities
Design, develop, and maintain robust backend services using Java 11+, Spring Boot 2.7+, and Oracle DB.
Work with JDBC, Hibernate, HQL, and SQL to manage database transactions and data persistence.
Build and consume RESTful APIs using JSON, XML, and YAML formats.
Collaborate with cross-functional teams including QA, DevOps, and Product to deliver high-quality software.
Utilize tools such as Git, Bitbucket, SourceTree, and Jenkins for source control and CI/CD workflows.
Write clean, testable code using modern development practices and tools like IntelliJ or Eclipse IDE.
Participate in Agile ceremonies, contribute to sprint planning, and document knowledge in JIRA and Confluence.
Leverage Maven and Gradle for build automation and Tomcat for application deployment.
Perform unit, integration, and API testing using tools such as Postman.

Required Skills & Qualifications
8+ years of backend development experience using Java (version 11 or above).
Strong hands-on experience with Spring Boot (v2.7 or higher).
Proficiency with Oracle Database fundamentals, JDBC, Hibernate, and SQL/HQL.
Familiarity with JSON, XML, and YAML data formats.
Solid understanding of REST API design and development.
Experience with Git, Bitbucket, Git Bash, and SourceTree.
Practical knowledge of CI/CD tools including Jenkins, Maven, and Gradle.
Experience working with IDEs such as IntelliJ IDEA or Eclipse.
Familiarity with application servers like Apache Tomcat.
Strong problem-solving skills and the ability to work in a fast-paced Agile environment.

Salary: 90K–100K per month

Posted 3 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Hi connections, urgent hiring for the role below.

About the Role: We are seeking a seasoned and highly skilled MLOps Engineer to join our growing team. The ideal candidate will have extensive hands-on experience with deploying, monitoring, and retraining machine learning models in production environments. You will be responsible for building and maintaining robust and scalable MLOps pipelines using tools like MLflow, Apache Airflow, Kubernetes, and Databricks or Azure ML. A strong understanding of infrastructure-as-code using Terraform is essential. You will play a key role in operationalizing AI/ML systems and ensuring high performance, availability, and automation across the ML lifecycle.

Key Responsibilities:
· Design and implement scalable MLOps pipelines for model training, validation, deployment, and monitoring.
· Operationalize machine learning models using MLflow, Airflow, and containerized deployments via Kubernetes.
· Automate and manage ML workflows across cloud platforms such as Azure ML or Databricks.
· Develop infrastructure using Terraform for consistent and repeatable deployments.
· Trace API calls to LLMs, Azure OCR and Paradigm.
· Implement performance monitoring, alerting, and logging for deployed models using custom and third-party tools.
· Automate model retraining and continuous deployment pipelines based on data drift and model performance metrics.
· Ensure traceability, reproducibility, and auditability of ML experiments and deployments.
· Collaborate with Data Scientists, ML Engineers, and DevOps teams to streamline ML workflows.
· Apply CI/CD practices and version control to the entire ML lifecycle.
· Ensure secure, reliable, and compliant deployment of models in production environments.

Required Qualifications:
· 5+ years of experience in MLOps, DevOps, or ML engineering roles, with a focus on production ML systems.
· Proven experience deploying machine learning models using MLflow and workflow orchestration with Apache Airflow.
· Hands-on experience with Kubernetes for container orchestration in ML deployments.
· Proficiency with Databricks and/or Azure ML, including model training and deployment capabilities.
· Solid understanding and practical experience with Terraform for infrastructure-as-code.
· Experience automating model monitoring and retraining processes based on data and model drift.
· Knowledge of CI/CD tools and principles applied to ML systems.
· Familiarity with monitoring tools and observability stacks (e.g., Prometheus, Grafana, Azure Monitor).
· Strong scripting skills in Python.
· Deep understanding of ML lifecycle challenges including model versioning, rollback, and scaling.
· Excellent communication skills and ability to collaborate across technical and non-technical teams.

Nice to Have:
· Experience with Azure DevOps or GitHub Actions for ML CI/CD.
· Exposure to model performance optimization and A/B testing in production environments.
· Familiarity with feature stores and online inference frameworks.
· Knowledge of data governance and ML compliance frameworks.
· Experience with ML libraries like scikit-learn, PyTorch, or TensorFlow.

Education:
· Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
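The MLflow side of pipelines like these usually means logging parameters, metrics, and a registered model per training run, which later automation (drift-triggered retraining, staged promotion) builds on. A minimal sketch, assuming MLflow and scikit-learn; the registered model name is illustrative:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Toy data standing in for a real training set
X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf_baseline"):
    model = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
    # Registering the model lets a deployment pipeline promote it by version
    mlflow.sklearn.log_model(model, "model", registered_model_name="demo-clf")
```

A retraining job triggered by drift would rerun this and compare the logged metric against the current production version before promoting.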

Posted 3 days ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary: A career within PwC.

Responsibilities

Job Title: Cloud Engineer (Java 17+, Spring Boot, Microservices, AWS)
Job Type: Full-Time

Job Overview
As a Cloud Engineer, you will be responsible for developing, deploying, and managing cloud-based applications and services on AWS. You will use your expertise in Java 17+, Spring Boot, and Microservices to build robust and scalable cloud solutions. This role involves working closely with development teams to ensure seamless cloud integration, optimizing cloud resources, and leveraging AWS tools to ensure high availability, security, and performance.

Key Responsibilities
Cloud Infrastructure: Design, build, and deploy cloud-native applications on AWS, utilizing services such as EC2, S3, Lambda, RDS, EKS, API Gateway, and CloudFormation.
Backend Development: Develop and maintain backend services and microservices using Java 17+ and Spring Boot, ensuring they are optimized for the cloud environment.
Microservices Architecture: Architect and implement microservices-based solutions that are scalable, secure, and resilient, ensuring they align with AWS best practices.
CI/CD Pipelines: Set up and manage automated CI/CD pipelines using tools like Jenkins, GitLab CI, or AWS CodePipeline for continuous integration and deployment.
AWS Services Integration: Integrate AWS services such as DynamoDB, SQS, SNS, CloudWatch, and Elastic Load Balancing into microservices to improve performance and scalability.
Performance Optimization: Monitor and optimize the performance of cloud infrastructure and services, ensuring efficient resource utilization and cost management in AWS.
Security: Implement security best practices in cloud applications and services, including IAM roles, VPC configuration, encryption, and authentication mechanisms.
Troubleshooting & Support: Provide ongoing support and troubleshooting for cloud-based applications, ensuring uptime, availability, and optimal performance.
Collaboration: Work closely with cross-functional teams, including frontend developers, system administrators, and DevOps engineers, to ensure end-to-end solution delivery.
Documentation: Document the architecture, implementation, and operations of cloud infrastructure and applications to ensure knowledge sharing and compliance.

Required Skills & Qualifications
Strong experience with Java 17+ (latest version) and Spring Boot for backend development.
Hands-on experience with AWS Cloud services such as EC2, S3, Lambda, RDS, EKS, API Gateway, DynamoDB, SQS, SNS, and CloudWatch.
Proven experience in designing and implementing microservices architectures.
Solid understanding of cloud security practices, including IAM, VPC, encryption, and secure cloud-native application development.
Experience with CI/CD tools and practices (e.g., Jenkins, GitLab CI, AWS CodePipeline).
Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes.
Ability to optimize cloud applications for performance, scalability, and cost-efficiency.
Experience with monitoring and logging tools like CloudWatch, ELK Stack, or other AWS-native tools.
Knowledge of RESTful APIs and API Gateway for exposing microservices.
Solid understanding of version control systems like Git and familiarity with Agile methodologies.
Strong problem-solving and troubleshooting skills, with the ability to work in a fast-paced environment.

Preferred Skills
AWS certifications, such as AWS Certified Solutions Architect or AWS Certified Developer.
Experience with Terraform or AWS CloudFormation for infrastructure as code.
Familiarity with Kubernetes and EKS for container orchestration in the cloud.
Experience with serverless architectures using AWS Lambda.
Knowledge of message queues (e.g., SQS, Kafka) and event-driven architectures.

Education & Experience
Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
7–11 years of experience in software development with a focus on AWS cloud and microservices.

Mandatory Skill Sets: Cloud Engineer (Java + Spring Boot + AWS)
Preferred Skill Sets: Cloud Engineer (Java + Spring Boot + AWS)
Years of Experience Required: 7–11 years
Education Qualification: BE/BTech, ME/MTech, MBA, MCA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering, Master of Engineering, Master of Business Administration
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Cloud Engineering
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 33 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date

Posted 3 days ago

Apply

8.0 years

0 Lacs

Delhi, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Operations
Management Level: Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As a business application consulting generalist at PwC, you will provide consulting services for a wide range of business applications. You will leverage a broad understanding of various software solutions to assist clients in optimising operational efficiency through analysis, implementation, training, and support.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary: We are looking for a seasoned Java Backend Developer.

Responsibilities (Must Have)
- Bachelor’s degree or higher in computer science or a related field.
- Must have 8+ years of industry experience in related technologies.
- Strong Computer Science foundation (data structures, algorithms, databases, distributed systems).
- Expertise in Java software development is a must-have; minimum Java 8, with Java 11 preferred.
- Strong in Spring Boot.
- Ability to develop REST APIs.
- General understanding of SQL is needed.
- General understanding of MongoDB is needed.
- Experience with AWS.
- Understanding of container technologies (e.g., Docker, Kubernetes, Cloud Foundry, or HashiCorp Nomad/Consul/Vault).
- Practice of modern software engineering including agile methodologies, coding standards, code reviews, source control management, build processes, test automation, and CI/CD pipelines.
- Knowledge of moving code from Dev/Test to Staging and Production, and troubleshooting issues along the CI/CD pipeline.
- Solid project and client working experience.
- Must have excellent client communication skills.

Mandatory Skill Sets
- Should have experience in Kafka.
- Should have experience in Elasticsearch.
- Expertise with one or more programming languages (e.g., Golang, Python or the like); understanding of the concepts, as well as the willingness to share and grow this knowledge, is welcomed.
- Should have an understanding of framework design and modeling, and understand the impact of object model design in a large-scale multi-tenant OnDemand environment.
- Proficiency in working with Linux or macOS environments.
- Candidate should know the basics of React; project experience is not required.
- Should be able to do minimal bug fixes in the UI.
- Experience in custom plugin creation and maintenance in a private npm proxy server.
- Good to have: knowledge of RESTful APIs and GraphQL.
- Good to have: knowledge of API development with Node.js or the Spring Boot framework and any relational database management system.
- Good to have: knowledge of native mobile platforms (Android/iOS).

Preferred Skill Sets: Java Backend
Years of Experience Required: 4+
Education Qualification: BE/B.Tech/MBA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering, Bachelor of Technology
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Apache Kafka, ElasticSearch, Python (Programming Language)
Optional Skills: Java
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
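Since Kafka and Elasticsearch head the mandatory skills here, a minimal consume-and-index sketch of that pattern follows. The posting's primary stack is Java, so treat this as an illustration only; it assumes the kafka-python and elasticsearch 8.x client libraries, with illustrative topic and index names:

```python
import json

from elasticsearch import Elasticsearch
from kafka import KafkaConsumer

es = Elasticsearch("http://localhost:9200")

consumer = KafkaConsumer(
    "events",                              # illustrative topic
    bootstrap_servers="localhost:9092",
    group_id="es-indexer",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    doc = message.value
    # A stable document id keeps re-consumed messages idempotent
    es.index(index="events", id=doc.get("event_id"), document=doc)
```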

Posted 3 days ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description & Summary: We are looking for a skilled Azure Cloud Data Engineer with strong expertise in Python programming, Databricks, and advanced SQL to join our team in Noida. The candidate will be responsible for designing, developing, and optimizing scalable data solutions on the Azure cloud platform. You will play a critical role in building data pipelines and transforming complex data into actionable insights by leveraging cloud-native tools and technologies.

Level: Senior Consultant / Manager
Location: Noida
LOS:
Competency: Data & Analytics
Skill: Azure Data Engineering
Job Position Title: Azure Cloud Data Engineer with Python Programming – Senior Consultant/Manager (6+ Years)

Responsibilities:
· Design, develop, and manage scalable and secure data pipelines using Azure Databricks and Azure Data Factory.
· Write clean, efficient, and reusable code, primarily in Python, for cloud automation, data processing, and orchestration.
· Architect and implement cloud-based data solutions, integrating structured and unstructured data sources.
· Build and optimize ETL workflows and ensure seamless data integration across platforms.
· Develop data models using normalization and denormalization techniques to support OLTP and OLAP systems.
· Manage Azure-based storage solutions including Azure Data Lake and Blob Storage.
· Troubleshoot performance bottlenecks in data flows and ETL processes.
· Integrate advanced analytics and support BI use cases within the Azure ecosystem.
· Lead code reviews and ensure adherence to version control practices (e.g., Git).
· Contribute to the design and deployment of enterprise-level data warehousing solutions.
· Stay current with Azure cloud technologies and Python ecosystem updates to adopt best practices and emerging tools.

Mandatory skill sets:
· Strong Python programming skills (must-have): advanced scripting, automation, and cloud SDK experience.
· Strong SQL skills (must-have).
· Azure Databricks (must-have).
· Azure Data Factory.
· Azure Blob Storage / Azure Data Lake Storage.
· Apache Spark (hands-on experience).
· Data modeling (normalization & denormalization).
· Data warehousing and BI tools integration.
· Git (version control).
· Building scalable ETL pipelines.

Preferred skill sets (good to have):
· Understanding of OLTP and OLAP environments.
· Experience with Kafka and Hadoop.
· Azure Synapse Analytics.
· Azure DevOps for CI/CD integration.
· Agile delivery methodologies.

Years of experience required:
· 6+ years of overall experience in cloud engineering or data engineering roles, with at least 2–3 years of hands-on experience with Azure cloud services.
· Proven track record of strong Python development, with at least 2–3 years of hands-on experience.

Education qualification: BE/B.Tech/MBA/MCA
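A common Databricks building block behind the ETL and warehousing responsibilities above is an idempotent upsert into a Delta table, so incremental batches landed by ADF don't require full rewrites. A minimal sketch, assuming the delta-spark package; the mount paths and join key are illustrative:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# New batch of customer records landed by ADF (illustrative path)
updates = spark.read.parquet("/mnt/landing/customers/")

# MERGE keeps the curated Delta table current: update matches, insert the rest
target = DeltaTable.forPath(spark, "/mnt/curated/customers")
(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```

Because the merge is keyed, re-running the same batch after a failure leaves the table unchanged, which is what makes the pipeline safely retryable.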

Posted 3 days ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote

Solution Architect (India)
Work Mode: Remote/Hybrid
Required Experience: 10+ years
Shift Timing: Minimum 4 hours overlap required with US time

Role Summary:
The Solution Architect is responsible for designing robust, scalable, and high-performance AI and data-driven systems that align with enterprise goals. This role serves as a critical technical leader, bridging AI/ML, data engineering, ETL, cloud architecture, and application development. The ideal candidate will have deep experience across traditional and generative AI, including Retrieval-Augmented Generation (RAG) and agentic AI systems, along with strong fundamentals in data science, modern cloud platforms, and full-stack integration.

Key Responsibilities:
- Design and own the end-to-end architecture of intelligent systems including data ingestion (ETL/ELT), transformation, storage, modeling, inferencing, and reporting.
- Architect GenAI-powered applications using LLMs, vector databases, RAG pipelines, and agentic workflows; integrate with enterprise knowledge graphs and document repositories.
- Lead the design and deployment of agentic AI systems that can plan, reason, and interact autonomously within business workflows.
- Collaborate with cross-functional teams including data scientists, data engineers, MLOps, and frontend/backend developers to deliver scalable and maintainable solutions.
- Define patterns and best practices for traditional ML and GenAI projects, covering model governance, explainability, reusability, and lifecycle management.
- Ensure seamless integration of ML/AI systems via RESTful APIs with frontend interfaces (e.g., dashboards, portals) and backend systems (e.g., CRMs, ERPs).
- Architect multi-cloud or hybrid cloud AI solutions, leveraging services from AWS, Azure, or GCP for scalable compute, storage, orchestration, and deployment.
- Provide technical oversight for data pipelines (batch and real-time), data lakes, and ETL frameworks, ensuring secure and governed data movement.
- Conduct architecture reviews, mentor engineering teams, and drive design standards for AI/ML, data engineering, and software integration.

Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 10+ years of experience in software architecture, including at least 4 years in AI/ML-focused roles.

Required Skills:
- Expertise in machine learning (regression, classification, clustering), deep learning (CNNs, RNNs, transformers), and NLP.
- Experience with Generative AI frameworks and services (e.g., OpenAI, LangChain, Azure OpenAI, Amazon Bedrock).
- Strong hands-on Python skills, with experience in libraries such as Scikit-learn, Pandas, NumPy, TensorFlow, or PyTorch.
- Proficiency in RESTful API development and integration with frontend components (React, Angular, or similar is a plus).
- Deep experience in ETL/ELT processes using tools like Apache Airflow, Azure Data Factory, or AWS Glue.
- Strong knowledge of cloud-native architecture and AI/ML services on at least one of AWS, Azure, or GCP.
- Experience with vector databases (e.g., Pinecone, FAISS, Weaviate) and semantic search patterns.
- Experience in deploying and managing ML models with MLOps frameworks (MLflow, Kubeflow).
- Understanding of microservices architecture, API gateways, and container orchestration (Docker, Kubernetes).
- Frontend experience is good to have.
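The RAG pipelines named in this role pair an embedding model with a vector index: embed the corpus, retrieve the nearest chunks for a query, and feed them to the LLM as grounding context. A minimal retrieval sketch, assuming the faiss library; the embedder here is a random stand-in for a real embedding model, and the documents are illustrative:

```python
import faiss
import numpy as np


def embed(texts):
    """Stand-in embedder; in practice call an embedding model API."""
    rng = np.random.default_rng(0)
    return rng.random((len(texts), 384), dtype=np.float32)


docs = ["refund policy ...", "shipping times ...", "warranty terms ..."]

index = faiss.IndexFlatL2(384)   # exact L2 search over 384-dim vectors
index.add(embed(docs))           # index the document embeddings

query_vec = embed(["how do refunds work?"])
_, ids = index.search(query_vec, 2)  # retrieve the top-2 nearest chunks
context = "\n".join(docs[i] for i in ids[0])
# `context` would then be inserted into the LLM prompt for grounded generation
```

Production systems swap the flat index for an approximate one (or a managed store like Pinecone or Weaviate) once the corpus grows, but the retrieve-then-generate shape stays the same.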

Posted 3 days ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards.
Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
Work with other members of the project team to support delivery of additional project components (API interfaces).
Evaluate the performance and applicability of multiple tools against customer requirements.
Work within an Agile delivery / DevOps methodology to deliver proof of concept and production implementations in iterative sprints.
Integrate Databricks with other technologies (ingestion tools, visualization tools).

Requirements
Proven experience working as a data engineer.
Highly proficient in using the Spark framework (Python and/or Scala).
Extensive knowledge of data warehousing concepts, strategies, and methodologies.
Direct experience building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks).
Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
Experience in designing and hands-on development of cloud-based analytics solutions.
Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required.
Design and building of data pipelines using API ingestion and streaming ingestion methods.
Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
Thorough understanding of Azure Cloud infrastructure offerings.
Strong experience in common data warehouse modeling principles, including Kimball.
Working knowledge of Python is desirable.
Experience developing security models.
Databricks & Azure Big Data Architecture certification would be a plus.

Mandatory Skill Sets: ADE, ADB, ADF
Preferred Skill Sets: ADE, ADB, ADF
Years of Experience Required: 4–8 years
Education Qualification: BE, B.Tech, MCA, M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
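For the streaming-ingestion methods this posting calls out, Spark Structured Streaming on Databricks is a typical implementation. A minimal sketch, assuming a Kafka-compatible source (Azure Event Hubs exposes one) and the Spark-Kafka connector on the cluster; broker, topic, and paths are illustrative:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("stream_ingest").getOrCreate()

# Read the event stream (requires the Spark-Kafka connector on the cluster)
events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "clickstream")
          .load()
          .selectExpr("CAST(value AS STRING) AS json"))

# Land micro-batches in a Delta bronze table; the checkpoint enables
# exactly-once recovery after restarts
query = (events.writeStream.format("delta")
         .option("checkpointLocation", "/mnt/chk/clickstream")
         .start("/mnt/bronze/clickstream"))

query.awaitTermination()
```

Downstream jobs can then refine the bronze table into curated layers with the same batch transformations described earlier.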

Posted 3 days ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As a business application consulting generalist at PwC, you will provide consulting services for a wide range of business applications. You will leverage a broad understanding of various software solutions to assist clients in optimising operational efficiency through analysis, implementation, training, and support.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary: A career within….

Responsibilities
- Design, develop, and maintain scalable Java applications using Spring Boot and related technologies.
- Integrate various analytics services (e.g., Google Analytics, Power BI, Tableau, etc.) into platforms and applications.
- Collaborate with cross-functional teams to gather requirements and deliver technical solutions that align with business goals.
- Build and enhance products and platforms that support analytics capabilities, ensuring high performance and scalability.
- Write efficient, clean, and well-documented code that adheres to best practices.
- Develop and integrate RESTful APIs and microservices to support real-time data processing and analytics.
- Ensure continuous improvement by actively participating in code reviews and following best practices in development.
- Troubleshoot, debug, and resolve application issues and bugs.
- Collaborate with DevOps teams to ensure proper deployment and performance of analytics platforms in production environments.
- Stay updated with the latest industry trends and advancements in Java, Spring Boot, and analytics tools.

Required Qualifications
- Experience in Java development, with a strong emphasis on Spring Boot.
- Proven experience integrating analytics services (e.g., Google Analytics, Power BI, Tableau) into applications and platforms.
- Hands-on experience in building and optimizing products or platforms for analytics and data processing.
- Strong understanding of microservices architecture, RESTful APIs, and cloud-based deployment (e.g., AWS, Azure).
- Proficiency with relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases.
- Solid understanding of object-oriented programming, design patterns, and software architecture principles.
- Experience with version control tools like Git.
- Excellent problem-solving and debugging skills.
- Strong communication skills, with the ability to work in a collaborative, fast-paced environment.

Preferred Qualifications
- Experience with front-end technologies like JavaScript, React, or Angular is a plus.
- Knowledge of DevOps practices, CI/CD pipelines, and containerization tools (e.g., Docker, Kubernetes).
- Familiarity with big data tools and technologies such as Apache Kafka, Hadoop, or Spark.
- Experience working in an agile environment.

Mandatory skill sets: Java, Spring Boot, Kotlin
Preferred skill sets: Java, Spring Boot, Kotlin
Years of experience required: 4–8 years
Education Qualification: BE, B.Tech, MCA, M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Go Programming Language
Optional Skills: Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment, Performance Management Software {+ 16 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date

Posted 3 days ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
3+ years of experience in implementing analytical solutions using Palantir Foundry, preferably in PySpark and on hyperscaler platforms (cloud services like AWS, GCP and Azure), with a focus on building data transformation pipelines at scale.
Team management: must have experience in mentoring and managing large teams (20 to 30 people) for complex engineering programs, as well as in hiring and nurturing talent in Palantir Foundry.
Training: should have experience in creating training programs on Foundry and delivering them in a hands-on format, either offline or virtually.
At least 3 years of hands-on experience building and managing Ontologies on Palantir Foundry.
At least 3 years of experience with Foundry services: data engineering with Contour and Fusion; dashboarding and report development using Quiver (or Reports); application development using Workshop.
Exposure to Map and Vertex is a plus.
Palantir AIP experience is a plus.
Hands-on experience in data engineering and building data pipelines (code/no code) for ELT/ETL data migration, data refinement, and data quality checks on Palantir Foundry.
Hands-on experience managing the data life cycle on at least one hyperscaler platform (AWS, GCP, Azure) using managed services or containerized deployments for data pipelines.
Hands-on experience working with and building on Ontology (esp. demonstrable experience in building semantic relationships).
Proficiency in SQL, Python and PySpark, with demonstrable ability to write and optimize SQL and Spark jobs. Some experience with Apache Kafka and Airflow is a prerequisite as well.
Hands-on experience with DevOps on hyperscaler platforms and Palantir Foundry is necessary. Experience in MLOps is a plus.
Experience in developing and managing scalable architecture, and working experience in managing large data sets.
Open-source contributions (or own repositories highlighting work) on GitHub or Kaggle are a plus.
Experience with graph data and graph analysis libraries (like Spark GraphX, Python NetworkX, etc.) is a plus.
A Palantir Foundry certification (Solution Architect, Data Engineer) is a plus; the certificate should be valid at the time of interview.
Experience in developing GenAI applications is a plus.

Mandatory Skill Sets
At least 3 years of hands-on experience building and managing Ontologies on Palantir Foundry.
At least 3 years of experience with Foundry services.

Preferred Skill Sets: Palantir Foundry
Years of Experience Required: 4 to 7 years (3+ years relevant)
Education Qualification: Bachelor's degree in computer science, data science or any other engineering discipline. Master’s degree is a plus.
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Science
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Palantir (Software)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
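Foundry pipelines of the kind described here are typically written as Python transforms over Spark DataFrames. A minimal sketch, assuming Foundry's Python transforms API (transforms.api); the dataset paths and columns are illustrative:

```python
from pyspark.sql import functions as F
from transforms.api import Input, Output, transform_df


@transform_df(
    Output("/Project/datasets/clean_orders"),   # illustrative output dataset
    raw=Input("/Project/datasets/raw_orders"),  # illustrative input dataset
)
def clean_orders(raw):
    # A typical refinement step: filter bad rows, derive a typed column
    return (raw.filter(F.col("amount") > 0)
               .withColumn("order_date", F.to_date("created_at")))
```

Ontology object types would then be backed by curated datasets produced this way, with data quality checks layered on as additional transforms.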

Posted 4 days ago

Apply