4.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Specialist Software Engineer What You Will Do Let’s do this. Let’s change the world. You will play a key role as part of the Operations Generative AI (GenAI) Product team to deliver cutting-edge, innovative GenAI solutions across various Process Development functions (Drug Substance, Drug Product, Attribute Sciences & Combination Products) in Operations. The role involves developing, implementing, and sustaining GenAI solutions that help find relevant, actionable information quickly and accurately. Role Description: The Specialist Software Engineer is responsible for designing, developing, and maintaining GenAI software applications and solutions that meet business needs and ensure high availability and performance of critical systems and applications in Process Development under Operations. This role involves working closely with Data Scientists, business SMEs, and other engineers to create high-quality, scalable GenAI software solutions that help find relevant, actionable information quickly and accurately, monitoring system health, and responding to incidents to minimize downtime. Roles & Responsibilities: Take ownership of complex software projects from conception to deployment, and manage software delivery scope, risk, and timeline. Rapidly prototype concepts into working code. Provide technical guidance and mentorship to junior developers. Contribute to front-end and back-end development using cloud technology. Develop innovative solutions using generative AI technologies. Integrate with other systems and platforms to ensure seamless data flow and functionality. Conduct code reviews to ensure code quality and adherence to best practices. Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations. Analyze and understand the functional and technical requirements of applications, solutions, and systems and translate them into software architecture and design specifications. Work closely with the product team, cross-functional teams, enterprise technology teams, and QA to deliver high-quality and compliant software on time. Ensure high-quality software deliverables free of bugs and performance issues through proper design and comprehensive testing strategies. Provide ongoing support and maintenance for applications, ensuring that they operate smoothly and efficiently.
Architect and lead the development of scalable, intelligent search systems leveraging NLP, embeddings, LLMs, and vector search. Own the end-to-end lifecycle of search solutions, from ingestion and indexing to ranking, relevancy tuning, and UI integration. Integrate AI models that improve search precision, query understanding, and result summarization (e.g., generative answers via LLMs). Develop solutions for handling structured/unstructured data in AI pipelines. Partner with platform teams to deploy search solutions on scalable infrastructure (e.g., Kubernetes, Databricks). Experience in integrating Generative AI capabilities and Vision Models to enrich content quality and user engagement. What We Expect Of You We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master’s degree with 4 - 6 years of experience in Computer Science, IT or related field OR Bachelor’s degree with 6 - 8 years of experience in Computer Science, IT or related field OR Diploma with 10 - 12 years of experience in Computer Science, IT or related field. Experience in Python, Java, and AI/ML-based Python libraries (PyTorch). Experienced with web frameworks like Flask, Django, FastAPI. Experience with design patterns, data structures, data modeling, data algorithms. Familiarity with MLOps, CI/CD for ML, and monitoring of AI models in production. Experienced with the AWS/Azure platform, building and deploying code. Experience with PostgreSQL/MongoDB databases, vector databases for large language models, Databricks or RDS, S3 buckets. Experience with popular large language models. Experience with the Retrieval-Augmented Generation (RAG) framework, AI agents, vector stores, AI/ML platforms, and embedding models (e.g., OpenAI, LangChain, Redis, pgvector). Experience with prompt engineering and model fine-tuning. Experience with generative AI or retrieval-augmented generation (RAG) frameworks. Experience in Agile software development methodologies. Experience in end-to-end testing as part of Test-Driven Development. Preferred Qualifications: Strong understanding of cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker, Kubernetes). Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Splunk). Experience with data processing tools like Hadoop, Spark, or similar. Experience with the LangChain or LlamaIndex frameworks for language models. Experience with prompt engineering, model fine-tuning. Experience working on full-stack applications. Professional Certifications: AWS, Data Science certifications (preferred). Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills. What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team.
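For illustration, here is a minimal sketch of the RAG retrieval pattern these qualifications describe: embed a query, rank pre-embedded document chunks by cosine similarity, and assemble a grounded prompt. The `embed` and `call_llm` functions are hypothetical stand-ins for whatever embedding model and LLM endpoint a team actually uses (OpenAI, an internal gateway, etc.); only the retrieval math and prompt assembly are the point.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical embedding call; assumed to return a unit-normalized vector."""
    raise NotImplementedError("wire up to your embedding model")

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call (e.g., an OpenAI-compatible endpoint)."""
    raise NotImplementedError("wire up to your LLM endpoint")

def top_k_chunks(query: str, chunks: list, chunk_vecs: np.ndarray, k: int = 3) -> list:
    # chunk_vecs: (n_chunks, dim) array of pre-computed, unit-normalized embeddings.
    q = embed(query)
    scores = chunk_vecs @ q              # cosine similarity for unit vectors
    best = np.argsort(scores)[::-1][:k]  # indices of the k most similar chunks
    return [chunks[i] for i in best]

def answer(query: str, chunks: list, chunk_vecs: np.ndarray) -> str:
    context = "\n\n".join(top_k_chunks(query, chunks, chunk_vecs))
    prompt = ("Answer strictly from the context below. If the answer is not present, say so.\n\n"
              f"Context:\n{context}\n\nQuestion: {query}")
    return call_llm(prompt)
```

In production this dot-product loop is what a vector store such as pgvector or Redis performs server-side; the prompt-assembly step is where relevancy tuning and summarization behavior live.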
careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
15.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Role Summary: Solutioning lead for Data Engineering, with AWS and Snowflake as the primary stack. Role Responsibilities: Architecture and solutioning on AWS and Snowflake platforms – data warehouse, lakehouse, data fabric and data mesh. Sizing, estimation and implementation planning for solutioning. Solution prototyping, advisory, and orchestrating in-person/remote workshops. Work with hyperscalers and platform vendors to understand and test platform roadmaps and develop joint solutions. Own end-to-end solutions working across various teams in Cognizant – Sales, Delivery and Global Solutioning. Own key accounts as architecture advisory and establish deep client relationships. Contribute to the practice by developing reusable assets and solutions. Job Requirements: Bachelor’s or Master’s degree in computer science, engineering, information systems or a related field. Minimum 15 years’ experience as a Solution Architect designing and developing data architecture patterns. Minimum 5 years’ hands-on experience building AWS- and Snowflake-based solutions. Minimum 3 years’ experience as a Solution Architect in a pre-sales team, driving the sales process from a technical solution standpoint. Excellent verbal and written communication skills, with the ability to present complex cloud data architecture concepts to technical and executive audiences (leveraging PPTs, demos and whiteboards). Deep expertise in designing AWS and Snowflake solutions. Strong expertise in handling large and complex RFPs/RFIs and collaborating with multiple service lines and platform vendors in a fast-paced environment. Strong relationship-building skills and ability to provide technical advisory and guidance. Technology architecture and implementation experience, with deep hands-on experience with data solutions. 15–20 years of experience in Data Engineering, including 5+ years of data engineering experience on the cloud. Technology pre-sales experience – architecture, effort sizing, estimation and solution defense. Data architecture patterns – data warehouse, data lake, data mesh, lakehouse, data as a product. Develop or co-develop proofs of concept and prototypes with customer teams. Excellent understanding of distributed computing fundamentals. Experience working with one or more major cloud vendors. Deep expertise in end-to-end pipeline (ETL) development following best practices, including orchestration and optimization of data pipelines. Strong understanding of the full CI/CD lifecycle. Experience with large legacy migrations (Hadoop, Teradata and the like) to cloud data platforms. Expert-level proficiency in engineering and optimizing the various data engineering ingestion patterns – batch, micro-batch, streaming and API. Understanding of the imperatives of change data capture, with a point of view on tools and best practices. Architect and solution data governance capability pillars supporting a modern data ecosystem. Data services and various consumption archetypes, including semantic layers, BI tools and AI/ML. Thought leadership designing self-service data engineering platforms and solutions. Core platform – AWS and Snowflake: Ability to engage and offer differing points of view on a customer's architecture using the AWS and Snowflake platforms. Strong understanding of the Snowflake platform, including evolving services like Snowpark. Implementation expertise using AWS services – EMR, Redshift, Glue, Kinesis, Lambda, AWS Lake Formation – and Snowflake. Security design and implementation on AWS and Snowflake. Pipeline development in multi-hop pipeline architectures. Architecture and implementation experience with Spark and Snowflake performance tuning, including topics such as cluster sizing.
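As a deliberately simplified illustration of one batch-ingestion hop on this AWS-plus-Snowflake stack, the sketch below stages S3 files and bulk-loads them with COPY INTO via the Snowflake Python connector. All connection parameters, stage URLs, and object names are placeholders, not a prescribed setup.

```python
import snowflake.connector

# Placeholder credentials; a real job would pull these from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="...",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()
try:
    # Stage pointing at the S3 prefix where upstream extracts land (placeholder bucket/integration).
    cur.execute("""
        CREATE STAGE IF NOT EXISTS landing_orders
        URL = 's3://my-landing-bucket/orders/'
        STORAGE_INTEGRATION = s3_int
    """)
    # Idempotent bulk load: Snowflake tracks and skips files it has already ingested.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @landing_orders
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    cur.close()
    conn.close()
```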
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Your Skills & Experience: Strong expertise in Data Engineering is highly recommended. Overall experience of 4+ years of relevant experience in Big Data technologies. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required in building end-to-end data pipelines; working knowledge of real-time data pipelines is an added advantage. Strong experience in at least one of the programming languages Java, Scala or Python, with Java preferable. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery etc. Well-versed and working knowledge of data platform related services on Azure/GCP. Bachelor’s degree and 4+ years of work experience, or any combination of education, training and/or experience that demonstrates the ability to perform the duties of the position.
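To make the end-to-end pipeline idea above concrete, here is a hedged sketch of one real-time hop on this stack: Kafka into Spark Structured Streaming into partitioned files. The broker, topic, schema, and paths are illustrative placeholders only.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("clickstream-ingest").getOrCreate()

# Expected shape of each Kafka message payload (placeholder fields).
schema = (StructType()
          .add("user_id", StringType())
          .add("amount", DoubleType())
          .add("event_time", TimestampType()))

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
          .option("subscribe", "events")                     # placeholder topic
          .load()
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/events")               # placeholder sink
         .option("checkpointLocation", "hdfs:///chk/events")  # checkpoint enables recovery
         .start())
query.awaitTermination()
```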
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Sr. Software Development Engineer (Big Data Engineer) Overview Job Description Summary Mastercard is a technology company in the Global Payments Industry. We operate the world’s fastest payments processing network, connecting consumers, financial institutions, merchants, governments and businesses in more than 210 countries and territories. Mastercard products and solutions make everyday commerce activities – such as shopping, travelling, running a business and managing finances – easier, more secure and more efficient for everyone. Mastercard’s Data & Services team is a key differentiator for Mastercard, providing cutting-edge services that help our customers grow. Focused on thinking big and scaling fast around the globe, this dynamic team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business experimentation, and data-driven information and risk management services. We are currently seeking a Software Development Engineer-II for the Location Program within the Data & Services group. You will own end-to-end delivery of engineering projects for some of our analytics and BI solutions that leverage Mastercard datasets combined with proprietary analytics techniques, to help businesses around the world solve multi-million dollar business problems. Roles And Responsibilities Work as a member of the support team to resolve issues related to the product; should have good troubleshooting skills and good knowledge of support work. Independently apply problem-solving skills to identify symptoms and root causes of issues. Make effective and efficient decisions even when data is ambiguous. Provide technical guidance, support and mentoring to more junior team members. Make active contributions to improvement decisions and make technology recommendations that balance business needs and technical requirements. Proactively understand stakeholder needs, goals, expectations and viewpoints to deliver results. Ensure design thinking accounts for long-term maintainability of code. Thrive in a highly collaborative company environment where agility is paramount. Stay up to date with the latest technologies and technical advancements through self-study, blogs, meetups, conferences, etc. Perform system maintenance, production incident problem management, identification of root cause and issue remediation. All About You Bachelor's degree in Information Technology, Computer Science or Engineering, or equivalent work experience, with a proven track record of successfully delivering on complex technical assignments. A solid foundation in Computer Science fundamentals, web applications and microservices-based software architecture. Full-stack development experience, including databases (Oracle, Netezza, SQL Server) and hands-on experience with Hadoop, Python, Impala, etc.
Excellent SQL skills, with experience working with large and complex data sources and the capability to comprehend and write complex queries. Experience working in Agile teams and conversant with Agile/SAFe tenets and ceremonies. Strong analytical and problem-solving abilities, with quick adaptation to new technologies, methodologies, and systems. Excellent English communication skills (both written and verbal) to effectively interact with multiple technical teams and other stakeholders. High-energy, detail-oriented and proactive, with the ability to function under pressure in an independent environment, along with a high degree of initiative and self-motivation to drive results. Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach; and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-240980
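As an illustration of the "complex queries" this role expects, the sketch below runs a windowed SQL query (per-merchant running totals and volume shares) against an in-memory SQLite database with made-up rows. It assumes SQLite 3.25+ for window-function support; the same SQL pattern applies on Oracle, Netezza, or Impala.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txn (merchant TEXT, txn_date TEXT, amount REAL);
    INSERT INTO txn VALUES
        ('acme', '2024-01-01', 120.0),
        ('acme', '2024-01-02',  80.0),
        ('zen',  '2024-01-01', 200.0),
        ('zen',  '2024-01-03',  50.0);
""")
-- = SQL comments below; window functions compute per-partition aggregates per row.
rows = conn.execute("""
    SELECT merchant,
           txn_date,
           SUM(amount) OVER (PARTITION BY merchant ORDER BY txn_date) AS running_total,
           amount * 1.0 / SUM(amount) OVER (PARTITION BY merchant)    AS share_of_volume
    FROM txn
    ORDER BY merchant, txn_date
""").fetchall()
for r in rows:
    print(r)
```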
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
An extraordinarily talented group of individuals work together every day to drive TNS' success, from both professional and personal perspectives. Come join the excellence! Overview The QA area is responsible for the testing of computer programming systems and applications, including the design, coding, testing and deployment of products and solutions that meet the goals of TNS' products. Responsibilities: Develops and executes test strategies/scenarios for Financial Exchange industry applications/infrastructure. Develops and maintains automated regression test suites. Works as part of a scrum team in an agile environment. Works with operations to ensure smooth software deployments in production. Possesses the ability to work with minimal supervision and/or independently. Utilizes experience and judgment to plan and accomplish goals. Writes or modifies scripts in Perl or UNIX to automate repetitive tests. Creates tests and defect reports using the team standards and best practices. Qualifications Experience Required: 5+ years of overall testing experience in developing and automating test cases. Extensive experience with web application test automation using Selenium and TestNG. Experience writing UNIX shell, Python or Perl scripts. Experience in REST API testing. Experience in Oracle Database and SQL queries. Experience with NoSQL databases like HBase, Druid. Load/performance testing experience in web applications and REST APIs using tools such as JMeter. Experience with Hadoop-based frameworks/applications is a plus. Experience in testing web-based applications using Tomcat and JBoss. Experienced in testing UNIX or Linux client-server applications, including testing application installation/configuration and troubleshooting. Experience with build integration tools such as Jenkins. Experience with code coverage tools. Experience with software version control such as GitLab, Git and/or SVN. Ability to create high-level and detailed test plans and test reports. Excellent written and verbal communication skills. Experience Desired Experience with web application security testing. Experience in continuous application integration tools. Experience in any scripting language is a plus. Education Qualifications BS/MS degree in Engineering, Computer Science, or related technical field of study. If you are passionate about technology, love personal growth and opportunity, come see what TNS is all about! TNS is an equal opportunity employer. TNS evaluates qualified applicants without regard to race, color, religion, gender, national origin, age, sexual orientation, gender identity or expression, protected veteran status, disability/handicap status or any other legally protected characteristic.
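For flavor, here is a minimal Python take on the web-test automation this posting describes (the listed stack is Selenium with TestNG on the Java side; the Selenium API is near-identical in Python and pairs naturally with pytest). The URL and element locators are hypothetical, and a local Chrome/chromedriver setup is assumed.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

def test_login_flow():
    driver = webdriver.Chrome()  # assumes a local Chrome/chromedriver setup
    try:
        driver.get("https://app.example.com/login")                   # placeholder URL
        assert "Login" in driver.title
        driver.find_element(By.ID, "username").send_keys("qa_user")   # placeholder locator
        driver.find_element(By.ID, "password").send_keys("secret")    # placeholder locator
        driver.find_element(By.ID, "submit").click()
        # Regression assertion: landing page widget is rendered after login.
        assert driver.find_element(By.CSS_SELECTOR, ".dashboard").is_displayed()
    finally:
        driver.quit()
```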
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
DESCRIPTION The Digital Acceleration (DA) team in India is seeking a talented, self-driven Applied Scientist to work on prototyping, optimizing, and deploying ML algorithms for solving Digital business problems. Key job responsibilities Research, experiment and build proofs of concept advancing the state of the art in AI & ML. Collaborate with cross-functional teams to architect and execute technically rigorous AI projects. Thrive in dynamic environments, adapting quickly to evolving technical requirements and deadlines. Engage in effective technical communication (written & spoken) with coordination across teams. Conduct thorough documentation of algorithms, methodologies, and findings for transparency and reproducibility. Publish research papers in internal and external venues of repute. Support on-call activities for critical issues. Basic Qualifications Experience building machine learning models or developing algorithms for business applications. PhD, or a Master's degree and experience in CS, CE, ML or related field. Knowledge of programming languages such as C/C++, Python, Java or Perl. Experience in any of the following areas: algorithms and data structures, parsing, numerical optimization, data mining, parallel and distributed computing, high-performance computing. Proficiency in coding and software development, with a strong focus on machine learning frameworks. Understanding of relevant statistical measures such as confidence intervals, significance of error measurements, development and evaluation data sets, etc. Excellent communication skills (written & spoken) and ability to collaborate effectively in a distributed, cross-functional team setting. Preferred Qualifications 5+ years of experience building machine learning models or developing algorithms for business applications. Publications at top-tier peer-reviewed conferences or journals. Track record of diving into data to discover hidden patterns and conducting error/deviation analysis. Ability to develop experimental and analytic plans for data modeling processes, use of strong baselines, and ability to accurately determine cause-and-effect relations. Exceptional level of organization and strong attention to detail. Comfortable working in a fast-paced, highly collaborative, dynamic work environment. BASIC QUALIFICATIONS 5+ years of experience building machine learning models for business applications. PhD, or Master's degree and 10+ years of applied research experience. Experience programming in Java, C++, Python or related language. Experience with neural deep learning methods and machine learning. PREFERRED QUALIFICATIONS Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, numpy, scipy etc. Experience with large-scale distributed systems such as Hadoop, Spark etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI MAA 15 SEZ Job ID: A2848605
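As a worked example of one statistical measure the qualifications mention, a confidence interval on an evaluation metric, the sketch below bootstraps a 95% interval around held-out accuracy. Synthetic data only; this is not any team's actual evaluation protocol.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic classification problem with a held-out evaluation set.
X, y = make_classification(n_samples=2000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
correct = (model.predict(X_te) == y_te).astype(float)  # per-example 0/1 outcomes

# Bootstrap: resample the held-out outcomes with replacement and recompute accuracy.
rng = np.random.default_rng(0)
boot = [correct[rng.integers(0, len(correct), len(correct))].mean()
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"accuracy={correct.mean():.3f}, 95% CI=({lo:.3f}, {hi:.3f})")
```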
Posted 1 week ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Splunk, a Cisco company, is driving the future of digital resilience with a powerful, unified security and observability platform built for hybrid, multi-cloud environments. Our technology is trusted by the world’s leading organizations—but what truly sets us apart is our people. We celebrate individuality, curiosity, and purpose. If you’re a builder at heart with a passion for high-scale, mission-critical systems and a background in cybersecurity, we’d love to meet you. The Role As a Senior Software Engineer – Fullstack, you’ll bring end-to-end engineering expertise with a strong emphasis on scalable backend development and distributed systems, while also contributing to frontend enhancements and UI fixes. You’ll help develop intelligent, ML-driven security features and ensure our cloud-native applications are secure, performant, and resilient at scale. Working from India, you’ll collaborate with global engineering teams to build enterprise-grade solutions used by some of the world’s largest organizations. What You'll Do Design and build robust backend components for large-scale, distributed cybersecurity platforms using Java, Scala, Python, and Node.js. Tackle frontend development for smaller features and bug fixes using JavaScript and jQuery. Troubleshoot production issues across the full stack—from the database to the UI—and partner with customers and stakeholders to resolve them efficiently. Partner closely with customers to identify and resolve infrastructure pain points, while elevating data clarity and the value of delivered security insights. Build and maintain machine learning-driven applications that leverage big data technologies like Spark, Hadoop, Hive, and Impala for real-time cybersecurity insights. Develop and maintain CI/CD pipelines using GitLab, with automation for building, testing, and deploying secure, high-quality software. Write automated tests, drive code coverage, and conduct performance testing to ensure application reliability. Work on security compliance initiatives including FIPS 140-2/3 and STIG requirements. Collaborate across SRE, infrastructure, data, and security teams to improve performance, scalability, and observability. Monitor and analyze production systems using Splunk SPL and other observability tools. What You Bring 8+ years of software development experience, with deep backend expertise and full-stack exposure. Strong programming skills in Java, Scala, Python, Node.js, and scripting languages like Shell. Hands-on experience with Linux environments including Red Hat, Ubuntu, and Oracle Enterprise Linux. Proven experience with distributed systems, Kafka, Zookeeper, and Protobuf. Expertise in containerization and orchestration using Docker and Kubernetes. Proficient with GitLab CI/CD, infrastructure automation, and test frameworks. Familiarity with frontend technologies including JavaScript and jQuery. Understanding of security frameworks such as FIPS and STIG. Strong analytical and communication skills; able to explain complex issues to technical and non-technical audiences. Agile development experience and ability to work effectively across global, cross-functional teams. Nice to Have Familiarity with Splunk SPL and other observability tools. Experience developing ML-based applications in the cybersecurity space. Exposure to performance tuning, incident response, and monitoring best practices in production environments. We value diversity, equity, and inclusion at Splunk and are an equal employment opportunity employer.
Qualified applicants receive consideration for employment without regard to race, religion, color, national origin, ancestry, sex, gender, gender identity, gender expression, sexual orientation, marital status, age, physical or mental disability or medical condition, genetic information, veteran status, or any other consideration made unlawful by federal, state, or local laws. We consider qualified applicants with criminal histories, consistent with legal requirements.
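To ground the Kafka and distributed-systems expectations above, here is a hedged sketch of a minimal backend consumer loop using the confluent-kafka Python client. Broker, group, and topic names are placeholders, and a real handler would deserialize the payload (e.g., Protobuf) and enrich the event rather than print it.

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker:9092",   # placeholder broker
    "group.id": "security-detections",    # placeholder consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["security-events"])   # placeholder topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)  # block up to 1s for the next record
        if msg is None:
            continue
        if msg.error():
            print("consumer error:", msg.error())
            continue
        # Real handlers would deserialize (e.g., Protobuf) and enrich the event here.
        print(msg.topic(), msg.partition(), msg.value()[:80])
finally:
    consumer.close()                      # commits offsets and leaves the group cleanly
```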
Posted 1 week ago
5.5 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the process of continuous improvement and optimising of the managed services process, tools and services. Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Respond effectively to the diverse perspectives, needs, and feelings of others. Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems. Use critical thinking to break down complex concepts. Understand the broader objectives of your project or role and how your work fits into the overall strategy. Develop a deeper understanding of the business context and how it is changing. Use reflection to develop self-awareness, enhance strengths and address development areas. Interpret data to inform insights and recommendations. Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements. Role: Senior Associate Tower: Data, Analytics & Specialist Managed Service Experience: 5.5 - 10 years Educational Qualification: BE / B Tech / ME / M Tech / MBA Work Location: Bangalore Job Description As a Senior Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket quality and deliverables review, status reporting for the project. Adherence to SLAs, experience in incident management, change management and problem management. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working with clients directly and leading the engagement. Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
Good team player; take up cross-competency work and contribute to CoE activities. Escalation/risk management. Position Requirements Required Skills: AWS Cloud Engineer Job description: The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas: A minimum of 6 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms. A minimum of 3-5 years of Operate/Managed Services/production support experience. Extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modeling, data scientists, etc. Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes. Experience in building efficient ETL/ELT processes using industry-leading tools like AWS Glue, AWS Lambda, AWS DMS, PySpark, SQL, Python, DBT, Prefect, Snowflake, etc. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in AWS. Work together with data scientists and analysts to understand the needs for data and create effective data workflows. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitor and troubleshoot data pipelines, resolving issues related to data processing, transformation, or storage. Implement and maintain data security and privacy measures, including access controls and encryption, to protect sensitive data. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Experience in building and maintaining Data Governance solutions (data quality, metadata management, lineage, master data management and data security) using industry-leading tools. Scaling and optimizing schema and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments. Hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake etc. Experience with ITIL processes like incident management, problem management, knowledge management, release management, Data DevOps etc. Strong communication, problem-solving, quantitative and analytical abilities. Nice To Have AWS certification Managed Services - Data, Analytics & Insights Managed Service At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our client’s enterprise through technology and human-enabled experiences.
Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC’s Managed Services, our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today’s dynamic business environment. Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients’ Data and Analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced work environment, capable of working on a mix of critical Data, Analytics & Insights offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
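For illustration, the sketch below shows the shape of one ETL step described in this role: raw CSV pulled from S3, standardized, and published as partitioned Parquet for downstream BI. It is written as plain PySpark (AWS Glue jobs typically wrap the same logic in a GlueContext), and the bucket names and columns are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date, trim

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw landing zone (placeholder bucket/prefix).
raw = spark.read.option("header", True).csv("s3://raw-zone/orders/")

# Transform: standardize types, trim keys, drop incomplete and duplicate rows.
clean = (raw
         .withColumn("order_date", to_date(col("order_date"), "yyyy-MM-dd"))
         .withColumn("customer_id", trim(col("customer_id")))
         .dropna(subset=["order_id", "order_date"])
         .dropDuplicates(["order_id"]))

# Load: partitioned Parquet in the curated zone (placeholder bucket/prefix).
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://curated-zone/orders/"))
```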
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience. Data Modeler Job Description We are looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise include: Analyze and translate business needs into long-term solution data models. Evaluate existing data systems and recommend improvements. Define rules to translate and transform data across data models. Work with the development team to create conceptual data models and data flows. Develop best practices for data coding to ensure consistency within the system. Review modifications of existing systems for cross-compatibility. Implement data strategies and develop physical data models. Update and optimize local and metadata models. Utilize canonical data modeling techniques to enhance data system efficiency. Evaluate implemented data systems for variances, discrepancies, and efficiency. Troubleshoot and optimize data systems to ensure optimal performance. Strong expertise in relational and dimensional modeling (OLTP, OLAP). Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner). Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL). Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures. Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI). Familiarity with ETL processes, data integration, and data governance frameworks. Strong analytical, problem-solving, and communication skills. Qualifications Bachelor's degree in Engineering or a related field. 3 to 5 years of experience in data modeling or a related field. 4+ years of hands-on experience with dimensional and relational data modeling. Expert knowledge of metadata management and related tools. Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid. Knowledge of transactional databases and data warehouses. Preferred Skills Experience in cloud-based data solutions (AWS, Azure, GCP). Knowledge of big data technologies (Hadoop, Spark, Kafka). Understanding of graph databases and real-time data processing. Certifications in data management, modeling, or cloud data engineering. Excellent communication and presentation skills. Strong interpersonal skills to collaborate effectively with various teams.
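As a toy illustration of what dimensional modeling produces, the sketch below builds a minimal star schema (one fact table keyed to two dimensions) in SQLite and runs a typical roll-up join. The names are illustrative, not a recommended model for any client; the same physical design scales up in Snowflake, Redshift, or BigQuery.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key   INTEGER PRIMARY KEY,  -- surrogate key, e.g., 20240101
        full_date  TEXT,
        month      INTEGER
    );
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY, -- surrogate key
        sku         TEXT,
        category    TEXT
    );
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        qty         INTEGER,
        revenue     REAL
    );
""")
-- The canonical analytical query shape: fact joined to dimensions, then rolled up.
rows = conn.execute("""
    SELECT d.month, p.category, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
""").fetchall()
```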
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for an immediate joiner and experienced Big Data Developer with a strong background in Kafka, PySpark, Python/Scala, Spark, SQL, and the Hadoop ecosystem. The ideal candidate should have over 5 years of experience and be ready to join immediately. This role requires hands-on expertise in big data technologies and the ability to design and implement robust data processing solutions. Responsibilities Design, develop, and maintain scalable data processing pipelines using Kafka, PySpark, Python/Scala, and Spark. Work extensively with the Kafka and Hadoop ecosystem, including HDFS, Hive, and other related technologies. Write efficient SQL queries for data extraction, transformation, and analysis. Implement and manage Kafka streams for real-time data processing. Utilize scheduling tools to automate data workflows and processes. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Ensure data quality and integrity by implementing robust data validation processes. Optimize existing data processes for performance and scalability. Requirements Experience with GCP. Knowledge of data warehousing concepts and best practices. Familiarity with machine learning and data analysis tools. Understanding of data governance and compliance standards. This job was posted by Arun Kumar K from krtrimaIQ Cognitive Solutions.
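To make the data-validation responsibility above concrete, here is a hedged PySpark sketch that splits a batch into valid and quarantined rows and fails loudly past a threshold. The column names, paths, and the 5% gate are illustrative.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("dq-check").getOrCreate()
df = spark.read.parquet("hdfs:///landing/orders")   # placeholder source

# Validation rules: required keys present, amounts non-negative.
rules = (col("order_id").isNotNull()
         & col("amount").isNotNull()
         & (col("amount") >= 0))

valid, invalid = df.filter(rules), df.filter(~rules)
bad_ratio = invalid.count() / max(df.count(), 1)

# Quarantine bad rows for later inspection instead of silently dropping them.
invalid.write.mode("append").parquet("hdfs:///quarantine/orders")  # placeholder sink
if bad_ratio > 0.05:  # illustrative quality gate
    raise ValueError(f"data quality gate failed: {bad_ratio:.1%} invalid rows")
valid.write.mode("overwrite").parquet("hdfs:///curated/orders")
```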
Posted 1 week ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Work Your Magic with us! Ready to explore, break barriers, and discover more? We know you’ve got big plans – so do we! Our colleagues across the globe love innovating with science and technology to enrich people’s lives with our solutions in Healthcare, Life Science, and Electronics. Together, we dream big and are passionate about caring for our rich mix of people, customers, patients, and planet. That's why we are always looking for curious minds that see themselves imagining the unimaginable with us. Your Role As a Sr Data Engineer in the Digital & Data team, you will work hands-on to deliver and maintain the pipelines required by the business functions to derive value from their data. For this, you will bring data from a varied landscape of source systems into our cloud-based analytics stack and implement necessary cleaning and pre-processing steps in close collaboration with our business customers. Furthermore, you will work closely together with our teams to ensure that all data assets are governed according to the FAIR principles. To keep the engineering team scalable, you and your peers will create reusable components, libraries, and infrastructure that will be used to accelerate the pace with which future use-cases can be delivered. You will be part of a team dedicated to delivering state-of-the-art solutions for enabling data analytics use cases across the Healthcare sector of a leading, global Science & Technology company. As such, you will have the unique opportunity to gain insight into our diverse business functions allowing you to expand your skills in various technical, scientific, and business domains. Working in a project-based way covering a multitude of data domains and technological stacks, you will be able to significantly develop your skills and experience as a Data Engineer. Who You Are BE/M.Sc./PhD in Computer Science or related field and 8+ years of work experience in a relevant capacity. Experience in working with cloud environments such as Hadoop, AWS, GCP, and Azure. Experience with enforcing security controls and best practices to protect sensitive data within AWS data pipelines, including encryption, access controls, and auditing mechanisms. Agile mindset, a spirit of initiative, and desire to work hands-on together with your team. Interest in solving challenging technical problems and developing the future data architecture that will enable the implementation of innovative data analytics use-cases. Experience in leading small to medium-sized teams. Experience in creating architectures for ETL processes for batch as well as streaming ingestion. Knowledge of designing and validating software stacks for GxP-relevant contexts as well as working with PII data. Familiarity with the data domains covering the Pharma value-chain (e.g. research, clinical, regulatory, manufacturing, supply chain, and commercial). Strong, hands-on experience in working with Python, PySpark & R codebases; proficiency in additional programming languages (e.g. C/C++, Rust, TypeScript, Java, …) is expected.
Experience working with Apache Spark and the Hadoop ecosystem. Experience working with heterogeneous compute environments and multi-platform setups. Basic knowledge of statistics and machine learning algorithms is favorable. This is the respective role description: The ability to easily find, access, and analyze data across an organization is key for every modern business to be able to efficiently make decisions, optimize processes, and create new business models. The Data Architect plays a key role in unlocking this potential by defining and implementing a harmonized data architecture for Healthcare. What we offer: We are curious minds that come from a broad range of backgrounds, perspectives, and life experiences. We celebrate all dimensions of diversity and believe that it drives excellence and innovation, strengthening our ability to lead in science and technology. We are committed to creating access and opportunities for all to develop and grow at your own pace. Join us in building a culture of inclusion and belonging that impacts millions and empowers everyone to work their magic and champion human progress! Apply now and become a part of our diverse team!
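One way to build the reusable pipeline components this role describes is to express cleaning steps as small named functions chained with PySpark's DataFrame.transform, as in the sketch below. The specific steps, columns, and paths are illustrative.

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql.functions import col, lower, trim

def normalize_ids(df: DataFrame) -> DataFrame:
    """Reusable step: canonicalize identifier casing and whitespace."""
    return df.withColumn("sample_id", lower(trim(col("sample_id"))))

def drop_incomplete(df: DataFrame) -> DataFrame:
    """Reusable step: discard rows missing required fields."""
    return df.dropna(subset=["sample_id", "measured_at"])

spark = SparkSession.builder.appName("reusable-steps").getOrCreate()
raw = spark.read.parquet("s3://source-zone/assays/")    # placeholder source

# Steps compose declaratively, so each one can be unit-tested and reused.
curated = raw.transform(normalize_ids).transform(drop_incomplete)
curated.write.mode("overwrite").parquet("s3://curated-zone/assays/")  # placeholder sink
```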
Posted 1 week ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description The Amazon IN Platform Development team is looking to hire a rock star Data/BI Engineer to build for pan-Amazon India businesses. Amazon India is at the core of the hustle @ Amazon WW today, and the team is chartered with democratizing data access for the entire marketplace and adding productivity. That translates to owning the processing of every Amazon India transaction, for which the team is organized to have dedicated business owners and processes for each focus area. The BI Engineer will play a key role in contributing to the success of each focus area by partnering with respective business owners and leveraging data to identify areas of improvement and optimization. He/she will build deliverables like business process automation, payment behavior analysis, campaign analysis, fingertip metrics, failure prediction, etc. that give business decision-making an edge and can scale with growth. The role sits in the sweet spot between the technology and business worlds and provides opportunity for growth, high business impact, and working with seasoned business leaders. An ideal candidate will be someone with a sound technical background in the data domain – storage/processing/analytics – solid business acumen, and a strong automation/solution-oriented thought process. Will be a self-starter who can start with a business problem and work backwards to conceive and devise the best possible solution. Is a great communicator, at ease partnering with business owners and other internal/external teams. Can explore newer technology options, if need be, and has a high sense of ownership over every deliverable by the team. Is constantly obsessed with customer delight and business impact/end result, and ‘gets it done’ in business time. Key job responsibilities Design, implement, and support a data infrastructure for the analytics needs of a large organization. Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies. Be enthusiastic about building deep domain knowledge about Amazon’s business. Must possess strong verbal and written communication skills, be self-driven, and deliver high-quality results in a fast-paced environment. Enjoy working closely with your peers in a group of very smart and talented engineers. Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency. About The Team The India Data Engineering and Analytics (IDEA) team is the central data engineering team for Amazon India. Our vision is to simplify and accelerate data-driven decision making for Amazon India by providing cost-effective, easy and timely access to high-quality data. We achieve this by building UDAI (Unified Data & Analytics Infrastructure for Amazon India), which serves as a central data platform and provides data engineering infrastructure, ready-to-use datasets and self-service reporting capabilities. Our core responsibilities towards the India marketplace include a) providing systems (infrastructure) and workflows that allow ingestion, storage, processing and querying of data, b) building ready-to-use datasets for easy and faster access to the data, c) automating standard business analysis/reporting/dashboarding, and d) empowering the business with self-service tools for deep dives and insights seeking.
Basic Qualifications 1+ years of data engineering experience. Experience with SQL. Experience with data modeling, warehousing and building ETL pipelines. Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala). Experience with one or more scripting languages (e.g., Python, KornShell). Preferred Qualifications Experience with big data technologies such as Hadoop, Hive, Spark, EMR. Knowledge of AWS infrastructure. Knowledge of the basics of designing and implementing a data schema, like normalization and the relational model vs the dimensional model. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - Karnataka Job ID: A2984267
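As a small, self-contained illustration of the "fingertip metrics" style of deliverable this posting describes, the pandas sketch below rolls synthetic transactions up into a daily metrics table. Real inputs would come from the team's curated datasets rather than inline literals.

```python
import pandas as pd

# Synthetic transactions standing in for curated marketplace data.
txns = pd.DataFrame({
    "order_date": pd.to_datetime(["2024-01-01", "2024-01-01", "2024-01-02"]),
    "payment_method": ["upi", "cod", "upi"],
    "amount": [499.0, 1299.0, 250.0],
})

# Daily order count and gross merchandise value per payment method.
daily = (txns
         .groupby(["order_date", "payment_method"])
         .agg(orders=("amount", "size"), gmv=("amount", "sum"))
         .reset_index())
print(daily)
```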
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Ninja Van is a late-stage logtech startup that is disrupting a massive industry with innovation and cutting-edge technology. Launched in 2014 in Singapore, we have grown rapidly to become one of Southeast Asia's largest and fastest-growing express logistics companies. Since our inception, we’ve delivered to 100 million different customers across the region with added predictability, flexibility and convenience. Join us in our mission to connect shippers and shoppers across Southeast Asia to a world of new possibilities. More About Us We process 250 million API requests and 3TB of data every day. We deliver more than 2 million parcels every day. 100% network coverage with 2600+ hubs and stations in 6 SEA markets (Singapore, Malaysia, Indonesia, Thailand, Vietnam and Philippines), reaching 500 million consumers. 2 million active shippers in all e-commerce segments, from the largest marketplaces to individual social commerce sellers. Raised more than US$500 million over five rounds. We are looking for world-class talent to join our crack team of engineers, product managers and designers. We want people who are passionate about creating software that makes a difference to the world. We like people who are brimming with ideas and who take initiative rather than wait to be told what to do. We prize team-first mentality, personal responsibility and tenacity to solve hard problems and meet deadlines. As part of a small and lean team, you will have a very direct impact on the success of the company. Roles & Responsibilities Design, develop, and maintain Ninja Van's infrastructure for data streaming, processing, and storage. Build tools to ensure effective maintenance and monitoring of the data infrastructure. Contribute to key architectural decisions for data pipelines and lead the implementation of major initiatives. Collaborate with stakeholders to deliver scalable and high-performance solutions for data requirements, including extraction, transformation, and loading (ETL) from diverse data sources. Enhance the team's data capabilities by sharing knowledge, enforcing best practices, and promoting data-driven decision-making. Develop and enforce Ninja Van's data retention policies and backup strategies, ensuring data is stored redundantly and securely. Requirements Solid computer science fundamentals, excellent problem-solving skills, and a strong understanding of distributed computing principles. At least 8 years of experience in a similar role, with a proven track record of building scalable and high-performance data infrastructure using Python, PySpark, Spark, and Airflow. Expert-level SQL knowledge and extensive experience working with both relational and NoSQL databases. Advanced knowledge of Apache Kafka, along with demonstrated proficiency in Hadoop v2, HDFS, and MapReduce. Hands-on experience with stream-processing systems (e.g., Storm, Spark Streaming), big data querying tools (e.g., Pig, Hive, Spark), and data serialization frameworks (e.g., Protobuf, Thrift, Avro). [Good to have] Familiarity with infrastructure-as-code technologies like Terraform, Terragrunt, Ansible, or Helm. Don’t worry if you don’t have this experience—what matters is your interest in learning! [Good to have] Experience with Change Data Capture (CDC) technologies such as Maxwell or Debezium. Bachelor’s or Master’s degree in Computer Science or a related field from a top university.
Tech Stack Backend: Play (Java 8+), Golang, Node.js, Python, FastAPI Frontend: AngularJS, ReactJS Mobile: Android, Flutter, React Native Cache: Hazelcast, Redis Data storage: MySQL, TiDB, Elasticsearch, Delta Lake Infrastructure monitoring: Prometheus, Grafana Orchestrator: Kubernetes Containerization: Docker, Containerd Cloud Provider: GCP, AWS Data pipelines: Apache Kafka, Spark Streaming, Maxwell/Debezium, PySpark, TiCDC Workflow manager: Apache Airflow Query engines: Apache Spark, Trino Submit a job application By applying to the job, you acknowledge that you have read, understood and agreed to our Privacy Policy Notice (the “Notice”) and consent to the collection, use and/or disclosure of your personal data by Ninja Logistics Pte Ltd (the “Company”) for the purposes set out in the Notice. In the event that your job application or personal data was received from any third party pursuant to the purposes set out in the Notice, you warrant that such third party has been duly authorised by you to disclose your personal data to us for the purposes set out in the Notice.
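For illustration, here is a hedged sketch of the kind of Airflow DAG this role would own: a daily extract-then-load chain in Airflow 2.x style (the `schedule` argument assumes 2.4+; older versions use `schedule_interval`). Task bodies are stubs and the IDs and schedule are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull yesterday's parcel events")   # stub for the real extraction logic

def load():
    print("write curated partition")          # stub for the real load logic

with DAG(
    dag_id="parcel_events_daily",   # placeholder
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load             # load runs only after extract succeeds
```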
Posted 1 week ago
2.0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
At Elanco (NYSE: ELAN) – it all starts with animals! As a global leader in animal health, we are dedicated to innovation and delivering products and services to prevent and treat disease in farm animals and pets. We’re driven by our vision of ‘Food and Companionship Enriching Life’ and our approach to sustainability – the Elanco Healthy Purpose™ – to advance the health of animals, people, the planet and our enterprise. At Elanco, we pride ourselves on fostering a diverse and inclusive work environment. We believe that diversity is the driving force behind innovation, creativity, and overall business success. Here, you’ll be part of a company that values and champions new ways of thinking, work with dynamic individuals, and acquire new skills and experiences that will propel your career to new heights. Making animals’ lives better makes life better – join our team today! Role & Responsibilities Provide data engineering subject matter expertise and hands-on data capture, ingestion, curation, and pipeline development expertise on Azure to deliver cloud-optimized data solutions. Provide expert data PaaS on Azure storage; big data platform services; server-less architectures; Azure SQL DB; NoSQL databases and secure, automated data pipelines. Participate in data/data-pipeline architectural discussions to help build cloud-native solutions or migrate existing data applications from on-premise to the Azure platform. Perform current state “AS-IS” and future state “To-Be” analysis. Participate in and help develop the data engineering community of practice as a global go-to expert panel/resource. Develop and evolve new or existing data engineering methods and procedures to create possible alternative, agile solutions to moderately complex problems. Stay abreast of new and emerging data engineering technologies, tools, methodologies, and patterns on Azure and other major public clouds. Demonstrate ownership in understanding the organization’s strategic direction as it relates to your team and individual goals. Work collaboratively and use sound judgment in developing robust solutions while seeking guidance on complex problems. Basic Qualifications (Must Have) Bachelor's or higher degree in Computer Science or a related discipline. At least 2 years of data pipeline and data product design, development and delivery experience, including deploying ETL/ELT solutions on Azure Data Factory. Experience with Azure-native data/big-data tools, technologies and services, including Storage Blobs, ADLS, Azure SQL DB, Cosmos DB, NoSQL and SQL Data Warehouse. Sound problem-solving skills in developing data pipelines using Databricks, Stream Analytics and Power BI. Minimum of 2 years of hands-on experience in programming languages, Azure and Big Data technologies such as PowerShell, C#, Java, Python, Scala, SQL, ADLS/Blob, Hadoop, Spark/SparkSQL, Hive, and streaming technologies like Kafka, EventHub etc. Knowledge of distributed systems. Elanco is an EEO/Affirmative Action Employer and does not discriminate on the basis of age, race, color, religion, gender, sexual orientation, gender identity, gender expression, national origin, protected veteran status, disability or any other legally protected status.
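As a minimal sketch of one Azure-native hop named above — raw data read from ADLS Gen2 into Spark (e.g., on Databricks) and written back curated — consider the snippet below. The storage account, containers, and key handling are placeholders; production jobs would use a secret scope, service principal, or managed identity rather than an inline key.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-ingest").getOrCreate()

# Placeholder account/key; Databricks jobs typically resolve this via a secret scope.
spark.conf.set(
    "fs.azure.account.key.mystorageacct.dfs.core.windows.net",
    "<access-key>",
)

# Read raw JSON events from the landing container (placeholder paths).
raw = spark.read.json("abfss://landing@mystorageacct.dfs.core.windows.net/events/")

# Light curation: drop records without an event type, publish as Parquet.
(raw.filter("eventType IS NOT NULL")
    .write.mode("overwrite")
    .parquet("abfss://curated@mystorageacct.dfs.core.windows.net/events/"))
```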
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary: A career within PwC. Responsibilities About the role: As a Junior/Senior Data Engineer, you’ll take the lead in designing and maintaining complex data ecosystems. Your experience will be instrumental in optimizing data processes, ensuring data quality, and driving data-driven decision-making within the organization. Architecting and designing complex data systems and pipelines. Leading and mentoring junior data engineers and team members. Collaborating with cross-functional teams to define data requirements. Implementing advanced data quality checks and ensuring data integrity. Optimizing data processes for efficiency and scalability. Overseeing data security and compliance measures. Evaluating and recommending new technologies to enhance data infrastructure. Providing technical expertise and guidance for critical data projects. Required Skills & Experience Proficiency in designing and building complex data pipelines and data processing systems. Leadership and mentorship capabilities to guide junior data engineers and foster skill development. Strong expertise in data modeling and database design for optimal performance. Skill in optimizing data processes and infrastructure for efficiency, scalability, and cost-effectiveness. Knowledge of data governance principles, ensuring data quality, security, and compliance. Familiarity with big data technologies like Hadoop, Spark, or NoSQL. Expertise in implementing robust data security measures and access controls. Effective communication and collaboration skills for cross-functional teamwork and defining data requirements.
Skills Cloud: Azure/GCP/AWS; DE Technologies: ADF, BigQuery, AWS Glue, etc.; Data Lake: Snowflake, Databricks, etc. Mandatory Skill Sets Cloud: Azure/GCP/AWS; DE Technologies: ADF, BigQuery, AWS Glue, etc.; Data Lake: Snowflake, Databricks, etc. Preferred Skill Sets Cloud: Azure/GCP/AWS; DE Technologies: ADF, BigQuery, AWS Glue, etc.; Data Lake: Snowflake, Databricks, etc. Years Of Experience Required 4-7 years Education Qualification BE/BTECH, ME/MTECH, MBA, MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Microsoft Azure Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation {+ 19 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
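Since BigQuery appears in the skill sets above, here is a small illustrative sketch using the google-cloud-bigquery client; the project, dataset, and table names are placeholders and nothing here is specific to PwC's environment.

```python
from google.cloud import bigquery

# Hypothetical example: project, dataset, and table names are placeholders.
client = bigquery.Client(project="example-project")

query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `example-project.sales.orders`
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""

# Run the query and iterate over the result rows.
for row in client.query(query).result():
    print(row.customer_id, row.total_spend)
```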
Posted 1 week ago
10.0 years
6 - 9 Lacs
Hyderābād
On-site
Lead, Software Engineering Hyderabad, India Information Technology 313257 Job Description About The Role: Grade Level (for internal use): 11 The Team: Our team is responsible for the design, architecture, and development of our client-facing applications using a variety of tools that are regularly updated as new technologies emerge. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. The Impact: The work you do will be used every single day; it’s the essential code you’ll write that provides the data and analytics required for crucial, daily decisions in the capital and commodities markets. What’s in it for you: Build a career with a global company. Work on code that fuels the global financial markets. Grow and improve your skills by working on enterprise-level products and new technologies. Responsibilities: Solve problems, analyze and isolate issues. Provide technical guidance and mentoring to the team and help them adopt change as new processes are introduced. Champion best practices and serve as a subject matter authority. Develop solutions to support key business needs. Engineer components and common services based on standard development models, languages and tools. Produce system design documents and lead technical walkthroughs. Produce high-quality code. Collaborate effectively with technical and non-technical partners. As a team member, continuously improve the architecture. Basic Qualifications: 10-12 years of experience designing/building data-intensive solutions using distributed computing. Proven experience in implementing and maintaining enterprise search solutions in large-scale environments. Experience working with business stakeholders and users, providing research direction and solution design, and writing robust, maintainable architectures and APIs. Experience developing and deploying search solutions in a public cloud such as AWS. Proficient programming skills in high-level languages such as Java, Scala, and Python. Solid knowledge of at least one machine learning research framework. Familiarity with containerization, scripting, cloud platforms, and CI/CD. 5+ years’ experience with Python, Java, Kubernetes, and data and workflow orchestration tools. 4+ years’ experience with Elasticsearch, SQL, NoSQL, Apache Spark, Flink, Databricks and MLflow. Prior experience with operationalizing data-driven pipelines for large-scale batch and stream processing analytics solutions. Good to have: experience contributing to GitHub and open-source initiatives, research projects, and/or Kaggle competitions. Ability to quickly, efficiently, and effectively define and prototype solutions with continual iteration within aggressive product deadlines. Demonstrate strong communication and documentation skills for both technical and non-technical audiences. Preferred Qualifications: Search technologies: querying and indexing content for Apache Solr, Elasticsearch, etc. Proficiency in search query languages (e.g., Lucene Query Syntax) and experience with data indexing and retrieval. Experience with machine learning models and NLP techniques for search relevance and ranking. Familiarity with vector search techniques and embedding models (e.g., BERT, Word2Vec). Experience with relevance tuning using A/B testing frameworks.
Big Data Technologies: Apache Spark, Spark SQL, Hadoop, Hive, Airflow Data Science Search Technologies: Personalization and Recommendation models, Learning to Rank (LTR) Preferred Languages: Python, Java Database Technologies: MS SQL Server platform, stored procedure programming experience using Transact-SQL. Ability to lead, train and mentor. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Inclusive Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering an inclusive workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and equal opportunity, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 313257 Posted On: 2025-03-30 Location: Hyderabad, Telangana, India
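As a rough sketch of the vector-search techniques named in this posting's preferred qualifications, the following example issues a kNN query over a dense_vector field via Elasticsearch's HTTP search API (assuming a recent Elasticsearch 8.x cluster); the endpoint, index, field names, and toy query vector are all placeholders.

```python
import requests

# Hypothetical example: endpoint, index, and field names are placeholders,
# and the query vector would normally come from an embedding model.
ES_URL = "http://localhost:9200/articles/_search"

query_vector = [0.12, -0.03, 0.41]  # stand-in for a real embedding

body = {
    "knn": {
        "field": "embedding",        # dense_vector field in the index mapping
        "query_vector": query_vector,
        "k": 5,                      # nearest neighbours to return
        "num_candidates": 50,        # candidates considered per shard
    },
    "_source": ["title"],
}

resp = requests.post(ES_URL, json=body, timeout=10)
resp.raise_for_status()

# Print score and title of each matching document.
for hit in resp.json()["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```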
Posted 1 week ago
0 years
0 Lacs
Hyderābād
On-site
Job description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Marketing Title. In this role, you will: Develop and support new feeds ingestion; understand the existing framework and carry out development as per the business rules and requirements. Develop and maintain new changes/enhancements in Data Ingestion / Juniper, and promote and support them in the production environment within the stipulated timelines. Get familiar with the Data Ingestion / Data Refinery / Common Data Model / Compdata frameworks quickly and contribute to application development as soon as possible. A methodical and measured approach with a keen eye for attention to detail; ability to work under pressure and remain calm in the face of adversity; ability to collaborate, interact and engage with different business, technical and subject matter experts; good, concise written and verbal communication; ability to manage workload from multiple requests and to balance priorities; a pro-active, can-do mindset and attitude; good documentation skills. Requirements To be successful in this role, you should meet the following requirements: Experience (1 = essential, 2 = very useful, 3 = nice to have): 1. Hadoop / Hive / GCP; 2. Agile / Scrum; 3. Linux. Technical skills (1 = essential, 2 = useful, 3 = nice to have): 1. Any ETL tool; 1. Analytical troubleshooting; 2. HiveQL; 1. On-prem / cloud infrastructure knowledge. You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
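For illustration only, here is a hedged sketch of the kind of feed-ingestion step this role describes, loading a landed file and applying a business rule with Spark SQL against a Hive table; the database, table, and path names are invented and do not reflect HSBC's actual frameworks.

```python
from pyspark.sql import SparkSession

# Hypothetical example: database, table, and path names are placeholders.
spark = (SparkSession.builder
         .appName("feed-ingestion")
         .enableHiveSupport()
         .getOrCreate())

# Load a newly landed feed file into a staging view.
feed = spark.read.option("header", "true").csv("/data/landing/trades_20250606.csv")
feed.createOrReplaceTempView("stg_trades")

# Apply a simple business rule and append into the curated Hive table.
spark.sql("""
    INSERT INTO TABLE curated.trades
    SELECT trade_id, account_id, CAST(amount AS DECIMAL(18,2)) AS amount
    FROM stg_trades
    WHERE amount IS NOT NULL
""")
```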
Posted 1 week ago
3.0 - 5.0 years
4 - 7 Lacs
Hyderābād
On-site
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com. Job Description Role Purpose The purpose of the role is to resolve, maintain and manage the client’s software/hardware/network based on the service requests raised by end-users, as per the defined SLAs, ensuring client satisfaction. Do Ensure timely response to all tickets raised by the client end-user. Provide service request solutioning while maintaining quality parameters. Act as a custodian of the client’s network/server/system/storage/platform/infrastructure and other equipment, keeping track of their proper functioning and upkeep. Keep a check on the number of tickets raised (dial home/email/chat/IMS), ensuring the right solutioning as per the defined resolution timeframe. Perform root cause analysis of the tickets raised and create an action plan to resolve the problem to ensure client satisfaction. Provide acceptance and immediate resolution for high-priority tickets/service requests. Install and configure software/hardware requirements based on service requests. 100% adherence to timeliness as per the priority of each issue, to manage client expectations and ensure zero escalations. Provide application/user access as per client requirements and requests to ensure timely solutioning. Track all tickets from acceptance to resolution stage as per the resolution time defined by the customer. Maintain timely backups of important data/logs and management resources to ensure the solution is of acceptable quality to maintain client satisfaction. Coordinate with the on-site team for complex problem resolution and ensure timely client servicing. Review the logs which chatbots gather and ensure all service requests/issues are resolved in a timely manner. Deliver (No. 1) Performance Parameter: 100% adherence to SLA/timelines. Measure: multiple cases of red time, zero customer escalations, client appreciation emails. Mandatory Skills: Hadoop Admin. Experience: 3-5 Years. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 week ago
6.0 years
9 - 10 Lacs
Gurgaon
On-site
Senior Engineer, Software Engineering Gurgaon, India; Hyderabad, India Information Technology 315230 Job Description About The Role: Grade Level (for internal use): 10 Position Title: Senior Software Developer The Team: Do you love to collaborate & provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams. The Impact: We focus primarily on developing, enhancing and delivering required pieces of information & functionality to internal & external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact. What’s in it for you? Opportunities for innovation and learning new state-of-the-art technologies, and the chance to work in a pure Agile & Scrum methodology. Responsibilities: Design and implement software-related projects. Perform analyses and articulate solutions. Design underlying engineering for use in multiple product offerings supporting a large volume of end-users. Develop project plans with task breakdowns and estimates. Manage and improve existing solutions. Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits. What We’re Looking For: Basic Qualifications: Bachelor's degree in Computer Science or equivalent 6+ years’ related experience Passionate, smart, and articulate developer Strong C#, WPF and SQL skills Experience implementing: Web Services (with WCF, RESTful JSON, SOAP, TCP), Windows Services, and Unit Tests; Dependency Injection Able to demonstrate strong OOP skills Able to work well individually and with a team Strong problem-solving skills Good work ethic, self-starter, and results-oriented Interest and experience in Environmental and Sustainability content is a plus Agile/Scrum experience a plus Exposure to Data Engineering & Big Data technologies like Hadoop, Spark/Scala, NiFi & ETL is a plus Preferred Qualifications: Experience with Docker is a plus Experience working in cloud computing environments such as AWS, Azure or GCP Experience with large-scale messaging systems such as Kafka or RabbitMQ or commercial systems. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in.
We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 315230 Posted On: 2025-06-06 Location: Gurgaon, Haryana, India
Posted 1 week ago
0 years
3 - 5 Lacs
Gurgaon
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title and Summary Consultant, Performance Analytics, Advisors & Consulting Services Advisors & Consulting Services within Mastercard is responsible for acquiring, engaging, and retaining customers by managing fraud and risk, enhancing cybersecurity, and improving the digital payments experience. We provide value-added services and leverage expertise, data-driven insights, and execution. Our Advisors & Consulting Services team combines traditional management consulting with Mastercard’s rich data assets, proprietary platforms, and technologies to provide clients with powerful strategic insights and recommendations. Our teams work with a diverse global customer base across industries, from banking and payments to retail and restaurants. The Advisors & Consulting Services group has five specializations: Strategy & Transformation, Performance Analytics, Business Experimentation, Marketing, and Program Management. Our Performance Analytics consultants translate data into insights by leveraging Mastercard and customer data to design, implement, and scale analytical solutions for customers. They use qualitative and quantitative analytical techniques and enterprise applications to synthesize analyses into clear recommendations and impactful narratives. Positions for different specializations and levels are available in separate job postings.
Please review our consulting specializations to learn more about all opportunities and apply for the position that is best suited to your background and experience: https://careers.mastercard.com/us/en/consulting-specializations-at-mastercard Roles and Responsibilities Client Impact Provide creative input on projects across a range of industries and problem statements Contribute to the development of analytics strategies and programs for regional and global clients by leveraging data and technology solutions to unlock client value Collaborate with the Mastercard team to understand clients’ needs, agenda, and risks Develop working relationships with client analysts/managers, and act as a trusted and reliable partner Team Collaboration & Culture Collaborate with senior project delivery consultants to identify key findings, prepare effective presentations, and deliver recommendations to clients Independently identify trends, patterns, issues, and anomalies in defined area of analysis, and structure and synthesize own analysis to highlight relevant findings Lead internal and client meetings, and contribute to project management Contribute to the firm's intellectual capital Receive mentorship from performance analytics leaders for professional growth and development Qualifications Basic qualifications Undergraduate degree with data and analytics experience in business intelligence and/or descriptive, predictive, or prescriptive analytics Experience managing clients or internal stakeholders Ability to analyze large datasets and synthesize key findings Proficiency using data analytics software (e.g., Python, R, SQL, SAS) Advanced Word, Excel, and PowerPoint skills Ability to perform multiple tasks with multiple clients in a fast-paced, deadline-driven environment Ability to communicate effectively in English and the local office language (if applicable) Eligibility to work in the country where you are applying, as well as to apply for travel visas as required by travel needs Preferred qualifications Additional data and analytics experience in building, managing, and maintaining database structures, working with data visualization tools (e.g., Tableau, Power BI), or working with the Hadoop framework and coding using Impala, Hive, or PySpark Ability to analyze large datasets and synthesize key findings to provide recommendations via descriptive analytics and business intelligence Experience managing tasks or workstreams in a collaborative team environment Ability to identify problems, brainstorm and analyze answers, and implement the best solutions Relevant industry expertise Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach; and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
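To illustrate the descriptive-analytics work described above, here is a small, hypothetical pandas sketch that computes monthly spend per segment and flags sharp drops; the file and column names are placeholders, not Mastercard data.

```python
import pandas as pd

# Hypothetical example: the file and column names are placeholders.
txns = pd.read_csv("transactions.csv", parse_dates=["txn_date"])

# Descriptive analytics: monthly spend and transaction counts per segment.
monthly = (txns
           .assign(month=txns["txn_date"].dt.to_period("M"))
           .groupby(["segment", "month"])
           .agg(total_spend=("amount", "sum"),
                txn_count=("amount", "count"))
           .reset_index())

# Flag months where spend dropped more than 20% versus the prior month.
monthly["spend_change"] = monthly.groupby("segment")["total_spend"].pct_change()
anomalies = monthly[monthly["spend_change"] < -0.20]
print(anomalies)
```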
Posted 1 week ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities Design, develop, and optimize data pipelines and ETL processes using PySpark or Scala to extract, transform, and load large volumes of structured and unstructured data from diverse sources. Implement data ingestion, processing, and storage solutions on the Azure cloud platform, leveraging services such as Azure Databricks, Azure Data Lake Storage, and Azure Synapse Analytics. Develop and maintain data models, schemas, and metadata to support efficient data access, query performance, and analytics requirements. Monitor pipeline performance, troubleshoot issues, and optimize data processing workflows for scalability, reliability, and cost-effectiveness. Implement data security and compliance measures to protect sensitive information and ensure regulatory compliance. Requirements Proven experience as a Data Engineer, with expertise in building and optimizing data pipelines using PySpark, Scala, and Apache Spark.
Hands-on experience with cloud platforms, particularly Azure, and proficiency in Azure services such as Azure Databricks, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database. Strong programming skills in Python and Scala, with experience in software development, version control, and CI/CD practices. Familiarity with data warehousing concepts, dimensional modeling, and relational databases (e.g., SQL Server, PostgreSQL, MySQL). Experience with big data technologies and frameworks (e.g., Hadoop, Hive, HBase) is a plus. Mandatory Skill Sets Spark, PySpark, Azure Preferred Skill Sets Spark, PySpark, Azure Years Of Experience Required 4 - 8 Education Qualification B.Tech / M.Tech / MBA / MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Master of Engineering, Bachelor of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills PySpark Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
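As one illustrative example of the pipeline-monitoring and data-quality responsibilities listed above, the following hedged PySpark sketch computes basic quality metrics and fails the run when thresholds are breached; the storage path and column names are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: the path and column names are placeholders.
spark = SparkSession.builder.appName("dq-check").getOrCreate()

df = spark.read.parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/")

# Basic data-quality metrics: row count, null keys, duplicate keys.
total = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()
dupes = total - df.dropDuplicates(["order_id"]).count()

print(f"rows={total}, null order_id={null_ids}, duplicate keys={dupes}")

# Fail the pipeline run if quality thresholds are breached.
if total == 0 or null_ids > 0 or dupes > 0:
    raise ValueError("Data-quality check failed for orders dataset")
```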
Posted 1 week ago
5.0 years
6 - 9 Lacs
Bengaluru
On-site
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. A primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. Career Level - IC3 As a Sr. Support Engineer, you will be the technical interface to customers, Original Equipment Manufacturers (OEMs) and Value-Added Resellers (VARs) for resolution of problems related to the installation, recommended maintenance and use of Oracle products. You should have an understanding of all Oracle products in your competencies and in-depth knowledge of several products and/or platforms. You should also be highly experienced on multiple platforms and able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues. RESPONSIBILITIES: Manage and resolve Service Requests logged by customers (internal and external) on Oracle products and contribute to proactive support activities according to the product support strategy and model. Own and resolve problems and manage customer expectations throughout the Service Request lifecycle in accordance with global standards. Work towards, adopt and contribute to new processes and tools (diagnostic methodology, health checks, scripting tools, etc.). Contribute to Knowledge Management content creation and maintenance. Work with development on product improvement programs (testing, SRP, BETA programs, etc.) as required. Operate within Oracle business processes and procedures. Respond to and resolve customer issues within Key Performance Indicator targets. Maintain product expertise within the team. Maintain an up-to-date and in-depth knowledge of new products released in the market for supported products. QUALIFICATIONS: Bachelor’s degree in Computer Science, Engineering or a related technical field. 5+ years of proven professional and technical experience in Big Data Appliance (BDA), Oracle Cloud Infrastructure (OCI), Linux OS and areas like the Cloudera distribution for Hadoop (CDH), HDFS, YARN, Spark, Hive, Sqoop, Oozie and Intelligent Data Lake. Excellent verbal and written skills in English. SKILLS & COMPETENCIES: Minimum technical skills: As a member of the Big Data Appliance (BDA) team, the focus is to troubleshoot highly complex technical issues related to the Big Data Appliance and areas like the Cloudera distribution for Hadoop (CDH), HDFS, YARN, Spark, Hive, Sqoop, Oozie and Intelligent Data Lake. Good hands-on experience with Linux systems, Cloudera Hadoop architecture, administration and troubleshooting, with good knowledge of different technology products/services/processes. Responsible for resolving complex issues for BDA (Big Data Appliance) customers, including issues pertaining to Cloudera Hadoop, Big Data SQL, and BDA upgrades/patches and installs. The candidate will also collaborate with other teams such as Hardware, Development, ODI, Oracle R, etc., to help resolve customers’ issues on the BDA machine.
The candidate will also be responsible for interacting with customer counterparts on a regular basis and serving as the technology expert on the customer’s behalf. Experience in multi-tier architecture environments is required, along with a fundamental understanding of computer networking, systems, and database technologies. Personal competencies: a desire to learn, or expand knowledge, about Oracle Database and associated products; customer focus; structured problem recognition and resolution; experience contributing to a shared knowledge base; experience of support-level work, such as resolving customer problems, managing customer expectations, and handling escalations; communication; planning and organizing; working globally; quality; team working; results orientation.
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
On-site
Responsibilities Design, develop, and maintain scalable data pipelines and ETL processes Optimize data flow and collection for cross-functional teams Build infrastructure required for optimal extraction, transformation, and loading of data Ensure data quality, reliability, and integrity across all data systems Collaborate with data scientists and analysts to help implement models and algorithms Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc. Create and maintain comprehensive technical documentation Evaluate and integrate new data management technologies and tools Requirements 3-5 years of professional experience in data engineering roles Bachelor's degree in Computer Science, Engineering, or related field; Master's degree preferred Job Description Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TiDB, MySQL, Oracle, Teradata) Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink) Proficiency in at least one programming language such as Python, Java, or Scala Experience with data modeling, data warehousing, and building ETL pipelines Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi) Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred Hands-on experience building streaming pipelines with Flink, Kafka, or Kinesis Understanding of data governance and data security principles Experience with version control systems (e.g., Git) and CI/CD practices Preferred Skills Experience with containerization and orchestration tools (Docker, Kubernetes) Basic knowledge of machine learning workflows and MLOps Experience with NoSQL databases (MongoDB, Cassandra, etc.) Familiarity with data visualization tools (Tableau, Power BI, etc.) Experience with real-time data processing Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.) Experience with infrastructure-as-code tools (Terraform, CloudFormation) Personal Qualities Strong problem-solving skills and attention to detail Excellent communication skills, both written and verbal Ability to work independently and as part of a team Proactive approach to identifying and solving problems
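To give a flavour of the streaming-pipeline experience this posting asks for, here is a minimal, hypothetical consumer built with the kafka-python library; the topic, broker, and field names are placeholders.

```python
import json
from kafka import KafkaConsumer

# Hypothetical example: topic, broker, and field names are placeholders.
consumer = KafkaConsumer(
    "clickstream",
    bootstrap_servers=["broker-1:9092"],
    group_id="analytics-consumers",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# Consume events and apply a trivial filter before downstream loading.
for message in consumer:
    event = message.value
    if event.get("event_type") == "purchase":
        print(event["user_id"], event["amount"])
```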
Posted 1 week ago
3.0 - 5.0 years
5 - 8 Lacs
Bengaluru
On-site
Your Job As a Data Analyst in Molex's Copper Solutions Business Unit software solution group, you will be responsible for extracting actionable insights from large and complex manufacturing datasets: identifying trends, optimizing production processes, improving operational efficiency, minimizing downtime, and enhancing overall product quality. You will collaborate closely with cross-functional teams to ensure the effective use of data in driving continuous improvement and achieving business objectives within the manufacturing environment. Our Team Molex's Copper Solutions Business Unit (CSBU) is a global team that works together to deliver exceptional products to worldwide telecommunication and data center customers. SSG under CSBU is one of the most technically advanced software solution groups within Molex. Our group leverages software expertise to enhance the concept, design, manufacturing, and support of high-speed electrical interconnects. What You Will Do 1. Collect, clean, and transform data from various sources to support analysis and decision-making processes. 2. Conduct thorough data analysis using Python to uncover trends, patterns, and insights. 3. Create and maintain reports based on business needs. 4. Prepare comprehensive reports that detail analytical processes and outcomes. 5. Develop and maintain visualizations/dashboards. 6. Collaborate with cross-functional teams to understand data needs and deliver actionable insights. 7. Perform ad hoc analysis to support business decisions. 8. Write efficient and optimized SQL queries to extract, manipulate, and analyze data from various databases. 9. Identify gaps and inefficiencies in current reporting processes and implement improvements and new solutions. 10. Ensure data quality and integrity across all reports and tools. Who You Are (Basic Qualifications) B.E./B.Tech degree in Computer Science Engineering, Information Science, Data Science or a related discipline. 3-5 years of progressive data analysis experience with Python (pandas, NumPy, matplotlib, OpenPyXL, SciPy, statsmodels, Seaborn). What Will Put You Ahead • Experience with Power BI, Tableau, or similar tools for creating interactive dashboards and reports tailored for manufacturing operations. • Experience with predictive analytics, e.g., machine learning models (using scikit-learn) to predict failures, optimize production, or forecast demand. • Experience with big data tools like Hadoop, Apache Kafka, or cloud platforms (e.g., AWS, Azure) for managing and analyzing large-scale data. • Knowledge of A/B testing and forecasting. • Familiarity with typical manufacturing data (e.g., machine performance metrics, production line data, quality control metrics). At Koch companies, we are entrepreneurs. This means we openly challenge the status quo, find new ways to create value and get rewarded for our individual contributions. Any compensation range provided for a role is an estimate determined by available market data. The actual amount may be higher or lower than the range provided considering each candidate's knowledge, skills, abilities, and geographic location. If you have questions, please speak to your recruiter about the flexibility and detail of our compensation philosophy. Who We Are At Koch, employees are empowered to do what they do best to make life better. Learn how our business philosophy helps employees unleash their potential while creating value for themselves and the company.
Additionally, everyone has individual work and personal needs. We seek to enable the best work environment that helps you and the business work together to produce superior results.
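As a rough illustration of the trend analysis this role describes, the following hedged sketch uses pandas and matplotlib to chart daily mean machine cycle times; the file and column names are invented placeholders for typical manufacturing data.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical example: file and column names are placeholders.
df = pd.read_csv("machine_metrics.csv", parse_dates=["timestamp"])

# Resample per-machine cycle times to daily means to expose drift.
daily = (df.set_index("timestamp")
           .groupby("machine_id")["cycle_time_s"]
           .resample("D")
           .mean()
           .unstack("machine_id"))

# Plot the trend; a rising cycle time can signal a machine needing maintenance.
daily.plot(title="Daily mean cycle time per machine")
plt.ylabel("seconds")
plt.tight_layout()
plt.savefig("cycle_time_trend.png")
```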
Posted 1 week ago
The demand for Hadoop professionals in India has been on the rise in recent years, with many companies leveraging big data technologies to drive business decisions. As a job seeker exploring opportunities in the Hadoop field, it is important to understand the job market, salary expectations, career progression, related skills, and common interview questions.
Cities such as Bengaluru, Hyderabad, and Gurgaon, which appear throughout the listings above, are known for their thriving IT industries and have a high demand for Hadoop professionals.
The average salary range for Hadoop professionals in India varies based on experience levels. Entry-level Hadoop developers can expect to earn between INR 4-6 lakhs per annum, while experienced professionals with specialized skills can earn upwards of INR 15 lakhs per annum.
In the Hadoop field, a typical career path may include roles such as Junior Developer, Senior Developer, and Tech Lead, eventually progressing to roles like Data Architect or Big Data Engineer.
In addition to Hadoop expertise, professionals in this field are often expected to have knowledge of related technologies such as Apache Spark, HBase, Hive, and Pig. Strong programming skills in languages like Java, Python, or Scala are also beneficial.
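Interview preparation for these roles often starts with the classic word-count exercise; here is a short PySpark version of it (the input path is a placeholder):

```python
from pyspark.sql import SparkSession

# A classic Hadoop/Spark warm-up exercise: word count. The path is a placeholder.
spark = SparkSession.builder.appName("word-count").getOrCreate()

lines = spark.sparkContext.textFile("hdfs:///data/books/sample.txt")

counts = (lines
          .flatMap(lambda line: line.split())
          .map(lambda word: (word.lower(), 1))
          .reduceByKey(lambda a, b: a + b))

# Show the ten most frequent words.
for word, n in counts.takeOrdered(10, key=lambda pair: -pair[1]):
    print(word, n)
```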
As you navigate the Hadoop job market in India, remember to stay updated on the latest trends and technologies in the field. By honing your skills and preparing diligently for interviews, you can position yourself as a strong candidate for lucrative opportunities in the big data industry. Good luck on your job search!