0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Join our digital revolution in NatWest Digital X

In everything we do, we work to one aim: to make digital experiences which are effortless and secure. So we organise ourselves around three principles: engineer, protect, and operate. We engineer simple solutions, we protect our customers, and we operate smarter. Our people work differently depending on their jobs and needs. From hybrid working to flexible hours, we have plenty of options that help our people to thrive. This role is based in India, and as such all normal working days must be carried out in India.

Job Description

Join us as a Solution Architect. This is an opportunity for an experienced Solution Architect to help us define the high-level technical architecture and design for a key data analytics and insights platform that powers the personalised customer engagement initiatives of the business. You’ll define and communicate a shared technical and architectural vision of end-to-end designs that may span multiple platforms and domains. Take on this exciting new challenge and hone your technical capabilities while advancing your career and building your network across the bank. We’re offering this role at vice president level.

What you’ll do

We’ll look to you to influence and promote collaboration across platform and domain teams on the solution delivery. Partnering with platform and domain teams, you’ll elaborate the solution and its interfaces, validating technology assumptions, evaluating implementation alternatives, and creating the continuous delivery pipeline. You’ll also provide analysis of options and deliver end-to-end solution designs using the relevant building blocks, as well as producing designs for features that allow frequent incremental delivery of customer value.

On top of this, you’ll be:
- Owning the technical design and architecture development that aligns with bank-wide enterprise architecture principles, security standards, and regulatory requirements
- Participating in activities to shape requirements, validating designs and prototypes to deliver change that aligns with the target architecture
- Promoting adaptive design practices to drive collaboration of feature teams around a common technical vision using continuous feedback
- Making recommendations on potential impacts of the latest technology and customer trends to existing and prospective customers
- Engaging with the wider architecture community within the bank to ensure alignment with enterprise standards
- Presenting solutions to governance boards and design review forums to secure approvals
- Maintaining up-to-date architectural documentation to support audits and risk assessments

The skills you’ll need

As a Solution Architect, you’ll bring expert knowledge of application architecture, and of business, data, or infrastructure architecture, with working knowledge of industry architecture frameworks such as TOGAF or ArchiMate. You’ll also need an understanding of Agile and contemporary methodologies, with experience of working in Agile teams. A certification in cloud solutions such as AWS Solution Architect is desirable, while an awareness of agentic AI application architectures using LLMs such as OpenAI models and agentic frameworks such as LangGraph or CrewAI will be advantageous.
Furthermore, you’ll need:
- Strong experience in solution design, enterprise architecture patterns, and cloud-native applications, including the ability to produce multiple views to highlight different architectural concerns
- Familiarity with big data processing in the banking industry
- Hands-on experience with AWS services, including but not limited to S3, Lambda, EMR, DynamoDB, and API Gateway (a small Lambda sketch follows this listing)
- An understanding of big data processing using frameworks or platforms like Spark, EMR, Kafka, Apache Flink, or similar
- Knowledge of real-time data processing, event-driven architectures, and microservices
- A conceptual understanding of data modelling and analytics, machine learning, or deep-learning models
- The ability to communicate complex technical concepts clearly to peers and leadership-level colleagues
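To make the event-driven AWS stack above concrete, here is a minimal sketch of the kind of component such a role designs: a Lambda handler that consumes an API Gateway event and persists it to DynamoDB. The table name and payload fields are hypothetical, not taken from the posting.

```python
import json
import os

import boto3

# Hypothetical table name; in a real deployment this is configured on the function.
TABLE_NAME = os.environ.get("EVENTS_TABLE", "customer-events")

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)


def handler(event, context):
    """Persist an API Gateway request body to DynamoDB."""
    body = json.loads(event.get("body") or "{}")
    item = {
        "customer_id": body["customer_id"],  # assumed partition key
        "event_type": body.get("event_type", "unknown"),
        "payload": json.dumps(body),
    }
    table.put_item(Item=item)
    return {"statusCode": 200, "body": json.dumps({"stored": item["customer_id"]})}
```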
Posted 1 week ago
3.0 - 4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Us
CLOUDSUFI is a Data Science and Product Engineering organization building products and solutions for the technology and enterprise industries. We firmly believe in the power of data to transform businesses and make better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.

Our Values
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.

Diversity & Inclusivity
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, or national origin. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/

Summary
We are seeking a talented and passionate Data Engineer to join our growing data team. In this role, you will be responsible for building, maintaining, and optimizing our data pipelines and infrastructure on Google Cloud Platform (GCP). The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and a passion for turning raw data into actionable insights. You will work closely with data scientists, analysts, and other engineers to support a variety of data-driven initiatives.

Responsibilities
- Design, develop, and maintain scalable and reliable data pipelines using Dataform or DBT.
- Build and optimize data warehousing solutions on Google BigQuery.
- Develop and manage data workflows using Apache Airflow (a minimal Airflow sketch follows this listing).
- Write complex and efficient SQL queries for data extraction, transformation, and analysis.
- Develop Python-based scripts and applications for data processing and automation.
- Collaborate with data scientists and analysts to understand their data requirements and provide solutions.
- Implement data quality checks and monitoring to ensure data accuracy and consistency.
- Optimize data pipelines for performance, scalability, and cost-effectiveness.
- Contribute to the design and implementation of data infrastructure best practices.
- Troubleshoot and resolve data-related issues.
- Stay up to date with the latest data engineering trends and technologies, particularly within the Google Cloud ecosystem.

Qualifications
- Bachelor’s degree in computer science, a related technical field, or equivalent practical experience.
- 3-4 years of experience in a Data Engineer role.
- Strong expertise in SQL (preferably BigQuery SQL).
- Proficiency in Python programming for data manipulation and automation.
- Hands-on experience with Google Cloud Platform (GCP) and its data services.
- Solid understanding of data warehousing concepts and ETL/ELT methodologies.
- Experience with Dataform or DBT for data transformation and modeling.
- Experience with workflow management tools such as Apache Airflow.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Qualifications
- Google Cloud Professional Data Engineer certification.
- Knowledge of data modeling techniques (e.g., dimensional modeling, star schema).
- Familiarity with Agile development methodologies.
Behavioral Competencies
- Should have very good verbal and written communication, technical articulation, listening and presentation skills
- Should have proven analytical and problem-solving skills
- Should have demonstrated effective task prioritization, time management and internal/external stakeholder management skills
- Should be a quick learner, self-starter, go-getter and team player
- Should have experience of working under stringent deadlines in a matrix organization structure
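As a flavour of the Airflow-plus-BigQuery workflow work this posting describes, here is a minimal DAG sketch with one BigQuery transformation step. The DAG name, dataset, and SQL are hypothetical; a real pipeline would chain ingest, transform, and data-quality tasks.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical dataset and table names, for illustration only.
TRANSFORM_SQL = """
CREATE OR REPLACE TABLE analytics.daily_orders AS
SELECT order_date, COUNT(*) AS orders
FROM raw.orders
GROUP BY order_date
"""

with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Single transformation step executed as a BigQuery job.
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={"query": {"query": TRANSFORM_SQL, "useLegacySql": False}},
    )
```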
Posted 1 week ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
What We Offer
Our company culture is focused on helping our employees enable innovation by building breakthroughs together. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from. Apply now!

What You’ll Do
We are seeking a hands-on Product Manager with strong technical acumen and a passion for data engineering to drive the evolution of our data foundation capabilities. In this role, you will work closely with engineering, architecture, design, and go-to-market teams to define product requirements, shape roadmap priorities, and deliver impactful services that power the BDC platform. You will bring customer empathy, execution focus, and a collaborative mindset to ensure delivery of valuable outcomes for both internal and external stakeholders.

Product Development & Execution
- Define and manage product requirements and use cases based on customer needs, stakeholder inputs, and technical feasibility
- Partner with engineering teams to deliver high-quality features on time and with measurable impact
- Prioritize and manage the product backlog, balancing short-term iterations with long-term strategic goals
- Support the creation of clear documentation, release notes, and user-facing communication

Data-Driven Insights
- Use data and user feedback to continuously improve product features and drive customer value
- Collaborate with teams to monitor adoption, measure impact, and identify opportunities

Cross-Functional Collaboration
- Facilitate productive working relationships across BDC, SAP LOBs, and external partners
- Ensure alignment between technical teams and business stakeholders on product objectives

Customer & Stakeholder Engagement
- Gather feedback directly from internal users, partners, and customers to validate hypotheses and inform future development
- Participate in customer calls, demos, and workshops to showcase capabilities and understand evolving needs

What You Bring
- Experience: 4–6 years of product management experience in data engineering, platform, data integration, or cloud services environments
- Technical Expertise: Strong background in data engineering, including hands-on experience with ETL, data pipelines, databases, and analytics platforms. Knowledge of Apache Spark, data lake, delta lake, cloud data warehouse, and object store technologies, plus experience in building APIs for data sharing using “zero copy share” techniques such as Delta and Iceberg, is highly desired.
- Customer Focus: Proven ability to translate user needs into product requirements and iterate quickly on feedback
- Execution Skills: Strong organizational, collaboration, interpersonal and planning skills with a bias toward action and delivery
- Communication Skills: Strong written and verbal communication skills, with the ability to articulate complex ideas clearly and effectively to both technical and non-technical audiences
- Education: Bachelor’s degree in Computer Science, Engineering, Data Science or a related field. An advanced degree or MBA is a plus.

Meet Your Team
SAP Business Data Cloud (BDC) is SAP’s next-generation data platform that brings together data from SAP and non-SAP sources into a unified, open, and business-ready environment.
BDC enables organizations to harness the full power of their data with seamless integration, rich semantic context, and advanced governance capabilities. By providing trusted and connected data across landscapes, BDC empowers users to make better, faster, and more confident decisions. BDC Data Foundation Services is a forward-looking team at the heart of SAP’s Business Data Cloud (BDC) mission. We focus on building scalable, robust, and secure data product infrastructure services that empower customers with trusted, unified, and actionable data. As part of our growth journey, we are looking for a skilled and motivated Product Manager to join our team and contribute to the next wave of innovation in data foundations.

We are SAP
SAP innovations help more than 400,000 customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with 200 million users and more than 100,000 employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and commitment to personal development. Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, we build breakthroughs, together.

Our inclusion promise
SAP’s culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone – regardless of background – feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to the Recruiting Operations Team: Americas: Careers.NorthAmerica@sap.com or Careers.LatinAmerica@sap.com; APJ: Careers.APJ@sap.com; EMEA: Careers@sap.com. EOE AA M/F/Vet/Disability

Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, ethnicity, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity or expression, protected veteran status, or disability. Successful candidates might be required to undergo a background verification with an external vendor.

Requisition ID: 430237 | Work Area: Solution and Product Management | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations:
Posted 1 week ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (17,500+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

Job Description

REQUIREMENTS:
- Total experience of 7+ years.
- Extensive experience in back-end development utilizing Java 11, Spring Framework (Core/Boot/MVC), and Hibernate/JPA.
- Good understanding of Data Structures, Object-Oriented Programming, and Design Patterns.
- Proficient in unit testing using JUnit or other frameworks.
- Expertise in REST APIs and Microservices Architecture.
- Working knowledge of Apache Kafka.
- Proficiency in working with relational and NoSQL databases (preferably PostgreSQL and MongoDB).
- Understanding of Behavior-Driven Development (BDD) using tools like Cucumber.
- Hands-on experience with containerization tools like Docker and orchestration tools like Kubernetes.
- Exposure to cloud platforms, preferably Google Cloud Platform (GCP).
- Strong understanding of UML and design patterns.
- Strong problem-solving skills and a passion for continuous improvement.
- Excellent communication skills and the ability to collaborate effectively with cross-functional teams.

RESPONSIBILITIES:
- Writing and reviewing great quality code
- Understanding functional requirements thoroughly and analyzing the client's needs in the context of the project
- Envisioning the overall solution for defined functional and non-functional requirements, and being able to define the technologies, patterns and frameworks to realize it
- Determining and implementing design methodologies and tool sets
- Enabling application development by coordinating requirements, schedules, and activities
- Being able to lead/support UAT and production rollouts
- Creating, understanding and validating the WBS and estimated effort for a given module/task, and being able to justify it
- Addressing issues promptly, responding positively to setbacks and challenges with a mindset of continuous improvement
- Giving constructive feedback to team members and setting clear expectations
- Helping the team in troubleshooting and resolving complex bugs
- Coming up with solutions to any issue raised during code/design review and being able to justify the decision taken
- Carrying out POCs to make sure that suggested design/technologies meet the requirements

Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Data Engineer – R&D – Multi-Omics

What You Will Do
Let’s do this. Let’s change the world. In this vital role you will be responsible for the development and maintenance of software in support of target/biomarker discovery at Amgen.
- Design, develop, and implement data pipelines, ETL/ELT processes, and data integration solutions
- Contribute to data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Contribute to data models for biopharma scientific data, data dictionaries, and other documentation to ensure data accuracy and consistency
- Optimize large datasets for query performance
- Collaborate with global cross-functional teams, including research scientists, to understand data requirements and design solutions that meet business needs
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate with Data Architects, Business SMEs, Software Engineers and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
- Identify and resolve data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation
- Maintain documentation of processes, systems, and solutions

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients. The role requires proficiency in scientific software development (e.g. Python, R, R Shiny, Plotly Dash), and some knowledge of CI/CD processes and cloud computing technologies (e.g. AWS, Google Cloud).

Basic Qualifications:
Master’s degree/Bachelor’s degree and 5 to 9 years of Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or related field experience.

Preferred Qualifications:
5+ years of experience in designing and supporting biopharma scientific research data analytics software platforms.
Functional Skills:

Must-Have Skills:
- Proficiency with SQL and Python for data engineering, test automation frameworks (pytest), and scripting tasks (a minimal pytest-with-Spark sketch appears at the end of this listing)
- Hands-on experience with big data technologies and platforms such as Databricks (or equivalent) and Apache Spark (PySpark, Spark SQL), including workflow orchestration and performance tuning of big data processing
- Excellent problem-solving skills and the ability to work with large, complex datasets

Good-to-Have Skills:
- Experience with git, CI/CD, and the software development lifecycle
- Experience with SQL and relational databases (e.g. PostgreSQL, MySQL, Oracle) or Databricks
- Experience with cloud computing platforms and infrastructure (AWS preferred)
- Experience using and adopting an Agile framework
- A passion for tackling complex challenges in drug discovery with technology and data
- Basic understanding of data modeling, data warehousing, and data integration concepts
- Experience with data visualization tools (e.g. Dash, Plotly, Spotfire)
- Experience with diagramming and collaboration tools such as Miro or Lucidchart for process mapping and brainstorming
- Experience writing and maintaining technical documentation in Confluence

Professional Certifications:
- Databricks Certified Data Engineer Professional preferred

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- High degree of initiative and self-motivation
- Demonstrated presentation skills
- Ability to manage multiple priorities successfully
- Team-oriented with a focus on achieving team goals

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
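As an illustration of the pytest-plus-PySpark skill set this posting asks for, here is a minimal sketch of a test module that spins up a local SparkSession and checks a small transformation. The transformation and column names are hypothetical.

```python
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


@pytest.fixture(scope="session")
def spark():
    # Local single-JVM session; adequate for unit-testing transformations.
    session = (
        SparkSession.builder.master("local[1]")
        .appName("pipeline-tests")
        .getOrCreate()
    )
    yield session
    session.stop()


def normalize_gene_ids(df):
    """Hypothetical transformation under test: upper-case gene identifiers."""
    return df.withColumn("gene_id", F.upper(F.col("gene_id")))


def test_normalize_gene_ids(spark):
    df = spark.createDataFrame([("brca1",), ("tp53",)], ["gene_id"])
    result = [row.gene_id for row in normalize_gene_ids(df).collect()]
    assert result == ["BRCA1", "TP53"]
```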
Posted 1 week ago
12.0 - 17.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role Description:
Let’s do this. Let’s change the world. We are looking for a highly motivated, expert Principal Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks, with domain expertise in the R&D domain. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role calls for deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Architect and maintain robust, scalable data pipelines using Databricks, Spark, and Delta Lake, enabling efficient batch and real-time processing.
- Lead efforts to evaluate, adopt, and integrate emerging technologies and tools that enhance productivity, scalability, and data delivery capabilities.
- Drive performance optimization efforts, including Spark tuning, resource utilization, job scheduling, and query improvements.
- Identify and implement innovative solutions that streamline data ingestion, transformation, lineage tracking, and platform observability.
- Build frameworks for metadata-driven data engineering, enabling reusability and consistency across pipelines.
- Foster a culture of technical excellence, experimentation, and continuous improvement within the data engineering team.
- Collaborate with platform, architecture, analytics, and governance teams to align platform enhancements with enterprise data strategy.
- Define and uphold SLOs, monitoring standards, and data quality KPIs for production pipelines and infrastructure.
- Partner with cross-functional teams to translate business needs into scalable, governed data products.
- Mentor engineers across the team, promoting knowledge sharing and adoption of modern engineering patterns and tools.
- Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
- Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies (a Delta Lake pipeline sketch follows this listing).
- Proficiency in workflow orchestration and performance tuning of big data processing.
- Strong understanding of AWS services.
- Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
- Ability to quickly learn, adapt, and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.
Good-to-Have Skills:
- Deep expertise in the Biotech & Pharma industries
- Experience in writing APIs to make data available to consumers
- Experience with SQL/NoSQL databases and vector databases for large language models
- Experience with data modeling and performance tuning for both OLAP and OLTP databases
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps

Education and Professional Certifications
- 12 to 17 years of experience in Computer Science, IT or a related field
- AWS Certified Data Engineer preferred
- Databricks certification preferred
- Scaled Agile SAFe certification preferred

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Ability to learn quickly, be organized and detail oriented
- Strong presentation and public speaking skills

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
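As a flavour of the Databricks/Delta Lake pipeline work described above, here is a minimal PySpark sketch of a batch step that upserts raw records into a curated Delta table. Paths and column names are hypothetical, and the snippet assumes a Spark runtime with the Delta Lake libraries installed (as on Databricks).

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Hypothetical locations for the lake layers.
BRONZE_PATH = "/mnt/lake/bronze/events"
SILVER_PATH = "/mnt/lake/silver/events"

# Read the latest raw batch and apply light cleansing.
updates = (
    spark.read.format("delta").load(BRONZE_PATH)
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .dropDuplicates(["event_id"])
)

# Upsert (merge) into the curated table, keyed on event_id.
silver = DeltaTable.forPath(spark, SILVER_PATH)
(
    silver.alias("t")
    .merge(updates.alias("s"), "t.event_id = s.event_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```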
Posted 1 week ago
12.0 - 17.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role

Role Description:
Let’s do this. Let’s change the world. We are looking for a highly motivated, expert Principal Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role calls for deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Architect and maintain robust, scalable data pipelines using Databricks, Spark, and Delta Lake, enabling efficient batch and real-time processing.
- Lead efforts to evaluate, adopt, and integrate emerging technologies and tools that enhance productivity, scalability, and data delivery capabilities.
- Drive performance optimization efforts, including Spark tuning, resource utilization, job scheduling, and query improvements.
- Identify and implement innovative solutions that streamline data ingestion, transformation, lineage tracking, and platform observability.
- Build frameworks for metadata-driven data engineering, enabling reusability and consistency across pipelines (a config-driven sketch follows this listing).
- Foster a culture of technical excellence, experimentation, and continuous improvement within the data engineering team.
- Collaborate with platform, architecture, analytics, and governance teams to align platform enhancements with enterprise data strategy.
- Define and uphold SLOs, monitoring standards, and data quality KPIs for production pipelines and infrastructure.
- Partner with cross-functional teams to translate business needs into scalable, governed data products.
- Mentor engineers across the team, promoting knowledge sharing and adoption of modern engineering patterns and tools.
- Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
- Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning of big data processing.
- Strong understanding of AWS services.
- Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
- Ability to quickly learn, adapt, and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.
Good-to-Have Skills:
- Deep expertise in the Biotech & Pharma industries
- Experience in writing APIs to make data available to consumers
- Experience with SQL/NoSQL databases and vector databases for large language models
- Experience with data modeling and performance tuning for both OLAP and OLTP databases
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps

Education and Professional Certifications
- 12 to 17 years of experience in Computer Science, IT or a related field
- AWS Certified Data Engineer preferred
- Databricks certification preferred
- Scaled Agile SAFe certification preferred

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Ability to learn quickly, be organized and detail oriented
- Strong presentation and public speaking skills

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
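The posting’s emphasis on metadata-driven frameworks can be illustrated with a small sketch: pipeline steps are described as data (here a list of dicts; in practice a control table or YAML file) and a generic runner executes them, so new sources are onboarded by adding metadata rather than code. All names and paths are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata-driven-ingest").getOrCreate()

# In practice this metadata would live in a control table or a config repository.
PIPELINES = [
    {"source": "s3://raw/orders/", "format": "json", "target": "bronze.orders"},
    {"source": "s3://raw/customers/", "format": "csv", "target": "bronze.customers"},
]


def run_pipeline(spec: dict) -> None:
    """Generic runner: every source is ingested the same way, driven by its spec."""
    df = (
        spark.read.format(spec["format"])
        .option("header", "true")
        .load(spec["source"])
    )
    df.write.mode("overwrite").saveAsTable(spec["target"])


for spec in PIPELINES:
    run_pipeline(spec)
```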
Posted 1 week ago
12.0 - 17.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Principal Data Engineer

What You Will Do
Let’s do this. Let’s change the world.

Role Description:
We are seeking a seasoned Principal Data Engineer to lead the design, development, and implementation of our data strategy. The ideal candidate possesses a deep understanding of data engineering principles, coupled with strong leadership and problem-solving skills. As a Principal Data Engineer, you will architect and oversee the development of robust data platforms, while mentoring and guiding a team of data engineers.

Roles & Responsibilities:
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies.
- Design, develop, and implement robust data architectures and platforms to support business objectives.
- Oversee the development and optimization of data pipelines and data integration solutions.
- Establish and maintain data governance policies and standards to ensure data quality, security, and compliance.
- Architect and manage cloud-based data solutions, leveraging AWS or other preferred platforms.
- Lead and motivate a strong data engineering team to deliver exceptional results.
- Identify, analyze, and resolve complex data-related challenges.
- Collaborate closely with business collaborators to understand data requirements and translate them into technical solutions.
- Stay abreast of emerging data technologies and explore opportunities for innovation.

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate degree / Master's degree / Bachelor's degree and 12 to 17 years of experience in Computer Science, IT or a related field
- Demonstrated proficiency in leveraging cloud platforms (AWS, Azure, GCP) for data engineering solutions
- Strong understanding of cloud architecture principles and cost optimization strategies
- Proficiency in Python, PySpark, and SQL
- Hands-on experience with big data ETL performance tuning (a short tuning sketch appears at the end of this listing)
- Proven ability to lead and develop strong data engineering teams
- Strong problem-solving, analytical, and critical thinking skills to address complex data challenges
Preferred Qualifications:
- Experienced with data modeling and performance tuning for both OLAP and OLTP databases
- Experienced with Apache Spark and Apache Airflow
- Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
- Experienced with AWS, GCP or Azure cloud services

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
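A minimal sketch of the kind of ETL tuning this role covers: broadcasting a small dimension table to avoid a shuffle join, and controlling output partitioning. Table names and sizes are hypothetical; real tuning decisions would be driven by the Spark UI and job metrics.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("etl-tuning-demo").getOrCreate()

facts = spark.read.parquet("/data/facts")        # large table (hypothetical)
dims = spark.read.parquet("/data/dim_product")   # small lookup table

# Broadcast the small side so the join avoids a full shuffle of `facts`.
joined = facts.join(broadcast(dims), "product_id")

# Aggregate, then coalesce to a sensible file count before writing:
# thousands of tiny output files are a common performance anti-pattern.
daily = joined.groupBy("order_date").agg(F.sum("amount").alias("revenue"))
daily.coalesce(8).write.mode("overwrite").parquet("/data/out/daily_revenue")
```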
Posted 1 week ago
12.0 - 17.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Senior Manager – Clinical Data Hub Team

What You Will Do
Let’s do this. Let’s change the world. In this vital role you will lead an Agile product squad, responsible for defining the vision, strategy, and implementation for a range of Clinical Data products supporting Amgen Clinical Trial Design & Analytics. You will collaborate closely with statisticians, data scientists, data engineers, and AI/ML engineering teams to understand business needs, identify system enhancements, and drive system implementation projects. Your extensive experience in business analysis, system design, and project management will enable you to deliver innovative and effective technology products.

Roles & Responsibilities:
- Define and communicate the product feature vision, including both technical/architectural features and enablement, and end-user features, ensuring alignment with business objectives across multiple solution collaborator groups
- Create, prioritize, and maintain the feature backlog, ensuring that it reflects the needs of the business and collaborators
- Collaborate with collaborators to gather and document product requirements, user stories, and acceptance criteria
- Work closely with the business teams, Scrum Master and development team to plan and implement sprints, ensuring that the highest-priority features are delivered
- Oversee the day-to-day management of technology platforms, ensuring that they meet performance, security, and availability requirements
- Ensure that platforms comply with security standards, regulatory requirements, and organizational policies
- Ensure that the AIN team successfully creates robust written materials, including product documentation, the product backlog and user stories, and other needed artifacts, to support efficient and effective coordination across time zones
- Oversee the resolution of service-related incidents and problems, ensuring minimal impact on business operations
- Maintain in-depth knowledge of clinical development business domains, with an emphasis on data assets and data pipelines, as well as an understanding of multi-functional dependencies
- Analyze customer feedback and support data to identify pain points and opportunities for product improvement

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate degree / Master's degree / Bachelor's degree and 12 to 17 years of experience in Computer Science, IT or a related field
- A solid foundation in modern software design and engineering practices and business analysis.
- Proven experience in understanding and gathering business requirements, delivering insights, and achieving concrete business outcomes.
- Technical Proficiency: Good understanding of the following technologies: Python, R, AI/ML frameworks, relational databases/data modeling, AWS services (EC2, S3, Lambda, ECS, IAM), Docker, CI/CD/GitLab, and Apache Spark/Databricks.
- Expert understanding and experience of the clinical development process within Life Sciences (global clinical trial data sources, SDTM & ADaM, the end-to-end clinical data design and analysis pipeline, clinical data security and governance); a small SDTM-flavoured sketch follows this listing.
- Experience in Agile product development as a participating member of a scrum team and its related ceremonies and processes.
- Ability to collaborate with data scientists and data engineers to deliver functional business requirements, as well as defining the product roadmap.
- High learning agility: demonstrated ability to quickly grasp ever-changing technology and clinical development domain knowledge and apply it to project work.
- Strong communication skills in writing, speaking and presenting, along with strong time management skills.

Preferred Qualifications:
- Training or an educational degree in Computer Science, Biology, or Chemistry.
- Experience with Clinical Data and CDISC (SDTM and ADaM) standards.

Soft Skills:
- Excellent analytical and troubleshooting skills
- Deep intellectual curiosity, particularly about data patterns, and learning about business processes and the “life of the user”
- Highest degree of initiative and self-motivation
- Strong verbal and written communication skills, including presenting complex technical/business topics to varied audiences
- Confidence in leading teams through prioritization and sequencing discussions, including managing collaborator expectations
- Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment.
Please contact us to request accommodation.
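As an illustrative, hypothetical example of the clinical-data checks such a team automates, here is a small pandas sketch that validates a few required columns in an SDTM DM (demographics) domain extract. The required-column list is abbreviated; the CDISC standard defines many more variables.

```python
import pandas as pd

# Abbreviated subset of SDTM DM variables, for illustration only.
REQUIRED_DM_COLUMNS = ["STUDYID", "DOMAIN", "USUBJID", "SEX", "COUNTRY"]


def validate_dm(df: pd.DataFrame) -> list:
    """Return a list of human-readable findings for a DM domain extract."""
    findings = []
    for col in REQUIRED_DM_COLUMNS:
        if col not in df.columns:
            findings.append(f"missing required column: {col}")
    if "USUBJID" in df.columns and df["USUBJID"].duplicated().any():
        findings.append("duplicate USUBJID values found")
    if "DOMAIN" in df.columns and not (df["DOMAIN"] == "DM").all():
        findings.append("DOMAIN column contains values other than 'DM'")
    return findings


if __name__ == "__main__":
    sample = pd.DataFrame(
        {"STUDYID": ["S1"], "DOMAIN": ["DM"], "USUBJID": ["S1-001"],
         "SEX": ["F"], "COUNTRY": ["IND"]}
    )
    print(validate_dm(sample) or "no findings")
```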
Posted 1 week ago
3.0 years
0 Lacs
Delhi, Delhi
On-site
Job Description: Hadoop & ETL Developer
Location: Shastri Park, Delhi
Experience: 3+ years
Education: B.E./B.Tech/MCA/MSc (IT or CS)/MS
Salary: Up to ₹80K (depending on interview and experience)
Notice Period: Immediate to 20 days
Only candidates from Delhi/NCR will be preferred.

Job Summary:
We are looking for a Hadoop & ETL Developer with strong expertise in big data processing, ETL pipelines, and workflow automation. The ideal candidate will have hands-on experience in the Hadoop ecosystem, including HDFS, MapReduce, Hive, Spark, HBase, and PySpark, as well as expertise in real-time data streaming and workflow orchestration. This role requires proficiency in designing and optimizing large-scale data pipelines to support enterprise data processing needs. (A small streaming-ingestion sketch follows this listing.)

Key Responsibilities
- Design, develop, and optimize ETL pipelines leveraging Hadoop ecosystem technologies.
- Work extensively with HDFS, MapReduce, Hive, Sqoop, Spark, HBase, and PySpark for data processing and transformation.
- Implement real-time and batch data ingestion using Apache NiFi, Kafka, and Airbyte.
- Develop and manage workflow orchestration using Apache Airflow.
- Perform data integration across structured and unstructured data sources, including MongoDB and Hadoop-based storage.
- Optimize MapReduce and Spark jobs for performance, scalability, and efficiency.
- Ensure data quality, governance, and consistency across the pipeline.
- Collaborate with data engineering teams to build scalable and high-performance data solutions.
- Monitor, debug, and enhance big data workflows to improve reliability and efficiency.

Required Skills & Experience
- 3+ years of experience in the Hadoop ecosystem (HDFS, MapReduce, Hive, Sqoop, Spark, HBase, PySpark).
- Strong expertise in ETL processes, data transformation, and data warehousing.
- Hands-on experience with Apache NiFi, Kafka, Airflow, and Airbyte.
- Proficiency in SQL and handling structured and unstructured data.
- Experience with NoSQL databases like MongoDB.
- Strong programming skills in Python or Scala for scripting and automation.
- Experience in optimizing Spark and MapReduce jobs for high-performance computing.
- Good understanding of data lake architectures and big data best practices.

Preferred Qualifications
- Experience in real-time data streaming and processing.
- Familiarity with Docker/Kubernetes for deployment and orchestration.
- Strong analytical and problem-solving skills with the ability to debug and optimize data workflows.

If you have a passion for big data, ETL, and large-scale data processing, we’d love to hear from you!

Job Types: Full-time, Contractual / Temporary
Pay: From ₹400,000.00 per year
Work Location: In person
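To give a flavour of the Kafka-plus-Spark ingestion named above, here is a minimal PySpark Structured Streaming sketch that reads a Kafka topic and lands it on HDFS. Broker address, topic, and paths are hypothetical, and the job assumes the Spark Kafka connector package is on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Hypothetical broker and topic names.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "clickstream")
    .load()
)

# Kafka delivers bytes; cast the payload to string before landing it.
parsed = events.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

query = (
    parsed.writeStream.format("parquet")
    .option("path", "hdfs:///data/raw/clickstream")
    .option("checkpointLocation", "hdfs:///checkpoints/clickstream")
    .start()
)
query.awaitTermination()
```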
Posted 1 week ago
2.0 - 8.0 years
4 - 10 Lacs
Bengaluru
Work from Office
Visa’s Technology Organization is a community of problem solvers and innovators reshaping the future of commerce. We operate the world’s most sophisticated processing networks, capable of handling more than 65K secure transactions a second across 80M merchants, 15K financial institutions, and billions of everyday people. While working with us you’ll get to work on complex distributed systems and solve massive-scale problems centered on new payment flows, business and data solutions, cyber security, and B2C platforms.

The Opportunity:
We are looking for versatile, curious, and energetic Software Engineers who embrace solving complex challenges on a global scale. As a Visa Software Engineer, you will be an integral part of a multi-functional development team inventing, designing, building, and testing software products that reach a truly global customer base. While building components of powerful payment technology, you will get to see your efforts shaping the digital future of monetary transactions.

The Work itself:
- Design code and systems that touch 40% of the world population while influencing Visa’s internal standards for scalability, security, and reusability
- Collaborate multi-functionally to create design artifacts and develop best-in-class software solutions for multiple Visa technical offerings
- Actively contribute to product quality improvements, valuable service technology, and new business flows in diverse agile squads
- Develop robust and scalable products intended for a myriad of customers, including end-user merchants, B2B, and business-to-government solutions
- Leverage innovative technologies to build the next generation of Payment Services, Transaction Platforms, Real-Time Payments, and Buy Now Pay Later technology
- Opportunities to make a difference on a global or local scale through mentorship and continued learning opportunities

Essential Functions:
- Manage OpenSearch and Apache Druid clusters (a minimal indexing sketch follows this listing).
- Write efficient and scalable code in Python and Java to support data engineering projects.
- Utilize SQL for data analytics and reporting tasks.
- Engineer and develop solutions using OpenSearch/Elastic, including Kafka streaming.
- Collaborate with cross-functional teams to understand requirements and deliver solutions.
- Ensure data integrity, quality, and security throughout the data lifecycle.

This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager.

Basic Qualifications:
- Bachelor's degree AND 4+ years of relevant work experience

Preferred Qualifications:
- Bachelor's degree in computer
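As a flavour of the OpenSearch work above, here is a minimal sketch using the opensearch-py client to index and query a document. Host, index name, and document fields are hypothetical; a production cluster would add TLS, authentication, and bulk indexing.

```python
from opensearchpy import OpenSearch

# Hypothetical local cluster; production would use TLS and auth.
client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

INDEX = "transactions-demo"  # hypothetical index name

# Create the index with an explicit mapping if it does not exist yet.
if not client.indices.exists(index=INDEX):
    client.indices.create(
        index=INDEX,
        body={"mappings": {"properties": {
            "merchant": {"type": "keyword"},
            "amount": {"type": "double"},
        }}},
    )

# Index one document, then search for it.
client.index(index=INDEX, id="1",
             body={"merchant": "acme", "amount": 42.5}, refresh=True)
hits = client.search(index=INDEX,
                     body={"query": {"term": {"merchant": "acme"}}})
print(hits["hits"]["hits"][0]["_source"])
```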
Posted 1 week ago
5.0 - 6.0 years
9 - 10 Lacs
Bengaluru
Work from Office
Role Summary:
The Sr. Site Reliability Engineer will manage multiple applications under the Data Marketing Platform (DME). These applications operate across diverse platforms including Java, .Net, and various backend databases, and run on a hybrid model of containers and VMs. The role is critical to ensuring the reliability, availability, and performance of these systems.

Key Responsibilities:
- Handle client requests and issue tickets.
- Coordinate major releases and hotfixes across Production and DR sites.
- Manage all types of security remediation and exception handling.
- Share knowledge and cross-train team members to build a scalable resource pool.
- Automate repetitive manual tasks to improve accuracy and efficiency.
- Identify application gaps and collaborate with Dev and Product teams for long-term fixes.
- Communicate proactively with business, clients, and partners regarding releases and incidents.
- Implement monitoring solutions for early detection and proactive mitigation.
- Continuously monitor applications and respond swiftly to minimize impact.
- Provide timely issue notifications and lead restoration efforts.
- Contribute to TLT-level initiatives and pursue continuous learning.
- Ensure high availability and SLA compliance across the platform.
- Support PRE turnover by validating and refining onboarding checklists.
- Reduce alert noise through deep application knowledge and incident analysis.
- Track application growth and manage environment scaling.
- Engage with business and clients to plan changes, releases, and onboarding.
- In an expanded role, act as an Associate System Analyst: drive innovation, automation, and strategic decision-making.
- Some applications have batch jobs scheduled through Windows Scheduler or Control-M; make sure the cyclic and hourly jobs run as expected (a small job-health-check sketch follows this listing).

Basic Qualifications:
- Bachelor's degree AND 4+ years of relevant work experience
- API & Web: Java, Spring Boot, Apache, .Net, AngularJS, Python, Bash, PowerShell
- Big Data: Kafka, H
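A minimal sketch of the kind of automation this role builds, assuming a hypothetical convention where each batch job drops a heartbeat file on completion; the script alerts when a heartbeat is older than the job's schedule allows. Paths and thresholds are illustrative, not taken from the posting.

```python
import sys
import time
from pathlib import Path

# Hypothetical heartbeat files written by Control-M / Windows Scheduler jobs,
# mapped to the maximum allowed age in seconds (hourly vs. cyclic jobs).
JOBS = {
    Path("/var/run/jobs/hourly_refresh.ok"): 2 * 3600,
    Path("/var/run/jobs/cyclic_sync.ok"): 30 * 60,
}


def stale_jobs() -> list:
    """Return descriptions of jobs whose heartbeat is missing or too old."""
    now = time.time()
    problems = []
    for heartbeat, max_age in JOBS.items():
        if not heartbeat.exists():
            problems.append(f"{heartbeat.name}: heartbeat missing")
        elif now - heartbeat.stat().st_mtime > max_age:
            problems.append(f"{heartbeat.name}: heartbeat older than {max_age}s")
    return problems


if __name__ == "__main__":
    issues = stale_jobs()
    for issue in issues:
        print(f"ALERT: {issue}")  # a real setup would page or open a ticket
    sys.exit(1 if issues else 0)
```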
Posted 1 week ago
0.0 - 1.0 years
5 - 9 Lacs
Ghaziabad, New Delhi, Pune
Work from Office
About the Role:
We are looking for a skilled professional on a contractual basis to clean and secure the hosting account of one of our clients. The website has been facing malware and virus-related issues, and we require a thorough cleanup, security hardening, and hosting organization.

Responsibilities:
- Scan and remove all malware, viruses, and malicious code from the hosting account (multiple websites may be involved).
- Secure the hosting to prevent future attacks (firewall setup, security plugins, permission audits, etc.).
- Identify vulnerabilities and fix security loopholes.
- Ensure websites are functioning correctly post-cleanup.
- Organize hosting files and directories for better manageability.
- Recommend and implement best practices for ongoing hosting security.
- Provide a brief report of work done and precautions to be followed going forward.

Requirements:
- Proven experience in malware removal and hosting security (especially with cPanel, WordPress, WHM or similar environments).
- Familiarity with tools like Wordfence, Sucuri, Imunify360, or similar.
- Ability to work independently and complete the task within the agreed timeline.
- Strong communication and documentation skills.

Nice to Have:
- Experience with website migration, backups, and server hardening.
- Basic knowledge of DNS, SSL, and CDN configurations.

Budget & Payment:
- To be discussed based on scope after initial review.
- Open to hourly or fixed-price proposals.

Job Types: Full-time, Contractual / Temporary
Posted 1 week ago
10.0 - 15.0 years
7 - 11 Lacs
Chennai
Work from Office
Job Title: Databricks Infrastructure Engineer
Location: Hyderabad/Bengaluru

Job Summary:
We are looking for a skilled Databricks Infrastructure Engineer to design, build, and manage the cloud infrastructure that supports Databricks development efforts. This role will focus on creating and maintaining scalable, secure, and automated infrastructure environments using Terraform and other Infrastructure-as-Code (IaC) tools. The infrastructure will enable data engineers and developers to efficiently create notebooks, pipelines, and ingest data following the Medallion architecture (Bronze, Silver, Gold layers). The ideal candidate will have strong cloud engineering skills, deep knowledge of Terraform, and hands-on experience with Databricks platform provisioning. (An illustrative provisioning sketch follows this listing.)

Key Responsibilities:

Infrastructure Design & Provisioning:
- Design and implement scalable and secure infrastructure environments to support Databricks workloads aligned with the Medallion architecture.
- Develop and maintain Infrastructure-as-Code (IaC) scripts and templates using Terraform and/or ARM templates for provisioning Databricks workspaces, clusters, storage accounts, networking, and related Azure resources.
- Automate the setup of data ingestion pipelines, storage layers (Bronze, Silver, Gold), and the access controls necessary for smooth data operations.

Platform Automation & Optimization:
- Create automated deployment pipelines integrated with CI/CD tools (e.g., Azure DevOps, Jenkins) to ensure repeatable and consistent infrastructure provisioning.
- Optimize infrastructure configurations to balance performance, scalability, security, and cost-effectiveness.
- Monitor infrastructure health and perform capacity planning to support evolving data workloads.
- Implement and maintain backup, recovery, and disaster recovery strategies for Databricks environments.
- Optimize performance of Databricks clusters, jobs, and SQL endpoints.
- Automate routine administration tasks using scripting and orchestration tools.
- Troubleshoot platform issues, identify root causes, and implement solutions.

Security & Governance:
- Implement security best practices, including network isolation, encryption, identity and access management (IAM), and role-based access control (RBAC), within the infrastructure.
- Collaborate with governance teams to embed compliance and audit requirements into infrastructure automation.

Collaboration & Support:
- Work closely with data engineers, data scientists, and platform administrators to understand infrastructure requirements and deliver solutions that enable efficient data engineering workflows.
- Provide documentation and training on infrastructure setup, usage, and best practices.
- Troubleshoot infrastructure issues and coordinate with cloud and platform support teams for resolution.
- Stay up to date with Databricks features, releases, and best practices.

Required Qualifications:
- 10+ years of experience in Databricks and cloud infrastructure engineering, preferably with Azure.
- Strong hands-on experience writing Infrastructure-as-Code using Terraform; experience with ARM templates or CloudFormation is a plus.
- Practical knowledge of provisioning and managing Databricks environments and associated cloud resources.
- Familiarity with the Medallion architecture and data lakehouse concepts.
- Experience with CI/CD pipeline creation and automation tools such as Azure DevOps, Jenkins, or GitHub Actions.
- Solid understanding of cloud networking, storage, security, and identity management.
- Proficiency in scripting languages such as Python, Bash, or PowerShell.
Strong collaboration and communication skills to work across cross-functional teams.
Preferred Skills:
Prior experience working with the Databricks platform, including workspace and cluster management.
Knowledge of data governance tools and practices.
Experience with monitoring and logging tools (e.g., Azure Monitor, CloudWatch).
Exposure to containerization and orchestration technologies such as Docker and Kubernetes.
Understanding of data ingestion frameworks and pipeline orchestration tools like Apache Airflow or Azure Data Factory.
Location: IND:AP:Hyderabad / Argus Building 4F & 5F, Sattva, Knowledge City
Job ID: R-67994-1
Date posted: 07/25/2025
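By way of illustration, a minimal PySpark sketch of the Bronze-to-Silver promotion this posting describes; the paths, schema, and column names are hypothetical placeholders, not details from the role:

```python
# Minimal sketch of a Bronze -> Silver promotion in the Medallion pattern.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-bronze-to-silver").getOrCreate()

# Bronze: raw ingested records, kept as-is.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/orders")

# Silver: cleaned, deduplicated, and conformed.
silver = (
    bronze
    .dropDuplicates(["order_id"])                        # remove replayed events
    .filter(F.col("order_id").isNotNull())               # drop malformed rows
    .withColumn("order_ts", F.to_timestamp("order_ts"))  # enforce types
)

silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/orders")
```

A Gold layer would typically follow the same pattern, aggregating Silver tables into business-level marts.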
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Hiring for GCP Data Engineer - Chennai
Key Responsibilities:
Design and implement scalable and secure data pipelines using GCP-native tools such as Cloud Dataflow, Cloud Dataproc, BigQuery, and Cloud Composer
Build and manage ETL/ELT workflows to ingest data from various sources (structured and unstructured)
Optimize and monitor data pipelines for performance and cost
Ensure data quality and integrity across the data lifecycle
Collaborate with data scientists, analysts, and application teams on data access and transformation needs
Implement CI/CD for data engineering pipelines using tools like Cloud Build or Jenkins
Develop and maintain data models and schemas (dimensional or normalized)
Document data flows, architecture, and metadata for internal usage
Apply best practices in data security, access control, and compliance (e.g., GDPR, HIPAA)
Required Skills & Qualifications:
3+ years of experience in data engineering, with at least 2+ years on GCP
Strong knowledge of and hands-on experience with GCP services: BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Functions
Proficiency in SQL, Python, and Java or Scala (preferred)
Experience with orchestration tools such as Apache Airflow or Cloud Composer
Experience with real-time data streaming and batch data processing
Good understanding of data warehousing, data lake architecture, and data governance
Familiarity with Terraform and Infrastructure as Code (IaC) is a plus
GCP certifications such as Professional Data Engineer or Associate Cloud Engineer are desirable
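As a hedged illustration of the Dataflow work this posting lists, here is a minimal Apache Beam pipeline in Python that reads raw JSON from Cloud Storage and appends rows to BigQuery; the project, bucket, dataset, and field names are invented for the example:

```python
# Minimal Apache Beam sketch of a Dataflow-style pipeline:
# read raw files, filter, and load into an existing BigQuery table.
# Project, bucket, dataset, and field names are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    flags=[],
    runner="DataflowRunner",
    project="my-gcp-project",
    region="asia-south1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read raw JSON" >> beam.io.ReadFromText("gs://my-bucket/raw/events-*.json")
        | "Parse" >> beam.Map(json.loads)
        | "Keep valid rows" >> beam.Filter(lambda row: row.get("user_id") is not None)
        | "Write to BigQuery" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```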
Posted 1 week ago
6.0 - 11.0 years
8 - 12 Lacs
Hyderabad
Work from Office
TBD
Bachelor's degree in computer science or a related field, or equivalent practical experience
6+ years of proven experience in the software development industry, working in collaborative team environments
6+ years of experience using automation tools such as Selenium WebDriver with programming languages like Python, C#, or Java
5+ years of hands-on experience testing and automating web services, including RESTful APIs
3+ years of experience in performance testing using tools such as Apache JMeter
2+ years of experience in mobile web application testing automation using Appium
Strong experience with object-oriented programming languages such as Java and C#/.NET
Good to Have - Experience working with CI/CD technologies such as Bamboo, Bitbucket, Octopus Deploy, and Maven
Working knowledge of API testing tools and languages such as RestAssured and JavaScript
Solid understanding of software engineering best practices across the full software development lifecycle, including coding standards, code reviews, source control, build processes, testing, and operations
Experience or familiarity with AWS cloud services
Strong written and verbal communication skills
Proven ability to learn new technologies and adapt in a dynamic environment
Familiarity with Atlassian tools, including Jira and Confluence
Working knowledge of Agile methodologies, particularly Scrum
Experience operating in a Continuous Integration (CI) environment
Responsibilities:
Provide leadership, mentorship, and guidance to business analysts and QA team members on manual and automated testing
Collaborate with product owners and business analysts to ensure user stories are well-defined, testable, and include measurable acceptance criteria
Identify edge cases, risks, and requirement gaps early in the planning process to strengthen story quality and test coverage
Participate in Agile ceremonies and sprint planning to advocate for quality throughout the development lifecycle
Define and execute test strategies, including manual test cases, automated scripts, and scenarios for web and API testing
Develop and maintain automated test suites and ensure effective integration into the CI/CD pipeline
Identify, track, and ensure timely resolution of defects, including root cause analysis and process improvement
Continuously enhance QA processes, tools, and standards to improve efficiency and product quality
Collaborate across QA, development, and product teams to align on quality goals, timelines, and delivery expectations
Support User Acceptance Testing (UAT) and incorporate customer feedback to ensure a high-quality release
Ensure the final product meets user expectations for functionality, performance, and usability
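For context, a minimal Python Selenium WebDriver sketch of the kind of UI automation the posting calls for; the URL and element locators are invented for illustration:

```python
# Minimal Selenium WebDriver smoke check; URL and locators are
# hypothetical placeholders for illustration only.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")
    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "submit").click()

    # Wait for the post-login header to render, then assert on it.
    header = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.CSS_SELECTOR, "h1.dashboard"))
    )
    assert "Dashboard" in header.text
finally:
    driver.quit()
```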
Posted 1 week ago
4.0 - 9.0 years
5 - 9 Lacs
Hyderabad
Work from Office
TBD
Bachelor's degree in computer science or a related field, or equivalent experience
4+ years of proven experience in the software development industry, working in collaborative team environments
4+ years of experience using automation tools such as Selenium WebDriver with programming languages like Python, C#, or Java
3+ years of hands-on experience testing and automating web services, including RESTful APIs
2+ years of experience in performance testing using tools such as Apache JMeter
Strong written and verbal communication skills
Good to Have - Experience in CI/CD technologies such as Bamboo, Bitbucket, Octopus Deploy, and Maven
Working knowledge of API testing tools such as RestAssured
Knowledge of software engineering best practices across the full software development lifecycle, including coding standards, code reviews, source control, build processes, testing, and operations
Experience or familiarity with AWS cloud services
Familiarity with Atlassian tools, including Jira and Confluence
Working knowledge of Agile methodologies, particularly Scrum
Experience operating in a Continuous Integration (CI) environment
Responsibilities:
Develop and implement software testing strategies, plans, and procedures
Define and execute test strategies, including manual test cases, automated scripts, and scenarios for web and API testing
Identify, track, and ensure timely resolution of defects, including root cause analysis and process improvement
Collaborate across development and product teams to align on quality goals, timelines, and delivery expectations
Participate in Agile ceremonies and sprint planning to advocate for quality throughout the development lifecycle
Continuously enhance QA processes, tools, and standards to improve efficiency and product quality
Identify edge cases, risks, and requirement gaps early in the planning process to strengthen story quality and test coverage
Develop and maintain automated test suites to ensure consistent and reliable software quality
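Likewise hedged, a minimal pytest-plus-requests sketch of the automated RESTful API checks this posting describes; the base URL, endpoint, and response fields are hypothetical:

```python
# Minimal REST API test sketch with pytest and requests.
# Base URL, endpoint, and expected fields are hypothetical placeholders.
import requests

BASE_URL = "https://api.example.com"


def test_get_user_returns_expected_shape():
    resp = requests.get(f"{BASE_URL}/v1/users/42", timeout=10)
    assert resp.status_code == 200

    body = resp.json()
    assert body["id"] == 42
    assert "email" in body


def test_unknown_user_returns_404():
    resp = requests.get(f"{BASE_URL}/v1/users/999999", timeout=10)
    assert resp.status_code == 404
```

Run with `pytest` so failures surface in the CI pipeline the posting mentions.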
Posted 1 week ago
1.0 - 2.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Our systems and applications are highly available and exceptionally performant. We are in a business that touches everyone's pocket whenever a card is used for a transaction. Those transactions complete within a fraction of a second, and thousands of them happen concurrently every second. Our service is not limited to one region; we operate in 200+ countries.
What are the responsibilities of a Product / Site Reliability Engineer?
The primary responsibility of a Site Reliability Engineer is to ensure that the environment is secure and safe. All security findings should be remediated within the required resolution date defined by governance. We do not allow outages, even for a second. If any issue arises, as the owner of the environment, we take the necessary steps to ensure those environments are up and running. Root cause analysis should be completed within hours. We ensure that findings are remediated in the production environment only after all tests and checks in lower environments. As the owner of the environment, we keep track of all activities planned or happening in our environments. We are responsible for deploying new code in the environment. We regularly monitor and analyze our environment. If there is a manual task, we automate it. We are increasing self-heal capabilities and will continue to do so until environments become auto-healing. If a new service is coming under our support, or if an old environment is migrating to new technologies, we start interacting with product developers to plan for production. As our business operates around the clock, we work in shifts and synchronize with multiple locations and multiple tracks (sub-teams). We ensure that every activity is recorded according to the incident or change management process. Technical and related runbooks need to be prepared and shared with the team.
Basic Qualifications: Bachelor's degree with 1-2 years of relevant work experience
Preferred Qualifications: B.E/B.Tech in IT or Computer Science
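To make the "if there is a manual task, we automate it" idea concrete, here is a minimal, hypothetical Python sketch of a self-heal loop: poll a health endpoint and restart a service after repeated failures. The endpoint URL and restart command are placeholders, not the team's actual tooling:

```python
# Hypothetical self-heal loop: poll a health endpoint, restart the
# service after repeated failures. URL and command are placeholders.
import subprocess
import time

import requests

HEALTH_URL = "https://internal.example.com/healthz"
RESTART_CMD = ["systemctl", "restart", "payments-gateway"]
MAX_FAILURES = 3

failures = 0
while True:
    try:
        ok = requests.get(HEALTH_URL, timeout=5).status_code == 200
    except requests.RequestException:
        ok = False

    failures = 0 if ok else failures + 1
    if failures >= MAX_FAILURES:
        # Self-heal step: restart, then log the action so it can be
        # recorded under the incident/change process the posting requires.
        subprocess.run(RESTART_CMD, check=False)
        print("service restarted after repeated failed health checks")
        failures = 0
    time.sleep(30)
```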
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role: Azure Data Engineer
Location: Bangalore | Hyderabad | Noida | Gurgaon | Pune
Experience: 5+ Years
Notice Period: Immediate Joiners Only
Job Description:
Required Skills & Qualifications:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
5-8 years of experience in data engineering with a focus on Azure cloud technologies
Strong proficiency in Azure Databricks, Apache Spark, and PySpark
Hands-on experience with Azure Synapse Analytics, including dedicated SQL pools and serverless SQL
Proficiency in SQL, Python, and ETL/ELT processes
Experience with Azure Data Factory, Azure Data Lake, and Azure Blob Storage
Familiarity with data governance, security, and compliance in cloud environments
Excellent problem-solving and communication skills
Preferred Qualifications:
Azure certifications such as Azure Data Engineer Associate (DP-203)
Experience with Delta Lake, MLflow, and Power BI integration
Knowledge of DevOps practices and tools (e.g., Git, Azure DevOps)
Experience in Agile/Scrum environments
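A minimal PySpark sketch, for illustration only, of the kind of Databricks ETL step this role involves: read raw CSVs from Azure Data Lake Storage, aggregate, and write a Delta table. The storage account, container, and column names are hypothetical, and the cluster is assumed to already have ADLS credentials configured:

```python
# Minimal Azure Databricks ETL sketch: ADLS CSV -> aggregated Delta table.
# Storage account, containers, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls-to-delta").getOrCreate()

raw_path = "abfss://raw@mystorageacct.dfs.core.windows.net/sales/2025/"
curated_path = "abfss://curated@mystorageacct.dfs.core.windows.net/sales_daily/"

df = spark.read.option("header", "true").csv(raw_path)

daily = (
    df.withColumn("amount", F.col("amount").cast("double"))  # enforce numeric type
      .groupBy("sale_date", "region")
      .agg(F.sum("amount").alias("total_amount"))
)

daily.write.format("delta").mode("overwrite").save(curated_path)
```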
Posted 1 week ago
6.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Job Description
What We Do
At Goldman Sachs, we connect people, capital and ideas to help solve problems for our clients. We are a leading global financial services firm providing investment banking, securities and investment management services to a substantial and diversified client base that includes corporations, financial institutions, governments and individuals.
Investment Banking
Goldman Sachs Investment Banking (IB) works on some of the most complex financial challenges and transactions in the market today. Whether advising on a merger, providing financial solutions for an acquisition, or structuring an initial public offering, we handle projects that help clients at major milestones. We work with corporations, pension funds, financial sponsors, and governments, and are a team of strong analytical thinkers with a passion for producing out-of-the-box ideas. The Goldman Sachs Group, Inc. is a leading global investment banking, securities and investment management firm that provides a wide range of financial services to a substantial and diversified client base that includes corporations, financial institutions, governments, and individuals. Founded in 1869, the firm is headquartered in New York and maintains offices in all major financial centers around the world.
Who We Look For
You are a proven full stack engineer. You are not only strong technically; you have shown that you can work effectively with product managers, designers and other engineering teams. You have a fierce sense of ownership, caring deeply about the quality of everything that you deliver into your clients’ hands. You love the challenge of engineering and are confident in your ability to bring clarity and direction to ambiguous problem spaces. You work well in a fast-paced environment while investing deeply in long-term quality and efficiency.
Basic Qualifications
6-8 years of hands-on development experience in Core Java (Java 11-21) or Python, and proficiency in backend technologies such as databases (SQL/NoSQL), Elasticsearch, MongoDB, the Spring framework, REST, GraphQL, Hibernate, etc.
Experience with front-end development using React, Redux, Vue, TypeScript, and/or similar frameworks.
Demonstrated experience operating in a fast-paced Agile/Scrum setup with a global/remote team.
Knowledge of developing and deploying applications in public cloud (AWS, GCP or Azure) or K8s.
Experience with implementing unit tests, integration tests, and Test Driven Development.
Strong development, analytical and problem-solving skills.
Knowledge of prompt engineering, LLMs, AI agents, etc.
Preferred Qualifications
Excellent communication skills and experience working directly with business stakeholders.
Data modeling, warehousing (Snowflake, AWS Glue/EMR, Apache Spark) and a strong understanding of data engineering practices.
Technical, team or project leadership experience.
Some experience using Infrastructure-as-Code tools (e.g. AWS CDK, Terraform, CloudFormation).
Experience with reactive, event-based architectures.
Goldman Sachs Engineering Culture
At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Founded in 1869, we are a leading global investment banking, securities and investment management firm. Headquartered in New York, we maintain offices around the world. We believe who you are makes you better at what you do. 
We're committed to fostering and advancing diversity and inclusion in our own workplace and beyond by ensuring every individual within our firm has a number of opportunities to grow professionally and personally, from our training and development opportunities and firmwide networks to benefits, wellness and personal finance offerings and mindfulness programs. Learn more about our culture, benefits, and people at GS.com/careers. We’re committed to finding reasonable accommodations for candidates with special needs or disabilities during our recruiting process. Learn more: https://www.goldmansachs.com/careers/footer/disability-statement.html © The Goldman Sachs Group, Inc., 2025. All rights reserved. Goldman Sachs is an equal employment/affirmative action employer Female/Minority/Disability/Veteran/Sexual Orientation/Gender Identity
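As a hedged sketch of the full stack expectations above — REST services plus test-driven development — here is a minimal Python example pairing a FastAPI endpoint with a unit test. The posting lists several acceptable stacks, and the resource names here are invented:

```python
# Minimal REST endpoint plus unit test, in the TDD spirit the posting
# mentions. FastAPI is one possible stack; names are hypothetical.
from fastapi import FastAPI, HTTPException
from fastapi.testclient import TestClient

app = FastAPI()

CLIENTS = {1: {"id": 1, "name": "Acme Corp"}}


@app.get("/clients/{client_id}")
def get_client(client_id: int):
    client = CLIENTS.get(client_id)
    if client is None:
        raise HTTPException(status_code=404, detail="client not found")
    return client


# Test written against the contract first; implementation satisfies it.
def test_get_client():
    tc = TestClient(app)
    assert tc.get("/clients/1").json()["name"] == "Acme Corp"
    assert tc.get("/clients/99").status_code == 404
```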
Posted 1 week ago
5.0 years
0 Lacs
Greater Bengaluru Area
On-site
What if the work you did every day could impact the lives of people you know? Or all of humanity? At Illumina, we are expanding access to genomic technology to realize health equity for billions of people around the world. Our efforts enable life-changing discoveries that are transforming human health through the early detection and diagnosis of diseases and new treatment options for patients. Working at Illumina means being part of something bigger than yourself. Every person, in every role, has the opportunity to make a difference. Surrounded by extraordinary people, inspiring leaders, and world-changing projects, you will do more and become more than you ever thought possible.
Position Summary
We are seeking a highly skilled Senior Data Engineer Developer with 5+ years of experience to join our talented team in Bangalore. In this role, you will be responsible for designing, implementing, and optimizing data pipelines, ETL processes, and data integration solutions using Python, Spark, SQL, Snowflake, dbt, and other relevant technologies. Additionally, you will bring strong domain expertise in operations organizations, with a focus on supply chain and manufacturing functions. If you're a seasoned data engineer with a proven track record of delivering impactful data solutions in operations contexts, we want to hear from you.
Responsibilities
Lead the design, development, and optimization of data pipelines, ETL processes, and data integration solutions using Python, Spark, SQL, Snowflake, dbt, and other relevant technologies.
Apply strong domain expertise in operations organizations, particularly in functions like supply chain and manufacturing, to understand data requirements and deliver tailored solutions.
Utilize big data processing frameworks such as Apache Spark to process and analyze large volumes of operational data efficiently.
Implement data transformations, aggregations, and business logic to support analytics, reporting, and operational decision-making.
Leverage cloud-based data platforms such as Snowflake to store and manage structured and semi-structured operational data at scale.
Utilize dbt (Data Build Tool) for data modeling, transformation, and documentation to ensure data consistency, quality, and integrity.
Monitor and optimize data pipelines and ETL processes for performance, scalability, and reliability in operations contexts.
Conduct data profiling, cleansing, and validation to ensure data quality and integrity across different operational data sets.
Collaborate closely with cross-functional teams, including operations stakeholders, data scientists, and business analysts, to understand operational challenges and deliver actionable insights.
Stay updated on emerging technologies and best practices in data engineering and operations management, contributing to continuous improvement and innovation within the organization.
All listed requirements are deemed essential functions of this position; however, business conditions may require reasonable accommodations for additional tasks and responsibilities.
Preferred Experience/Education/Skills
Bachelor's degree in Computer Science, Engineering, Operations Management, or a related field.
5+ years of experience in data engineering, with proficiency in Python, Spark, SQL, Snowflake, dbt, and other relevant technologies.
Strong domain expertise in operations organizations, particularly in functions like supply chain and manufacturing. 
Strong domain expertise in life sciences manufacturing equipment, with a deep understanding of industry-specific challenges, processes, and technologies.
Experience with big data processing frameworks such as Apache Spark and cloud-based data platforms such as Snowflake.
Hands-on experience with data modeling, ETL development, and data integration in operations contexts.
Familiarity with dbt (Data Build Tool) for managing data transformation and modeling workflows.
Familiarity with reporting and visualization tools like Tableau, Power BI, etc.
Good understanding of advanced data engineering and data science practices and technologies like PySpark, SageMaker, Cloudera, MLflow, etc.
Experience with SAP, SAP HANA, and Teamcenter applications is a plus.
Excellent problem-solving skills, analytical thinking, and attention to detail.
Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams and operations stakeholders.
Eagerness to learn and adapt to new technologies and tools in a fast-paced environment.
We are a company deeply rooted in belonging, promoting an inclusive environment where employees feel valued and empowered to contribute to our mission. Built on a strong foundation, Illumina has always prioritized openness, collaboration, and seeking alternative perspectives to propel innovation in genomics. We are proud to confirm a zero-net gap in pay, regardless of gender, ethnicity, or race. We also have several Employee Resource Groups (ERG) that deliver career development experiences, increase cultural awareness, and offer opportunities to engage in social responsibility. We are proud to be an equal opportunity employer committed to providing employment opportunity regardless of sex, race, creed, color, gender, religion, marital status, domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition, sexual orientation, pregnancy, military or veteran status, citizenship status, and genetic information. Illumina conducts background checks on applicants for whom a conditional offer of employment has been made. Qualified applicants with arrest or conviction records will be considered for employment in accordance with applicable local, state, and federal laws. Background check results may potentially result in the withdrawal of a conditional offer of employment. The background check process and any decisions made as a result shall be made in accordance with all applicable local, state, and federal laws. Illumina prohibits the use of generative artificial intelligence (AI) in the application and interview process. If you require accommodation to complete the application or interview process, please contact accommodations@illumina.com. To learn more, visit: https://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf. The position will be posted until a final candidate is selected or the requisition has a sufficient number of qualified applicants. This role is not eligible for visa sponsorship.
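For illustration, a minimal sketch of querying curated operations data with the Snowflake Python connector, one of the technologies this posting names; the account, credentials, and table names are hypothetical placeholders:

```python
# Minimal Snowflake Python connector sketch; account, credentials,
# warehouse, and table names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="etl_user",
    password="***",  # placeholder; use a secrets manager in practice
    account="myorg-myaccount",
    warehouse="ANALYTICS_WH",
    database="OPERATIONS",
    schema="SUPPLY_CHAIN",
)
try:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT plant_id, SUM(units_produced) AS total_units
        FROM manufacturing_output
        WHERE run_date >= DATEADD(day, -7, CURRENT_DATE)
        GROUP BY plant_id
        """
    )
    for plant_id, total_units in cur.fetchall():
        print(plant_id, total_units)
finally:
    conn.close()
```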
Posted 1 week ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About The Position
The solution architect role involves designing and overseeing the implementation of home-grown and commercial-off-the-shelf IT solutions using Microsoft Azure and their integration with various systems and applications. The role involves collaborating with stakeholders, evaluating technology options, and guiding development teams to deliver scalable, secure and efficient solutions. The expectation for this role is 8+ years of relevant experience.
Key Responsibilities
Architectural Leadership: Serve as the Azure Solution Architect with specific sector knowledge, setting the direction for cloud architecture and ensuring alignment with the organization's technical strategy and O&G industry standards. Uphold industry best practices and standards specific to the O&G sector.
Technology Roadmap: Construct and continuously update an Azure-focused technology roadmap, aligning with the organization's long-term goals. Explore and identify cutting-edge Azure services and features that can propel technological advancement. Strategically plan and implement upgrades to bolster the organization's competitive position and enhance the scalability of Azure-based solutions.
Solution Design: Take the lead in designing and architecting complex Azure solutions, with a strong focus on scalability, robust security, cost-effectiveness, and alignment with the nuanced demands of the O&G industry.
Stakeholder Engagement: Work in tandem with various service lines, such as engineering divisions and business stakeholders, to align Azure architectural strategies with the core business objectives and ensure the designs are in sync with the business's forward direction. Possess the ability to effectively communicate Azure technical strategies to non-technical stakeholders, thereby facilitating their participation in informed decision-making.
Mentorship and Guidance: Offer Azure technical leadership and mentorship to solution squads. Cultivate an environment of innovation, continuous improvement, and technical prowess across the organization.
Compliance and Best Practices: Guarantee Azure solutions meet regulatory demands and O&G-specific standards, including those related to safety, environment, and operations.
Risk Assessment: Proactively identify and assess technical risks linked with Azure infrastructure and applications. Collaborate with multifaceted teams to formulate and implement measures to alleviate the detected risks. As a Solution Architect, it is crucial to pinpoint potential risks during the solution development phase and devise comprehensive risk mitigation plans for all solutions crafted.
Industry Expertise: Stay informed about emerging technologies, trends, and standards in the oil and gas industry. Evaluate the potential impact of new technologies and provide recommendations for adoption, both in upcoming solution designs and in enhancements to existing solution architectures.
Vendor Management: Engage with external vendors and technology associates to scrutinize third-party offerings compatible with Azure. Integrate these third-party solutions effortlessly, ensuring they complement and reinforce the broader Azure architectural strategy and the business objectives.
Required Qualifications
Master's or Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field (or equivalent experience).
8+ years of prior experience as a solution architect, preferably in the oil and gas industry, or 8+ years of prior experience as a software engineer or similar role. 
Extensive experience in the oil and gas sector, including knowledge of industry-specific challenges and opportunities.
Technical Expertise: An in-depth understanding of Azure services and architecture, including IaaS, PaaS, and SaaS solutions. The architect should be adept at designing and implementing complex cloud infrastructure, ensuring scalability, reliability, and security, and should also have strong experience with multi-cloud environments such as AWS and GCP, including application migration and management, specifically in O&G.
Advanced Problem-Solving: Strong analytical skills to troubleshoot and resolve high-level technical issues, including the ability to perform root cause analysis and implement long-term solutions to prevent recurrence.
Strategic Planning: The architect should be capable of developing strategic plans for cloud adoption and migration, aligning with the organization's goals. They should be able to evaluate and recommend new technologies and approaches to drive continuous improvement.
Communication Skills: Excellent communication skills are essential for translating technical concepts to non-technical stakeholders, facilitating clear and effective discussions between cross-functional teams, and presenting proposals and progress reports to senior management.
Client Engagement: Capable of working closely with internal clients to understand their business requirements and constraints, ensuring that the cloud solutions designed meet their needs and expectations.
Innovation: With extensive experience, they should be at the forefront of innovation, exploring new cloud technologies and methodologies, and integrating them into the organization's practices to gain competitive advantage.
Leadership and Mentorship: As a seasoned leader, the Senior Solution Architect should set the technical direction and make pivotal decisions that define the organization's cloud strategy, while also serving as a mentor to uplift junior architects and engineers. They must lead by example, inspiring teams through complex initiatives and fostering professional growth by imparting knowledge, best practices, and constructive feedback to nurture the next generation of technical experts.
Preferred Qualifications
Master's or Bachelor's degree in computer science, engineering, information technology, or a relevant field.
Relevant certifications such as Microsoft Certified: Azure Solutions Architect Expert or similar.
Microsoft AZ-900 and AZ-305 certifications.
TOGAF, ArchiMate, Zachman, or equivalent architecture frameworks experience.
Experience in automation using Python, Gen AI, AIOps, etc.
Experience with data integration, data warehousing, and big data technologies.
Experience with containerization and orchestration tools (e.g., any two of: Docker, OpenShift, Kubernetes, ECS, GKE, AKS, EKS, Rancher, Apache Mesos, Nomad, Docker Swarm).
Understanding of the O&G sector's operational workflows, including the intricacies of exploration, extraction, refining, and distribution activities, to tailor cloud-based solutions that complement the industry's unique needs.
Competence in tackling technical hurdles specific to the O&G domain, such as efficient asset management in isolated areas, processing extensive seismic datasets, and ensuring compliance with strict regulatory frameworks. 
Proficiency in leveraging Azure cloud technologies to enhance the O&G industry's operational effectiveness, utilizing tools like IoT, advanced data analytics, and machine learning for better results.
Experience with CI/CD pipelines and automated testing frameworks (e.g., CircleCI, Jenkins, TeamCity, Travis CI, Bamboo, Bitbucket, etc.).
Strong interpersonal skills with the ability to engage effectively with both technical and non-technical stakeholders.
Chevron ENGINE supports global operations and business requirements across the world. Accordingly, the work hours for employees will be aligned to support business requirements. The standard work week will be Monday to Friday. Working hours are 8:00am to 5:00pm or 1:30pm to 10:30pm. Chevron participates in E-Verify in certain locations as required by law.
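As a small, hypothetical example of the Python-based automation this posting's preferred qualifications mention, the sketch below uses the Azure SDK to flag resource groups missing a governance tag; the subscription id and tag name are placeholders:

```python
# Minimal Azure SDK governance sketch: flag resource groups missing a
# cost-centre tag. Subscription id and tag name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()
client = ResourceManagementClient(
    credential, "00000000-0000-0000-0000-000000000000"
)

# Report resource groups without the tag, a common compliance check.
for rg in client.resource_groups.list():
    tags = rg.tags or {}
    if "cost_centre" not in tags:
        print(f"{rg.name} ({rg.location}) is missing the cost_centre tag")
```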
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Company Overview
Docusign brings agreements to life. Over 1.5 million customers and more than a billion people in over 180 countries use Docusign solutions to accelerate the process of doing business and simplify people’s lives. With intelligent agreement management, Docusign unleashes business-critical data that is trapped inside of documents. Until now, these were disconnected from business systems of record, costing businesses time, money, and opportunity. Using Docusign’s Intelligent Agreement Management platform, companies can create, commit to, and manage agreements with solutions created by the #1 company in e-signature and contract lifecycle management (CLM).
What you'll do
As a Data Engineer in the IAM Data Lake Team, you'll develop cutting-edge infrastructure for big data analytics on public cloud platforms, supporting high-concurrency customer-facing products. You should have independent experience building data lakes and warehouses, preferably on Azure. This role requires a hands-on approach to solving complex problems, collaborating with cross-functional teams, and continuously improving engineering practices. You'll work closely with product managers, developers, and platform engineers to deliver high-quality, timely product releases, focusing on features, performance, security, and accessibility. This position is an individual contributor role reporting to the Director of Engineering.
Responsibilities
Drive design, implementation, testing and release of products
Build big data pipelines and analytics infrastructure on Azure with Data Factory, Databricks, Event Hub, Data Explorer, Cosmos DB and Azure RDBMS platforms
Build secure networking and reliable infrastructure for High Availability and Disaster Recovery
Build big data streaming solutions with hundreds of concurrent publishers and subscribers
Collaborate closely with Product, Design, and Engineering teams to build new features
Participate in an Agile environment using Scrum software development practices, code review, automated unit testing, end-to-end testing, continuous integration, and deployment
Think about how to solve problems at scale and build fault-tolerant systems that leverage telemetry and metrics
Investigate, fix, and maintain code as needed for production issues
Operate a high-reliability, high-uptime service and participate in the on-call rotation
Job Designation
Hybrid: Employee divides their time between in-office and remote work. Access to an office location is required. (Frequency: minimum 2 days per week; may vary by team, but there will be a weekly in-office expectation.) Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote and are specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law. 
What you bring
Basic
BS degree in Computer Science, Engineering or equivalent
5+ years of experience within a software engineering related field
2+ years of experience with Azure data services: Azure Data Explorer, Azure Data Factory, Databricks, Event Hubs
2+ years of experience with data warehousing/data modeling on an RDBMS like SQL Server
Proficiency in cloud platform deployment with Azure ARM templates or Terraform
Experience using Git or other version control systems and CI/CD systems
Focus on writing high-quality code that is easy for others to maintain
Strong understanding of and experience in agile methodologies
Preferred
Experience with Cosmos DB or NoSQL platforms
Experience building large data lakes and data warehouses
Proficiency in modern server-side development using languages like C# or others
Experience building cloud apps on Azure
Strong interest or documented experience in large-scale microservice architectures on Kubernetes
Proficiency in big data processing in Apache Spark with Python or Scala
Proficiency in data streaming applications with Event Hub/Kafka and Spark streaming
Proficiency in data pipeline orchestration with Data Factory or similar
A track record of being a self-starter - individual/team responsibility is our main driver in the development work
Life at Docusign
Working here
Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what’s right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life. Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you’ll be loved by us, our customers, and the world in which we live.
Accommodation
Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need such an accommodation, or a religious accommodation, during the application process, please contact us at accommodations@docusign.com. If you experience any issues, concerns, or technical difficulties during the application process please get in touch with our Talent organization at taops@docusign.com for assistance.
Applicant and Candidate Privacy Notice
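A hedged illustration of the streaming work described above: a minimal PySpark Structured Streaming job reading from Azure Event Hubs through its Kafka-compatible endpoint and landing raw payloads in Delta. The namespace, topic, paths, and connection string are placeholders, and the spark-sql-kafka package is assumed to be available on the cluster:

```python
# Minimal Structured Streaming sketch: Event Hubs (Kafka endpoint) -> Delta.
# Namespace, topic, connection string, and paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("eventhub-stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "mynamespace.servicebus.windows.net:9093")
    .option("subscribe", "agreement-events")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        'username="$ConnectionString" password="<event-hubs-connection-string>";',
    )
    .option("startingOffsets", "latest")
    .load()
)

# Persist raw payloads to a Delta landing zone; the checkpoint enables
# recovery without reprocessing.
query = (
    stream.selectExpr("CAST(value AS STRING) AS payload", "timestamp")
    .writeStream.format("delta")
    .option("checkpointLocation", "/mnt/lake/_checkpoints/agreement-events")
    .start("/mnt/lake/bronze/agreement_events")
)
query.awaitTermination()
```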
Posted 1 week ago
7.0 years
0 Lacs
India
Remote
Who We Are
At Twilio, we’re shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences. Our dedication to remote-first work, and strong culture of connection and global inclusion means that no matter your location, you’re part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we’re acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands.
See Yourself at Segment
Join us as our next Staff Data Engineer (L4) on the Segment Data Platform team.
About The Job
As a Staff Data Engineer, you will play a key role in building and maintaining data infrastructure that processes large-scale datasets efficiently and reliably. You’ll contribute to the design and implementation of high-volume pipelines, collaborate with engineers across teams, and help ensure our platform remains robust, scalable, and easy to use. This is a great role for someone with a strong data engineering background who’s ready to step into broader responsibilities and help shape the evolution of Segment’s platform.
Responsibilities
In this role, you will:
Design and build the next generation of the Warehouse Activation platform, process billions of events, and power various use cases for Twilio Segment customers. This encompasses working on stream data processing, storage, and other mission-critical systems.
Ship features that opt for high availability and throughput with eventual consistency
Collaborate with engineering and product leads, as well as teams across Twilio Segment
Support the reliability and security of the platform
Build and optimize globally available and highly scalable distributed systems
Be able to act as a team Tech Lead as needed
Mentor other engineers on the team in technical architecture and design
Partner with application teams to deliver end-to-end customer success
Qualifications
Twilio values diverse experiences from all kinds of industries, and we encourage everyone who meets the required qualifications to apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!
Required
7+ years of industry experience in backend or data engineering roles.
Strong programming skills in Scala, Java, or a similar language.
Solid experience with Apache Spark or other distributed data processing frameworks.
Experience with Trino, Snowflake, and Delta Lake, and comfort working with ecommerce-scale datasets.
Working knowledge of batch and stream processing architectures.
Experience designing, building, and maintaining ETL/ELT pipelines in production.
Familiarity with AWS and tools like Parquet, Delta Lake, or Kafka.
Comfortable operating in a CI/CD environment with infrastructure-as-code and observability tools.
Strong collaboration and communication skills.
Nice To Have
Familiarity with GDPR, CCPA, or other data governance requirements.
Experience with high-scale event processing or identity resolution.
Exposure to multi-region, fault-tolerant distributed systems.
Location: This role will be remote and based in India (Karnataka, Maharashtra, New Delhi, Tamil Nadu, and Telangana).
Travel
We prioritize connection and opportunities to build relationships with our customers and each other. 
For this role, you may be required to travel occasionally to participate in project or team in-person meetings. What We Offer Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location. Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values — something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn't what you're looking for, please consider other open positions. Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.
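For a concrete flavour of the batch side of this role, a minimal PySpark sketch that deduplicates replayed events and writes partitioned Parquet; the bucket paths and column names are hypothetical placeholders:

```python
# Minimal batch step in an event pipeline: deduplicate replayed events,
# then write partitioned Parquet. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-dedup").getOrCreate()

events = spark.read.parquet("s3://segment-example/raw/events/")

# Keep only the latest record per event_id, ordered by ingestion time.
w = Window.partitionBy("event_id").orderBy(F.col("ingested_at").desc())
deduped = (
    events.withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn")
)

deduped.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://segment-example/clean/events/"
)
```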
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Intermediate Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.
Responsibilities:
Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code
Consult with users, clients, and other technology groups on issues, recommend programming solutions, and install and support customer exposure systems
Apply fundamental knowledge of programming languages for design specifications
Analyze applications to identify vulnerabilities and security issues, as well as conduct testing and debugging
Serve as advisor or coach to new or lower-level analysts
Identify problems, analyze information, and make evaluative judgements to recommend and implement solutions
Resolve issues by identifying and selecting solutions through the application of acquired technical experience, guided by precedents
Operate with a limited level of direct supervision, exercising independence of judgement and autonomy; act as SME to senior stakeholders and/or other team members
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency
Qualifications:
5+ years of proven experience in developing and managing big data solutions using Apache Spark; Scala is a must
A strong hold on Spark Core, Spark SQL, and Spark Streaming
Strong programming skills in Scala, Java, or Python
Hands-on experience with technologies like Apache Hive, Apache Kafka, HBase, Couchbase, Sqoop, Flume, etc.
Proficiency in SQL and experience with relational databases (Oracle/PL-SQL)
Experience in working on Kafka and JMS/MQ applications
Experience working with multiple operating systems (Unix, Linux, Windows)
Familiarity with data warehousing concepts and ETL processes
Experience in performance tuning of large technical solutions with significant volumes
Knowledge of data modeling, data architecture, and data integration techniques
Knowledge of best practices for data security, privacy, and compliance
Experience with Java (Core Java, J2EE, Spring Boot RESTful services), web services (REST, SOAP), XML, JavaScript, microservices, SOA, etc.
Strong technical knowledge of Apache Spark, Hive, SQL, and the Hadoop ecosystem
Experience with developing frameworks and utility services, including logging/monitoring
Experience delivering high-quality software following continuous delivery and using code quality tools (JIRA, GitHub, Jenkins, Sonar, etc.)
Experience creating large-scale, multi-tiered, distributed applications with Hadoop and Spark
Deep knowledge of implementing different data storage solutions such as RDBMS (Oracle), Hive, HBase, Impala, and NoSQL databases
Education:
Bachelor’s degree/University degree or equivalent experience
This job description provides a high-level review of the types of work performed. 
Other job-related duties may be assigned as required. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
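As a hedged illustration of the Kafka work this posting lists, a minimal kafka-python consumer; the brokers, topic, group id, and message fields are invented for the example:

```python
# Minimal kafka-python consumer sketch; brokers, topic, group id, and
# message fields are hypothetical placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "trade-events",
    bootstrap_servers=["broker1:9092", "broker2:9092"],
    group_id="risk-etl",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Downstream, records like this would be enriched and persisted to
    # Hive/HBase as the posting describes; here we just print them.
    print(message.topic, message.partition, message.offset, event.get("trade_id"))
```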
Posted 1 week ago