
10949 Apache Jobs - Page 43

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Position Summary

We are seeking an Apache Hadoop Subject Matter Expert (SME) who will be responsible for designing, optimizing, and scaling Spark-based data processing systems. This role requires hands-on experience with Spark architecture and core functionality, with a focus on building resilient, high-performance distributed data systems. You will collaborate with engineering teams to deliver high-throughput Spark applications and solve complex data challenges in real-time processing, big data analytics, and streaming. If you are passionate about working in fast-paced, dynamic environments and want to be at the cutting edge of data solutions, this role is for you.

We're Looking For Someone Who Can:
* Design and optimize distributed Spark-based applications, ensuring low-latency, high-throughput performance for big data workloads.
* Troubleshooting: Provide expert-level troubleshooting for any data or performance issues related to Spark jobs and clusters.
* Data Processing Expertise: Work extensively with large-scale data pipelines using Spark's core components (Spark SQL, DataFrames, RDDs, Datasets, and Structured Streaming).
* Performance Tuning: Conduct deep-dive performance analysis, debugging, and optimization of Spark jobs to reduce processing time and resource consumption.
* Cluster Management: Collaborate with DevOps and infrastructure teams to manage Spark clusters on platforms like Hadoop/YARN, Kubernetes, or cloud platforms (AWS EMR, GCP Dataproc, etc.).
* Real-time Data: Design and implement real-time data processing solutions using Apache Spark Streaming or Structured Streaming.

This role requires flexibility to work rotational shifts, based on team coverage needs and customer demand. Candidates should be comfortable supporting operations in a 24x7 environment and willing to adjust working hours accordingly.

What Makes You The Right Fit For This Position:
* Expert in Apache Spark: In-depth knowledge of Spark architecture, execution models, and components (Spark Core, Spark SQL, Spark Streaming, etc.)
* Data Engineering Practices: Solid understanding of ETL pipelines, data partitioning, shuffling, and serialization techniques to optimize Spark jobs.
* Big Data Ecosystem: Knowledge of related big data technologies such as Hadoop, Hive, Kafka, HDFS, and YARN.
* Performance Tuning and Debugging: Demonstrated ability to tune Spark jobs, optimize query execution, and troubleshoot performance bottlenecks.
* Experience with Cloud Platforms: Hands-on experience running Spark clusters on cloud platforms such as AWS, Azure, or GCP.
* Containerization & Orchestration: Experience with containerized Spark environments using Docker and Kubernetes is a plus.

Good To Have:
* Certification in Apache Spark or related big data technologies.
* Experience working with Acceldata's data observability platform or similar tools for monitoring Spark jobs.
* Demonstrated experience with scripting languages like Bash, PowerShell, and Python.
* Familiarity with application, server, and network security management.
* Certifications from leading cloud providers (AWS, Azure, GCP) and expertise in Kubernetes would be significant advantages.
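The data partitioning and shuffling skills this role asks for come down to techniques such as key salting, which spreads a skewed "hot" key across several sub-keys so one partition does not absorb all the work. A minimal sketch in plain Python dicts and lists rather than a live Spark cluster; the function names and the `num_salts`/`num_partitions` values are illustrative:

```python
import random

def salt_keys(records, num_salts=4):
    """Spread each key across `num_salts` sub-keys to balance partitions."""
    return [((key, random.randrange(num_salts)), value) for key, value in records]

def partition_counts(salted, num_partitions=4):
    """Count how many records hash to each partition, as a Spark shuffle would."""
    counts = [0] * num_partitions
    for key, _ in salted:
        counts[hash(key) % num_partitions] += 1
    return counts

# A skewed dataset: one key dominates, so an unsalted shuffle would
# send nearly everything to a single partition.
records = [("hot", i) for i in range(100)] + [("cold", i) for i in range(5)]
salted = salt_keys(records)
```

After salting, a join against the small side is done by replicating the small table across the same salt values; in real Spark the same idea appears as adding a salt column before `groupBy` or `join`.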

Posted 1 week ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Before you apply to a job, select your language preference from the options available at the top right of this page.

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description

Job Summary: We are seeking a skilled and motivated System Programmer to join our IT Infrastructure team. This role is responsible for the installation, configuration, maintenance, and performance of critical enterprise systems, including Linux servers, Apache HTTP Server, and Oracle WebLogic. The ideal candidate will have strong scripting abilities and experience writing SQL queries to support operational and development teams.

Key Responsibilities:
* Install, configure, and maintain Linux operating systems, Apache HTTP Server, and Oracle WebLogic application servers in development, test, and production environments.
* Perform regular system patching and software updates to ensure platform security and stability.
* Develop and maintain automation scripts (e.g., Bash, Python, or similar) to streamline system management tasks.
* Write and optimize SQL queries to support reporting, troubleshooting, and system integration needs.
* Monitor system performance and implement tuning improvements to maximize availability and efficiency.
* Work closely with development, QA, and operations teams to support application deployments and troubleshoot system-related issues.
* Maintain accurate system documentation, including configurations, procedures, and troubleshooting guides.
* Participate in an on-call rotation and respond to incidents as required.

Required Qualifications:
* More than 2 years of experience.
* Proven experience with Linux system administration (RHEL, CentOS, or equivalent).
* Hands-on experience with Apache HTTP Server and Oracle WebLogic.
* Proficiency in scripting languages such as Bash, Python, or Perl.
* Strong understanding of SQL and relational databases (e.g., Oracle, MySQL).
* Familiarity with system monitoring tools and performance tuning.
* Knowledge of security best practices and patch management procedures.
* Excellent troubleshooting, analytical, and problem-solving skills.
* Strong communication skills and ability to work in a collaborative team environment.

Preferred Qualifications:
* Experience with CI/CD pipelines, Ansible, ArgoCD, or other automation tools.
* Exposure to cloud environments (e.g., AWS, Azure) or container technologies (e.g., Docker, Kubernetes).

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
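Automation scripts of the kind this role describes often start as small as summarizing an Apache access log. A minimal sketch in Python; the log lines are made-up samples in Common Log Format, and in practice you would read from the server's real log path:

```python
import re
from collections import Counter

# Matches the status-code field of a Common Log Format line, e.g.
# 127.0.0.1 - - [10/Oct/2025:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326
LOG_PATTERN = re.compile(r'"\S+ \S+ \S+" (\d{3}) ')

def status_counts(lines):
    """Tally HTTP status codes from Common Log Format access-log lines."""
    counts = Counter()
    for line in lines:
        match = LOG_PATTERN.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

sample = [
    '127.0.0.1 - - [10/Oct/2025:13:55:36 +0000] "GET / HTTP/1.1" 200 2326',
    '127.0.0.1 - - [10/Oct/2025:13:55:40 +0000] "GET /missing HTTP/1.1" 404 209',
    '127.0.0.1 - - [10/Oct/2025:13:55:41 +0000] "GET / HTTP/1.1" 200 2326',
]
```

A spike in 4xx or 5xx counts from a script like this is the kind of signal that feeds the monitoring and tuning duties listed above.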

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Persistent

We are an AI-led, platform-driven Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative global companies, 60% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our disruptor’s mindset, commitment to client success, and agility to thrive in the dynamic environment have enabled us to sustain our growth momentum by reporting $1,409.1M revenue in FY25, delivering 18.8% Y-o-Y growth. Our 23,900+ global team members, located in 19 countries, have been instrumental in helping the market leaders transform their industries. We are also pleased to share that Persistent won in four categories at the prestigious 2024 ISG Star of Excellence™ Awards, including the Overall Award based on the voice of the customer. We were included in the Dow Jones Sustainability World Index, setting high standards in sustainability and corporate responsibility. We were awarded for our state-of-the-art learning and development initiatives at the 16th TISS LeapVault CLO Awards. In addition, we were cited as the fastest-growing IT services brand in the 2024 Brand Finance India 100 Report. Throughout our market-leading growth, we’ve maintained a strong employee satisfaction score of 8.2/10. At Persistent, we embrace diversity to unlock everyone's potential. Our programs empower our workforce by harnessing varied backgrounds for creative, innovative problem-solving. Our inclusive environment fosters belonging, encouraging employees to unleash their full potential.
For more details, please visit www.persistent.com

About The Position

We are looking for a Big Data Lead who will be responsible for managing data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.

What You'll Do:
* Manage customer priorities across projects and requests
* Assess customer needs using a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost
* Design and implement software products (Big Data related), including data models and visualizations
* Participate actively in the teams you work in
* Deliver good solutions against tight timescales
* Be proactive, suggest new approaches, and develop your capabilities
* Share what you are good at while learning from others to improve the team overall
* Demonstrate a working understanding of a range of technical skills, attitudes, and behaviors
* Deliver great solutions
* Stay focused on driving value back into the business

Expertise You'll Bring:
* 6 years' experience in designing and developing enterprise application solutions for distributed systems
* Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume)
* Additional experience working with Hadoop, HDFS, and cluster management; Hive, Pig, and MapReduce; and Hadoop ecosystem frameworks such as HBase, Talend, and NoSQL databases
* Apache Spark or other streaming Big Data processing preferred; Java or Big Data technologies would be a plus

Benefits:
* Competitive salary and benefits package
* Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
* Opportunity to work with cutting-edge technologies
* Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
* Annual health check-ups
* Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds.

Inclusive Environment

We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.

Our company fosters a values-driven and people-centric work environment that enables our employees to:
* Accelerate growth, both professionally and personally
* Impact the world in powerful, positive ways, using the latest technologies
* Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
* Unlock global opportunities to work and learn with the industry's best

Let's unleash your full potential at Persistent - persistent.com/careers
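The Hadoop-ecosystem expertise this role lists (Pig, Hive, MapReduce) rests on the same map/shuffle/reduce model. A minimal plain-Python sketch of a MapReduce-style word count, with no cluster involved; the three phase functions are illustrative stand-ins for what the framework does:

```python
from collections import defaultdict

def map_phase(lines):
    """Emit (word, 1) pairs, as a Hadoop mapper would."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle_phase(pairs):
    """Group values by key, as the framework's shuffle/sort step does."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Sum each key's values, as a Hadoop reducer would."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big pipelines", "data pipelines at scale"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
```

Hive and Pig compile queries and scripts down to jobs of essentially this shape, which is why understanding the shuffle is central to tuning them.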

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

For more details, please visit www.persistent.com

About The Position

We are looking for an experienced Full Stack Lead Engineer to join our development team. In this role, you will be responsible for the overall development and implementation of front- and back-end software applications. Your responsibilities will extend from designing system architecture to high-level programming, performance testing, and systems integration. To ensure success as a full stack engineer, you should have advanced programming skills, experience with application development, and excellent troubleshooting skills. As a top-rated full stack engineer, you should be able to create and implement advanced software systems that meet the needs of the company.

What You'll Do:
* Work with development teams and product managers to ideate software solutions
* Design client-side and server-side architecture
* Build the front end of applications with appealing visual design
* Develop and manage well-functioning databases and applications
* Write effective APIs
* Test software to ensure responsiveness and efficiency
* Troubleshoot, debug, and upgrade software
* Create security and data protection settings
* Build features and applications with a mobile-responsive design
* Write technical documentation
* Work with data scientists and analysts to improve software

Expertise You'll Bring:
* Front-end technologies, including JavaScript, CSS3, and HTML5, and third-party libraries such as React JS, Angular, jQuery, and LESS
* Development languages: server-side programming languages including .Net, Java, Ruby, and Python
* Servers: experience with Nginx or Apache servers and a solid background in Linux
* Ability to visualize a proposed system and build it
* Experience working in Microsoft, Amazon, or Google cloud environments is an added advantage
* Database and cache: DBMS technology, including SQL Server, Oracle, MongoDB, and MySQL, and caching mechanisms such as Redis, Memcached, and Varnish
* Basic design ability, including UI/UX and basic prototype design
* Excellent writing and communication skills

Benefits:
* Competitive salary and benefits package
* Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
* Opportunity to work with cutting-edge technologies
* Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
* Annual health check-ups
* Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds.

Inclusive Environment

We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.

Our company fosters a values-driven and people-centric work environment that enables our employees to:
* Accelerate growth, both professionally and personally
* Impact the world in powerful, positive ways, using the latest technologies
* Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
* Unlock global opportunities to work and learn with the industry's best

Let's unleash your full potential at Persistent - persistent.com/careers
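The caching mechanisms this role lists (Redis, Memcached, Varnish) are usually applied through the same cache-aside pattern. A minimal in-process sketch in Python, with a TTL dict standing in for a real cache server; the class, function, and timing values are illustrative, not any library's API:

```python
import time

class TTLCache:
    """A tiny cache-aside store with per-entry expiry, mimicking a Redis SETEX/GET pair."""

    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expires_at, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazy eviction on read
            return None
        return value

    def set(self, key, value):
        self._store[key] = (time.monotonic() + self.ttl, value)

def get_user(cache, user_id, load_from_db):
    """Cache-aside read: try the cache first, fall back to the database loader."""
    user = cache.get(user_id)
    if user is None:
        user = load_from_db(user_id)
        cache.set(user_id, user)
    return user
```

The design choice worth noting is that expiry happens lazily on read, which is also how Memcached behaves; a production cache additionally bounds memory with an eviction policy such as LRU.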

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Python Team Lead – Chennai Experience: 5–10 yrs | Type: Full time, On site We’re hiring a Python Team Lead to architect and deliver legacy and upcoming Python projects (Vesuvius, JKTIL, YOHT, Rane, etc.). What you’ll do: * Lead development using Python/Django, frontend (ReactJS/HTML/CSS/JS), MongoDB, server setup (IIS/Apache), and ELK stack (ElasticSearch, Logstash, Kibana) * Mentor developers, conduct code reviews, and implement best practices * Interface with customers, gather requirements, maintain docs, and provide site support You have: * B.Tech/M.Sc. in CS/IT or equivalent * 5–10 yrs Python & Django experience, including leadership * Proficiency in ReactJS, MongoDB, ELK, and server config * Manufacturing domain expertise preferred 📧 To apply: Send your CV and brief cover letter to recruiter.chennai@cii-mcc.in or WhatsApp on 78450 28926
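The ELK stack this role mentions (ElasticSearch, Logstash, Kibana) ingests application logs most cleanly as one JSON object per line. A minimal Python sketch of a Logstash-friendly log formatter; the field names are an illustrative choice, not a fixed Logstash schema:

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON line for Logstash/Elasticsearch ingestion."""

    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

# Attach the formatter so every log line is machine-parseable.
logger = logging.getLogger("orders")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)
```

With this in place, a Logstash `json` codec (or Filebeat with JSON decoding) can index each line directly into Elasticsearch without grok patterns.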

Posted 1 week ago

Apply

0 years

10 Lacs

Thiruvananthapuram

On-site

Overview:
Would you like to help enrich the lives of learners around the world? RM India (RM Education Solutions India Private Limited) is the India Delivery Center for UK-based RM Plc. A leading supplier of technology and resources to the education sector, RM India helps deliver great education products and services that help teachers to teach and learners to learn. Our mission is to achieve growth by improving the life chances of people. At RM India, we are driven by the potential of our business to touch lives and shape the future. RM Plc has been a pioneer of education technology since 1973. We provide technology and resources to the education sector, supporting over 10 million students around the world. We work with 28,000 schools, nurseries, and education trusts in 115 countries to deliver customer-centric solutions that improve education outcomes worldwide. RM is a leading supplier of technology and resources to the education sector, supporting schools, teachers, and pupils across the globe. What we do helps learners at all stages of their lives: from preschool to higher education and professional qualification, we partner with schools, examination boards, central governments, and other professional institutions to enrich the lives of learners. RM Group operates through three businesses: Technology (Managed Services, Software and Infrastructure for Schools), Assessment (Software and Services), and TTS (Education Resources). Visit us here to find out more: www.rmindia.co.in

Responsibilities:

Application Support (Must have)
* Extensive knowledge of troubleshooting web applications hosted in IIS or Apache.
* Able to replicate issues raised by customers with the available information.
* Deep dive into issues to find the root cause (RCA) within given SLAs.
* Troubleshoot both functional and performance issues in applications.
* Proactively analyze event logs and prevent potential issues from happening.

Database - MS SQL / PostgreSQL (Must have)
* Expert knowledge of writing complex SQL queries in MS SQL Server or PostgreSQL.
* Able to troubleshoot complex stored procedures, functions, etc.
* Troubleshoot performance issues in the DB server.
* Create custom SQL queries to work around issues, bulk-update data, purge data, etc.

Monitoring – Azure Monitor, CloudWatch, Grafana, Opsgenie (Must have)
* Acknowledge and resolve alerts triggered from various monitoring solutions.
* Knowledge of creating or optimizing alerts is good to have.
* Analyze logs from Azure Application Insights or tools like Sumo Logic.

Ticketing tools – ServiceNow / Jira (Must have)
* Experience in ticket management: create, update, and triage tickets; maintain ticket SLAs.

Cloud – Azure / AWS (Desired)
* Hands-on experience maintaining and troubleshooting Azure/AWS services.
* Basic Windows/Linux VM administration, such as upscale/downscale, start/stop, SSH, troubleshooting logs, and checking disk space.
* Basic administration of Azure SQL or Postgres RDS clusters, performance monitoring, and troubleshooting.
* Maintaining secrets; storage account/S3 management activities; basics of IAM administration.
* Troubleshoot issues for applications hosted in AKS/ECS clusters; Service Bus queue troubleshooting.

Deployment – Azure DevOps / GitLab (Good to have)
* Deploying applications using existing deployment pipelines; troubleshooting deployment failures.

Scripting – PowerShell / Shell (Good to have)
* Knowledge of writing scripts to automate tasks and set up workarounds.

Experience: 2+ yrs
Mandatory skillset: Application Support, Azure cloud, SQL/PostgreSQL, infra maintenance, Azure/AWS, L3 support
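The custom workaround queries described above (bulk updates, purges) share a common shape. A minimal sketch using Python's built-in sqlite3 in place of MS SQL Server or PostgreSQL; the table, columns, and thresholds are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tickets (id INTEGER PRIMARY KEY, status TEXT, age_days INTEGER)")
conn.executemany(
    "INSERT INTO tickets (status, age_days) VALUES (?, ?)",
    [("open", 2), ("open", 45), ("closed", 90), ("closed", 400)],
)

# Bulk update: auto-escalate open tickets older than 30 days.
conn.execute("UPDATE tickets SET status = 'escalated' WHERE status = 'open' AND age_days > 30")

# Purge: remove closed tickets older than a year.
conn.execute("DELETE FROM tickets WHERE status = 'closed' AND age_days > 365")
conn.commit()

# Verify the effect before signing off on the workaround.
rows = conn.execute(
    "SELECT status, COUNT(*) FROM tickets GROUP BY status ORDER BY status"
).fetchall()
```

In production the same pattern runs inside a transaction with a preceding `SELECT` to confirm the affected row count, since a wrongly scoped `UPDATE` or `DELETE` is hard to undo.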

Posted 1 week ago

Apply

0 years

3 - 12 Lacs

Cochin

Remote

We are looking for a PHP Developer with a strong backend foundation and familiarity with UI and DevOps workflows. In this hybrid role, you'll build scalable web applications and contribute across the full development lifecycle, from requirements to deployment.

You will be:
* Developing and maintaining PHP 8 applications using clean, object-oriented code
* Designing and implementing business logic, APIs, and database interactions
* Contributing to sprint planning, estimations, and code reviews
* Collaborating with UI/UX and DevOps teams to ensure smooth delivery
* Owning the end-to-end development of custom web projects

You have in-depth experience in:
* PHP frameworks like Laravel, Symfony, or CodeIgniter
* RDBMS systems such as MySQL and PostgreSQL
* HTML, CSS, and JavaScript for basic frontend collaboration
* Version control using Git and containerization via Docker

You add value with exposure to:
* Cloud platforms: AWS, Azure, Google Cloud
* CI/CD tools: Bitbucket Pipelines, AWS CodePipeline, Jenkins
* Testing tools: PHPUnit, PEST
* Search technologies: ElasticSearch, Algolia, Apache Solr
* Frontend frameworks: Angular, React, Vue
* Basic scripting (Bash or Python) for task automation

Why choose LiteBreeze:
* Complex customized team projects and the opportunity to lead them!
* Work on projects from North European clients
* Excellent, clear career growth opportunities
* Opportunity to implement new ideas and technologies
* Free technical certifications like AWS
* Opportunity to learn other backend technologies like Go and Node.js
* Great Place to Work certified – three years in a row

Join us to work on cutting-edge, customized web projects for North European clients with clear growth paths and opportunities to expand your technical skills.

Job Type: Full-time
Pay: ₹25,000.00 - ₹100,000.00 per month
Benefits: Flexible schedule, health insurance, work from home
Work Location: In person
Application Deadline: 10/08/2025

Posted 1 week ago

Apply

4.0 years

3 - 8 Lacs

Hyderābād

On-site

Develop and implement software testing strategies, plans, and procedures:
* Define and execute test strategies, including manual test cases, automated scripts, and scenarios for web and API testing
* Identify, track, and ensure timely resolution of defects, including root cause analysis and process improvement
* Collaborate across development and product teams to align on quality goals, timelines, and delivery expectations
* Participate in Agile ceremonies and sprint planning to advocate for quality throughout the development lifecycle
* Continuously enhance QA processes, tools, and standards to improve efficiency and product quality
* Identify edge cases, risks, and requirement gaps early in the planning process to strengthen story quality and test coverage
* Develop and maintain automated test suites to ensure consistent and reliable software quality

Qualifications:
* Bachelor's degree in computer science or a related field, or equivalent experience
* 4+ years of proven experience in the software development industry, working in collaborative team environments
* 4+ years of experience using automation tools such as Selenium WebDriver with programming languages like Python, C#, or Java
* 3+ years of hands-on experience testing and automating web services, including RESTful APIs
* 2+ years of experience in performance testing using tools such as Apache JMeter
* Strong written and verbal communication skills

Good to Have:
* Experience with CI/CD technologies such as Bamboo, Bitbucket, Octopus Deploy, and Maven
* Working knowledge of API testing tools such as RestAssured
* Knowledge of software engineering best practices across the full software development lifecycle, including coding standards, code reviews, source control, build processes, testing, and operations
* Experience or familiarity with AWS cloud services
* Familiarity with Atlassian tools, including Jira and Confluence
* Working knowledge of Agile methodologies, particularly Scrum
* Experience operating in a Continuous Integration (CI) environment

For over 50
years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed. For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, fourth consecutive year in the UK, Spain, and India, and second consecutive year in Poland. We value learning, caring and results and make inclusivity and diversity a top priority. In addition to our Great Place to Work® Certification, we’ve been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World’s Best Employer and Best Employer for Women, testaments to the value we place on workplace culture. We’re 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations. 
Verisk Businesses
* Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision
* Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences
* Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient
* Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events
* Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance
* Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement
* Life Insurance Solutions — offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities, for both individual and group
* Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, business, and societies become stronger

Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age or disability. Verisk’s minimum hiring age is 18 except in countries with a higher age limit subject to applicable law.
https://www.verisk.com/company/careers/ Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume. Verisk Employee Privacy Notice
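The RESTful API testing this role calls for usually comes down to asserting on structured JSON responses. A minimal Python sketch with a canned payload standing in for a live endpoint; the fields (`policy_id`, `status`, `premium`) are illustrative, not any real Verisk API:

```python
import json

def validate_policy_response(body):
    """Check a JSON API response for required fields and sane values; return a list of errors."""
    payload = json.loads(body)
    errors = []
    if payload.get("status") != "active":
        errors.append("status must be 'active'")
    if not isinstance(payload.get("premium"), (int, float)) or payload["premium"] <= 0:
        errors.append("premium must be a positive number")
    if "policy_id" not in payload:
        errors.append("policy_id is missing")
    return errors

# Canned responses, as a mocked HTTP client might return them.
ok_body = json.dumps({"policy_id": "P-100", "status": "active", "premium": 1250.0})
bad_body = json.dumps({"status": "lapsed", "premium": -5})
```

In a real suite the same checks would sit inside pytest or unittest cases, with the body fetched from the service under test rather than hard-coded.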

Posted 1 week ago

Apply

12.0 - 17.0 years

4 - 8 Lacs

Hyderābād

On-site

Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Senior Manager - Clinical Data Hub Team What you will do Let’s do this. Let’s change the world. In this vital role you will lead an Agile product squad, responsible for defining the vision, strategy, and implementation for a range of Clinical Data products supporting Amgen Clinical Trial Design & Analytics. You will collaborate closely with statistician, data scientist, data engineer, and AI/ML engineering teams to understand business needs, identify system enhancements, and drive system implementation projects. Your extensive experience in business analysis, system design, and project management will enable you to deliver innovative and effective technology products.
Roles & Responsibilities: Define and communicate the product feature vision, including both technical/architectural features and enablement and end-user features, ensuring alignment with business objectives across multiple solution collaborator groups. Create, prioritize, and maintain the feature backlog, ensuring that it reflects the needs of the business and collaborators. Collaborate with collaborators to gather and document product requirements, user stories, and acceptance criteria. Work closely with the business teams, Scrum Master, and development team to plan and implement sprints, ensuring that the highest-priority features are delivered. Oversee the day-to-day management of technology platforms, ensuring that they meet performance, security, and availability requirements. Ensure that platforms comply with security standards, regulatory requirements, and organizational policies. Ensure that the AIN team successfully creates robust written materials, including product documentation, the product backlog, and user stories, along with other needed artifacts to support efficient and effective coordination across time zones. Oversee the resolution of service-related incidents and problems, ensuring minimal impact on business operations. Maintain in-depth knowledge of clinical development business domains, with an emphasis on data assets and data pipelines, as well as an understanding of multi-functional dependencies. Analyze customer feedback and support data to identify pain points and opportunities for product improvement. What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree and 12 to 17 years of experience in Computer Science, IT, or a related field. A solid foundation in modern software design and engineering practices and business analysis. 
Proven experience in understanding and gathering business requirements, delivering insights, and achieving concrete business outcomes. Technical Proficiency: Good understanding of the following technologies: Python, R, AI/ML frameworks, relational databases/data modeling, AWS services (EC2, S3, Lambda, ECS, IAM), Docker and CI/CD/GitLab, Apache/Databricks. Expert understanding and experience of the clinical development process within Life Sciences (global clinical trial data sources, SDTM & ADaM, end-to-end clinical data design and analysis pipeline, clinical data security and governance). Experience in Agile product development as a participating member of a scrum team and its related ceremonies and processes. Ability to collaborate with data scientists and data engineers to deliver functional business requirements as well as to define the product roadmap. High learning agility: demonstrated ability to quickly grasp ever-changing technology and clinical development domain knowledge and apply it to project work. Strong communication skills in writing, speaking, and presenting, and strong time-management skills. Preferred Qualifications: Training or an educational degree in Computer Science, Biology, or Chemistry. 
Experience with Clinical Data and CDISC (SDTM and ADaM) standards. Soft Skills: Excellent analytical and troubleshooting skills. Deep intellectual curiosity, particularly about data patterns, and about learning business processes and the “life of the user”. Highest degree of initiative and self-motivation. Strong verbal and written communication skills, including presenting complex technical/business topics to varied audiences. Confidence in leading teams through prioritization and sequencing discussions, including managing collaborator expectations. Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. 
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

6.0 years

3 - 8 Lacs

Hyderābād

On-site

Provide leadership, mentorship, and guidance to business analysts and QA team members on manual and automated testing. Collaborate with product owners and business analysts to ensure user stories are well-defined, testable, and include measurable acceptance criteria. Identify edge cases, risks, and requirement gaps early in the planning process to strengthen story quality and test coverage. Participate in Agile ceremonies and sprint planning to advocate for quality throughout the development lifecycle. Define and execute test strategies, including manual test cases, automated scripts, and scenarios for web and API testing. Develop and maintain automated test suites and ensure effective integration into the CI/CD pipeline. Identify, track, and ensure timely resolution of defects, including root cause analysis and process improvement. Continuously enhance QA processes, tools, and standards to improve efficiency and product quality. Collaborate across QA, development, and product teams to align on quality goals, timelines, and delivery expectations. Support User Acceptance Testing (UAT) and incorporate customer feedback to ensure a high-quality release. Ensure the final product meets user expectations for functionality, performance, and usability. Bachelor’s degree in computer science or a related field, or equivalent practical experience. 6+ years of proven experience in the software development industry, working in collaborative team environments. 6+ years of experience using automation tools such as Selenium WebDriver with programming languages like Python, C#, or Java. 5+ years of hands-on experience testing and automating web services, including RESTful APIs. 3+ years of experience in performance testing using tools such as Apache JMeter. 2+ years of experience in mobile web application testing automation using Appium. Strong experience with object-oriented programming languages such as Java and C#/.NET. Good to Have - Experience working with CI/CD technologies such as Bamboo, 
Bitbucket, Octopus Deploy, and Maven Working knowledge of API testing tools such as JavaScript, RestAssured Solid understanding of software engineering best practices across the full software development lifecycle, including coding standards, code reviews, source control, build processes, testing, and operations Experience or familiarity with AWS cloud services Demonstrate strong written and verbal communication skills Proven ability to learn new technologies and adapt in a dynamic environment Familiarity with Atlassian tools, including Jira and Confluence Working knowledge of Agile methodologies, particularly Scrum Experience operating in a Continuous Integration (CI) environment For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed. For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, fourth consecutive year in the UK, Spain, and India, and second consecutive year in Poland. We value learning, caring and results and make inclusivity and diversity a top priority. In addition to our Great Place to Work® Certification, we’ve been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World’s Best Employer and Best Employer for Women, testaments to the value we place on workplace culture. We’re 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations. 
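The automated API-testing experience this role calls for (RESTful services, measurable acceptance criteria, CI integration) can be illustrated with a minimal contract-style test. This is a hedged sketch, not this team's actual suite: the `check_order_payload` helper and the order schema are hypothetical, and a real suite would fetch the response over HTTP with a client such as RestAssured or Python `requests` rather than validating a local string.

```python
import json
import unittest

def check_order_payload(raw_body):
    """Validate an order API response body against acceptance criteria:
    required fields present and correctly typed. Returns the parsed payload,
    or raises AssertionError with a descriptive message. (Hypothetical
    schema for illustration only.)"""
    payload = json.loads(raw_body)
    for field in ("id", "state", "items"):
        assert field in payload, f"missing required field: {field}"
    assert isinstance(payload["id"], int), "id must be an integer"
    assert isinstance(payload["items"], list), "items must be a list"
    return payload

class OrderContractTest(unittest.TestCase):
    """Shape of an automated contract test as it would run in a CI pipeline."""

    def test_valid_payload_passes(self):
        body = json.dumps({"id": 42, "state": "SHIPPED", "items": [1, 2]})
        self.assertEqual(check_order_payload(body)["id"], 42)

    def test_missing_field_fails(self):
        with self.assertRaises(AssertionError):
            check_order_payload(json.dumps({"id": 42}))
```

Keeping the validation logic in a plain function, separate from the test runner, lets the same checks be reused from both unit suites and integration jobs in the CI/CD pipeline.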
Verisk Businesses Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events. Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement Life Insurance Solutions – offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities for both individual and group. Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, business, and societies become stronger Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age or disability. Verisk’s minimum hiring age is 18 except in countries with a higher age limit subject to applicable law. 

Posted 1 week ago

Apply

4.0 years

3 - 6 Lacs

Hyderābād

On-site

CDP ETL & Database Engineer The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, have an analytical mindset, and bring a background in relational modeling in a hybrid architecture. The candidate will help drive the business towards specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams. The candidate will join a team of developers across the US, India & Costa Rica. Responsibilities: ETL Development – The CDP ETL & Database Engineer will be responsible for building pipelines to feed downstream data processes. They will be able to analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML. Implementations & Onboarding – Will work with the team to onboard new clients onto the ZMP/CDP+ platform. The candidate will solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach towards development and will be able to document processes and workflows. Incremental Change Requests – The CDP ETL & Database Engineer will be responsible for analyzing change requests and determining the best approach towards their implementation and execution. This requires the engineer to have a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes. 
Change Data Management – The candidate will adhere to change data management procedures and actively participate in CAB meetings where change requests will be presented and reviewed. Prior to introducing change, the engineer will ensure that processes are running in a development environment. The engineer will be asked to do peer-to-peer code reviews and solution reviews before production code deployment. Collaboration & Process Improvement – The engineer will be asked to participate in knowledge-share sessions where they will engage with peers to discuss solutions, best practices, and overall approach. The candidate will be able to look for opportunities to streamline processes with an eye towards building a repeatable model to reduce implementation duration. Job Requirements: The CDP ETL & Database Engineer will be well versed in the following areas: Relational data modeling. ETL and FTP concepts. Advanced analytics using SQL functions. Cloud technologies - AWS, Snowflake. Able to decipher requirements, provide recommendations, and implement solutions. The ability to work independently, but at the same time, the individual will be called upon to contribute in a team setting. The engineer will be able to confidently communicate status, raise exceptions, and voice concerns to their direct manager. Participate in internal client project status meetings with the Solution/Delivery management teams. When required, collaborate with the Business Solutions Analyst (BSA) to solidify requirements. Ability to work in a fast-paced, agile environment; the individual will be able to work with a sense of urgency when escalated issues arise. Strong communication and interpersonal skills, ability to multitask and prioritize workload based on client demand. Familiarity with Jira for workflow management and time allocation. Familiarity with the Scrum framework: backlog, planning, sprints, story points, and retrospectives. 
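As the ETL Development duties above note, feeds often arrive as nested JSON that must be normalized before relational loading. Below is a minimal sketch of that normalization step, assuming a dot-delimited flattening convention; the `flatten` helper and the sample record are hypothetical, and a production pipeline would run this logic inside a tool such as Talend or an orchestrated Airflow task rather than a standalone script.

```python
import json

def flatten(record, parent_key="", sep="."):
    """Flatten a nested JSON record into dot-delimited key/value pairs,
    a common normalization step before loading into a relational store
    such as Snowflake. (Helper name and convention are illustrative.)"""
    items = {}
    for key, value in record.items():
        full_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Recurse into nested objects, extending the key path.
            items.update(flatten(value, full_key, sep))
        else:
            items[full_key] = value
    return items

raw = '{"user": {"id": 7, "email": "a@b.c"}, "source": "crm"}'
flat = flatten(json.loads(raw))
# flat == {"user.id": 7, "user.email": "a@b.c", "source": "crm"}
```

Each flattened key then maps cleanly onto a column name, which is what makes downstream aggregation and syndication across platforms tractable.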
Required Skills: ETL – ETL tools such as Talend (preferred, not required); DMExpress – nice to have; Informatica – nice to have. Database – Hands-on experience with the following database technologies: Snowflake (required); MySQL/PostgreSQL – nice to have; familiarity with NoSQL DB methodologies (nice to have). Programming Languages – Can demonstrate knowledge of any of the following: PL/SQL; JavaScript – strong plus; Python – strong plus; Scala – nice to have. AWS – Knowledge of the following AWS services: S3, EMR (concepts), EC2 (concepts), Systems Manager / Parameter Store. Understands JSON data structures and key-value pairs. Working knowledge of code repositories such as Git, WinCVS; workflow management tools such as Apache Airflow, Kafka, Automic/Appworx; Jira. Minimum Qualifications: Bachelor's degree or equivalent. 4+ years' experience. Excellent verbal & written communication skills. Self-starter, highly motivated. Analytical mindset. Company Summary: Zeta Global is a NYSE-listed data-powered marketing technology company with a heritage of innovation and industry leadership. Founded in 2007 by entrepreneur David A. Steinberg and John Sculley, former CEO of Apple Inc and Pepsi-Cola, the Company combines the industry's 3rd largest proprietary data set (2.4B+ identities) with Artificial Intelligence to unlock consumer intent, personalize experiences and help our clients drive business growth. Our technology runs on the Zeta Marketing Platform, which powers 'end to end' marketing programs for some of the world's leading brands. With expertise encompassing all digital marketing channels – Email, Display, Social, Search and Mobile – Zeta orchestrates acquisition and engagement programs that deliver results that are scalable, repeatable and sustainable. 
Zeta Global is an Equal Opportunity/Affirmative Action employer and does not discriminate on the basis of race, gender, ancestry, color, religion, sex, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veterans status, or any other basis protected by law. Zeta Global Recognized in Enterprise Marketing Software and Cross-Channel Campaign Management Reports by Independent Research Firm https://www.forbes.com/sites/shelleykohan/2024/06/1G/amazon-partners-with-zeta-global-to-deliver- gen-ai-marketing-automation/ https://www.cnbc.com/video/2024/05/06/zeta-global-ceo-david-steinberg-talks-ai-in-focus-at-milken- conference.html https://www.businesswire.com/news/home/20240G04622808/en/Zeta-Increases-3Q%E2%80%GG24- Guidance https://www.prnewswire.com/news-releases/zeta-global-opens-ai-data-labs-in-san-francisco-and-nyc- 300S45353.html https://www.prnewswire.com/news-releases/zeta-global-recognized-in-enterprise-marketing-software-and- cross-channel-campaign-management-reports-by-independent-research-firm-300S38241.html

Posted 1 week ago

Apply

12.0 - 17.0 years

7 - 8 Lacs

Hyderābād

On-site

Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Principal Data Engineer What you will do Let’s do this. Let’s change the world. Role Description: We are seeking a seasoned Principal Data Engineer to lead the design, development, and implementation of our data strategy. The ideal candidate possesses a deep understanding of data engineering principles, coupled with strong leadership and problem-solving skills. As a Principal Data Engineer, you will architect and oversee the development of robust data platforms, while mentoring and guiding a team of data engineers. Roles & Responsibilities: Possess strong rapid prototyping skills and quickly translate concepts into working code. Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies. Design, develop, and implement robust data architectures and platforms to support business objectives. Oversee the development and optimization of data pipelines and data integration solutions. 
Establish and maintain data governance policies and standards to ensure data quality, security, and compliance. Architect and manage cloud-based data solutions using AWS or other preferred platforms. Lead and motivate a strong data engineering team to deliver exceptional results. Identify, analyze, and resolve complex data-related challenges. Collaborate closely with business collaborators to understand data requirements and translate them into technical solutions. Stay abreast of emerging data technologies and explore opportunities for innovation. What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree and 12 to 17 years of experience in Computer Science, IT, or a related field. Demonstrated proficiency in leveraging cloud platforms (AWS, Azure, GCP) for data engineering solutions. Strong understanding of cloud architecture principles and cost optimization strategies. Proficient in Python, PySpark, and SQL. Hands-on experience with big data ETL performance tuning. Proven ability to lead and develop strong data engineering teams. Strong problem-solving, analytical, and critical thinking skills to address complex data challenges. Preferred Qualifications: Experienced with data modeling and performance tuning for both OLAP and OLTP databases. Experienced with Apache Spark and Apache Airflow. Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps. Experienced with AWS, GCP, or Azure cloud services. Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. 
Team-oriented, with a focus on achieving team goals Strong presentation and public speaking skills. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site

Project Role : Application Support Engineer Project Role Description : Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems. Must have skills : Apache Kafka Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and resolves issues within various components of critical business systems. Your typical day will involve collaborating with team members to troubleshoot problems, analyzing system performance, and ensuring the smooth operation of applications that are vital to business functions. You will engage with stakeholders to understand their needs and provide timely solutions, contributing to the overall efficiency and reliability of the systems in place. Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation and contribution in team discussions is required. - Contribute to providing solutions for work-related problems. - Assist in the development and implementation of best practices for application support. - Monitor system performance and proactively identify areas for improvement. Professional & Technical Skills: - Must-Have Skills: Proficiency in Apache Kafka. - Strong understanding of distributed systems and messaging queues. - Experience with troubleshooting and resolving application issues. - Familiarity with monitoring tools and performance tuning. - Ability to work collaboratively in a team environment. Additional Information: - The candidate should have a minimum of 3 years of experience in Apache Kafka. - This position is based at our Hyderabad office. - A 15 years full time education is required.
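The monitoring side of Kafka application support often centers on consumer lag: the gap between a partition's log-end offset and the offset the consumer group has committed. The arithmetic can be sketched with plain dicts standing in for real broker metadata; in practice the offsets would come from `kafka-consumer-groups.sh` or the AdminClient API, and the partition numbers below are made up for illustration.

```python
def consumer_lag(end_offsets, committed_offsets):
    """Per-partition consumer lag: log-end offset minus committed offset.
    Offsets are plain {partition: offset} dicts here; a real monitor would
    fetch them from the broker. A partition with no committed offset is
    treated as fully lagging (committed defaults to 0)."""
    return {
        p: end_offsets[p] - committed_offsets.get(p, 0)
        for p in end_offsets
    }

end = {0: 1500, 1: 980, 2: 2040}        # latest offsets on the broker
committed = {0: 1500, 1: 960, 2: 1900}  # offsets the group has committed
lag = consumer_lag(end, committed)
total_lag = sum(lag.values())
# lag == {0: 0, 1: 20, 2: 140}; total_lag == 160
```

A flat total hides skew, so alerting on the per-partition maximum as well as the sum helps catch a single stuck consumer before it impacts downstream systems.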

Posted 1 week ago

Apply

8.0 years

30 - 38 Lacs

Gurgaon

Remote

Role: AWS Data Engineer Location: Gurugram Mode: Hybrid Type: Permanent Job Description: We are seeking a talented and motivated Data Engineer with requisite years of hands-on experience to join our growing data team. The ideal candidate will have experience working with large datasets, building data pipelines, and utilizing AWS public cloud services to support the design, development, and maintenance of scalable data architectures. This is an excellent opportunity for individuals who are passionate about data engineering and cloud technologies and want to make an impact in a dynamic and innovative environment. Key Responsibilities: Data Pipeline Development: Design, develop, and optimize end-to-end data pipelines for extracting, transforming, and loading (ETL) large volumes of data from diverse sources into data warehouses or lakes. Cloud Infrastructure Management: Implement and manage data processing and storage solutions in AWS (Amazon Web Services) using services like S3, Redshift, Lambda, Glue, Kinesis, and others. Data Modeling: Collaborate with data scientists, analysts, and business stakeholders to define data requirements and design optimal data models for reporting and analysis. Performance Tuning & Optimization: Identify bottlenecks and optimize query performance, pipeline processes, and cloud resources to ensure cost-effective and scalable data workflows. Automation & Scripting: Develop automated data workflows and scripts to improve operational efficiency using Python, SQL, or other scripting languages. Collaboration & Documentation: Work closely with data analysts, data scientists, and other engineering teams to ensure data availability, integrity, and quality. Document processes, architectures, and solutions clearly. Data Quality & Governance: Ensure the accuracy, consistency, and completeness of data. Implement and maintain data governance policies to ensure compliance and security standards are met. 
Troubleshooting & Support: Provide ongoing support for data pipelines and troubleshoot issues related to data integration, performance, and system reliability. Qualifications: Essential Skills: Experience: 8+ years of professional experience as a Data Engineer, with a strong background in building and optimizing data pipelines and working with large-scale datasets. AWS Experience: Hands-on experience with AWS cloud services, particularly S3, Lambda, Glue, Redshift, RDS, and EC2. ETL Processes: Strong understanding of ETL concepts, tools, and frameworks. Experience with data integration, cleansing, and transformation. Programming Languages: Proficiency in Python, SQL, and other scripting languages (e.g., Bash, Scala, Java). Data Warehousing: Experience with relational and non-relational databases, including data warehousing solutions like AWS Redshift, Snowflake, or similar platforms. Data Modeling: Experience in designing data models, schema design, and data architecture for analytical systems. Version Control & CI/CD: Familiarity with version control tools (e.g., Git) and CI/CD pipelines. Problem-Solving: Strong troubleshooting skills, with an ability to optimize performance and resolve technical issues across the data pipeline. Desirable Skills: Big Data Technologies: Experience with Hadoop, Spark, or other big data technologies. Containerization & Orchestration: Knowledge of Docker, Kubernetes, or similar containerization/orchestration technologies. Data Security: Experience implementing security best practices in the cloud and managing data privacy requirements. Data Streaming: Familiarity with data streaming technologies such as AWS Kinesis or Apache Kafka. Business Intelligence Tools: Experience with BI tools (Tableau, Quicksight) for visualization and reporting. Agile Methodology: Familiarity with Agile development practices and tools (Jira, Trello, etc.) 
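The extract-transform-load responsibilities described above can be sketched as a three-stage pipeline. This is a hedged illustration with hypothetical names and data: `extract` stands in for reading objects from S3, `load` for a warehouse insert or Redshift COPY, and only the in-memory transform logic is real here.

```python
import csv
import io

def extract(raw_csv):
    """Extract: parse source rows (in production, objects read from S3)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: cleanse and standardize — drop rows with missing amounts,
    cast types, and normalize name casing."""
    out = []
    for row in rows:
        if not row["amount"]:
            continue  # data-quality rule: skip incomplete records
        out.append({"customer": row["customer"].strip().upper(),
                    "amount": float(row["amount"])})
    return out

def load(rows, sink):
    """Load: append to the target store (stands in for a warehouse insert);
    returns the row count for pipeline monitoring."""
    sink.extend(rows)
    return len(rows)

source = "customer,amount\n alice ,10.5\nbob,\ncara,3\n"
warehouse = []
loaded = load(transform(extract(source)), warehouse)
# loaded == 2; bob's row is dropped by the data-quality rule
```

Keeping each stage a pure function with an explicit row count makes the data-quality and troubleshooting duties above concrete: rejected-row counts can be logged per run and alerted on when they drift.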
Job Type: Permanent Pay: ₹3,000,000.00 - ₹3,800,000.00 per year Benefits: Work from home Schedule: Day shift Monday to Friday Experience: Data Engineering: 6 years (Required) AWS Elastic MapReduce (EMR): 3 years (Required) AWS: 4 years (Required) Work Location: In person

Posted 1 week ago

Apply

3.0 years

6 - 8 Lacs

Gurgaon

Remote

Job description About this role BlackRock Company Overview: BlackRock is a global leader in investment management, risk management, and advisory services for institutional and retail clients. We help clients achieve their goals and overcome challenges with a range of products, including separate accounts, mutual funds, iShares® (exchange-traded funds), and other pooled investment vehicles. We also offer risk management, advisory, and enterprise investment system services to a broad base of institutional investors through BlackRock Solutions®. Headquartered in New York City, as of February 5, 2025, we handle approximately $11.5 trillion in assets under management (AUM) and have around 19,000 employees in offices across 38 countries, with a significant presence in key global markets, including North and South America, Europe, Asia, Australia, the Middle East, and Africa. Aladdin Data: When BlackRock was founded in 1988, the goal was to combine financial services with innovative technology. Today, BlackRock is a leading FinTech platform for investment management and technology services globally. Data is central to the Aladdin platform, differentiating us through our ability to consume, store, analyze, and gain insights from it. The Aladdin Data team maintains a pioneering data platform that delivers high-quality data to users, including investors, operations staff, data scientists, and engineers. Our aim is to provide consistent, high-quality data while evolving our platform to support the firm's growth. We build high-performance data pipelines, enable data discovery and consumption, and continually enhance our data storage capabilities. Studio Self-service Front-end Engineering: Our team develops full-stack web applications for vendor data self-service, client data configuration, pipelines, and workflows. We support over a thousand internal users and hundreds of clients. 
We manage the data toolkit, including client-facing data requests, modeling, configuration management, ETL tools, CRUD applications, customized workflows, and back-end APIs to deliver exceptional client and user experiences with intuitive tools and excellent UX. Job Description and Responsibilities: Design, build, and maintain various front-end and corresponding back-end platform components, working with Product and Program Managers. Implement new user interfaces and business functionalities to meet evolving business and customer requirements, working with end users, with clear and concise documentation. Analyze and improve the performance of applications and related operational workflows to improve efficiency and throughput. Diagnose, research, and resolve software defects. Ensure software stability through documentation, code reviews, regression, unit, and user acceptance testing for smooth production operations. Lead all aspects of level 2 & 3 application support, ensuring smooth operation of existing processes and meeting new business opportunities. Be a self-starter and work with minimal direction in a globally distributed team. Role Essentials: A passion for engineering highly available, performant full-stack applications with a "Student of Markets and Technology" attitude. Bachelor's or master's degree or equivalent experience in computer science or engineering. 3+ years of professional experience working in teams. Experience in full-stack user-facing application development using web technologies (Angular, React, JavaScript) and Java-based REST API (Spring framework). Experience in testing frameworks such as Protractor, TestCafe, Jest. Knowledge in relational database development and at least one NoSQL Database (e.g., Apache Cassandra, MongoDB, etc.). Knowledge of software development methodologies (analysis, design, development, testing) and a basic understanding of Agile/Scrum methodology and practices. 
Our benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer.
We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law. Job Requisition # R253535

Posted 1 week ago

Apply

5.0 years

19 - 20 Lacs

Chennai, Tamil Nadu, India

On-site

Position Title: Senior Software Engineer 34332 Location: Chennai (Onsite) Job Type: Contract Budget: ₹20 LPA Notice Period: Immediate Joiners Only

Role Overview We are looking for a highly skilled Senior Software Engineer to be a part of a centralized observability and monitoring platform team. The role focuses on building and maintaining a scalable, reliable observability solution that enables faster incident response and data-driven decision-making through latency, traffic, error, and saturation monitoring. This opportunity requires a strong background in cloud-native architecture, observability tooling, backend and frontend development, and data pipeline engineering.

Key Responsibilities Design, build, and maintain observability and monitoring platforms to reduce MTTR/MTTX Create and optimize dashboards, alerts, and monitoring configurations using tools like Prometheus, Grafana, etc. Architect and implement scalable data pipelines and microservices for real-time and batch data processing Utilize GCP tools including BigQuery, Dataflow, Dataproc, Data Fusion, and others Develop end-to-end solutions using Spring Boot, Python, Angular, and REST APIs Design and manage relational and NoSQL databases including PostgreSQL, MySQL, and BigQuery Implement best practices in data governance, RBAC, encryption, and security within cloud environments Ensure automation and reliability through CI/CD, Terraform, and orchestration tools like Airflow and Tekton Drive full-cycle SDLC processes including design, coding, testing, deployment, and monitoring Collaborate closely with software architects, DevOps, and cross-functional teams for solution delivery

Core Skills Required Proficiency in Spring Boot, Angular, Java, and Python Experience in developing microservices and SOA-based systems Cloud-native development experience, preferably on Google Cloud Platform (GCP) Strong understanding of HTML, CSS, JavaScript/TypeScript, and modern frontend frameworks Experience with
infrastructure automation and monitoring tools Working knowledge of data engineering technologies: PySpark, Airflow, Apache Beam, Kafka, and similar Strong grasp of RESTful APIs, GitHub, and TDD methodologies

Preferred Skills GCP Professional Certifications (e.g., Data Engineer, Cloud Developer) Hands-on experience with Terraform, Cloud SQL, Data Governance tools, and security frameworks Exposure to performance tuning, cost optimization, and observability best practices

Experience Required 5+ years of experience in full-stack and cloud-based application development Strong track record in building distributed, scalable systems Prior experience with observability and performance monitoring tools is a plus

Educational Qualifications Bachelor’s Degree in Computer Science, Information Technology, or a related field (mandatory)

Skills: java, data fusion, html, dataflow, terraform, spring boot, restful apis, python, angular, dataproc, microservices, apache beam, css, cloud sql, soa, typescript, tdd, kafka, javascript, airflow, github, pyspark, bigquery, gcp
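The latency, traffic, error, and saturation monitoring this role centers on corresponds to the four "golden signals" of observability. As a rough illustration only (not part of the posting; the record shape, window size, and capacity figure below are invented, and a real platform would scrape these from Prometheus rather than compute them in-process), here is how those signals could be derived for one monitoring window:

```python
from dataclasses import dataclass
from statistics import quantiles

@dataclass
class Request:
    latency_ms: float
    status: int

def golden_signals(requests, window_s, capacity_rps):
    """Summarize the four golden signals for one monitoring window."""
    traffic = len(requests) / window_s                              # requests/sec
    errors = sum(1 for r in requests if r.status >= 500) / max(len(requests), 1)
    lat = sorted(r.latency_ms for r in requests)
    # p95 latency: last of 19 cut points when splitting into 20 quantiles
    p95 = quantiles(lat, n=20)[-1] if len(lat) >= 2 else (lat[0] if lat else 0.0)
    saturation = traffic / capacity_rps                             # share of capacity
    return {"traffic_rps": traffic, "error_rate": errors,
            "latency_p95_ms": p95, "saturation": saturation}

reqs = [Request(120, 200), Request(80, 200), Request(450, 503), Request(95, 200)]
signals = golden_signals(reqs, window_s=2, capacity_rps=10)
print(signals["traffic_rps"], signals["error_rate"])  # 2.0 0.25
```

In a production stack the same four numbers would drive the Grafana dashboards and Prometheus alert rules the posting mentions.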

Posted 1 week ago

Apply

6.0 years

5 - 9 Lacs

Chennai

On-site

Job Purpose As a Lead Software Development Engineer in Test (SDET) on the Viewpoint team at Trimble, you will lead the test automation strategy, execution, and process optimization for large-scale web and mobile applications. In this role, you will mentor junior SDETs, work closely with development and product teams, and ensure quality through continuous testing and automation best practices. You will be accountable for driving test automation across platforms (web, iOS, Android), defining scalable frameworks, and establishing CI/CD-integrated quality gates. Your contribution will be critical to ensuring smooth, high-quality releases for Trimble Viewpoint’s mission-critical enterprise software used in the global construction industry. What You Will Do Define, implement, and evolve the overall test automation strategy for the Viewpoint product suite Build and maintain scalable, reusable test automation frameworks using C# for web and Appium/Selenium for mobile (iOS/Android) Provide technical leadership to the SDET team, including reviewing test architecture, test cases, and automation code Champion quality-first principles across Agile teams and guide integration of testing into all stages of the development lifecycle Set up and manage cloud-based testing infrastructure using Sauce Labs, emulators/simulators, and physical devices Develop test strategies for API, functional, regression, performance, and cross-platform compatibility testing Lead root cause analysis of complex issues in coordination with development and QA teams Drive continuous improvements in test coverage, speed, and reliability across mobile and web Design dashboards and metrics to track test effectiveness, code coverage, and defect trends Collaborate with product managers, architects, and engineering leaders to align quality initiatives with business goals Help integrate test automation into CI/CD pipelines and maintain quality gates for every release Evaluate and recommend new tools, frameworks, 
and processes to improve automation and testing workflows Mentor junior SDETs and foster a high-performance quality culture within the engineering team What Skills & Experience You Should Have Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related technical field 6+ years of experience in software testing or SDET roles with at least 2+ years in a lead or senior QA/SDET capacity Advanced proficiency in test automation using C#, including frameworks like MSTest, NUnit, or xUnit Strong hands-on experience with Selenium, Appium, and mobile automation testing for iOS and Android Experience with Sauce Labs or similar device farms/cloud-based testing platforms Expertise in functional, regression, API, and performance testing Solid experience working in Agile teams, participating in sprint planning, estimations, and retrospectives Deep understanding of CI/CD pipelines, including integration of automated tests in build and deployment flows Prior experience with defect tracking systems (JIRA) and test case management tools (e.g., TestRail, Zephyr) Familiarity with testing RESTful services, backend workflows, and microservice architectures Excellent problem-solving skills, with a mindset for root-cause analysis and continuous improvement Strong verbal and written communication skills with the ability to influence stakeholders and drive quality initiatives Viewpoint – Engineering Context You will be part of the Trimble Viewpoint team building enterprise software solutions for construction management. 
Viewpoint’s technology stack includes: C#, ASP.NET (Core/Framework), Web API, Angular, OData, and Microsoft SQL Server Integration with Azure Functions, Azure Service Bus, Azure Storage, and Apache Kafka RESTful services, Microservices, and modern frontend technologies Enterprise-grade CI/CD pipelines and Agile workflows You’ll work alongside experienced full-stack engineers, product managers, and other QA professionals to deliver production-grade releases at scale. Reporting Structure This position reports to a Technical Project Manager or Engineering Manager within the Viewpoint organization. About Trimble Trimble is a technology company transforming the way the world works by delivering solutions that connect the physical and digital worlds. Core technologies in positioning, modeling, connectivity, and data analytics improve productivity, quality, safety, and sustainability across industries like construction, agriculture, transportation, and geospatial. Visit www.trimble.com to learn more. Trimble’s Inclusiveness Commitment We believe in celebrating our differences. Our diversity is our strength. We strive to build an inclusive workplace where everyone belongs and can thrive. Programs and practices at Trimble ensure individuals are seen, heard, welcomed—and most importantly—valued.
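The "scalable, reusable test automation frameworks" responsibility above usually rests on the Page Object pattern: tests talk to a screen abstraction rather than raw locators, so a UI change touches one class instead of every test. A minimal sketch follows, written in Python for brevity even though Viewpoint's suite is C# with Selenium/Appium; the page name and locator ids are invented, and a mock stands in for a real WebDriver:

```python
import unittest
from unittest.mock import Mock

class LoginPage:
    """Page object: the only place that knows this screen's locators."""
    def __init__(self, driver):
        self.driver = driver  # a real Selenium/Appium driver in an actual suite

    def login(self, user, password):
        self.driver.find_element("id", "user").send_keys(user)
        self.driver.find_element("id", "password").send_keys(password)
        self.driver.find_element("id", "submit").click()

class LoginPageTest(unittest.TestCase):
    def test_login_touches_three_elements(self):
        driver = Mock()  # stand-in WebDriver so the sketch runs anywhere
        LoginPage(driver).login("alice", "s3cret")
        self.assertEqual(driver.find_element.call_count, 3)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(LoginPageTest)
ok = unittest.TextTestRunner(verbosity=0).run(suite).wasSuccessful()
print(ok)  # True
```

The same shape carries over to C# with NUnit/MSTest and a Sauce Labs-hosted driver: only the `LoginPage` class changes when the UI does.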

Posted 1 week ago

Apply

5.0 - 8.0 years

6 - 9 Lacs

Chennai

On-site

Job Purpose Design and develop end-to-end software solutions that power innovative products at Trimble. Leverage your expertise in C#, ASP.NET (Framework/Core), Web API, Angular, and Microsoft Azure services to build scalable and high-performance web applications. This role involves hands-on full-stack development, including responsive front-end UI, robust server-side logic, and secure, cloud-integrated backend services. You will work in an Agile team environment, collaborating with cross-functional teams to deliver impactful digital solutions while maintaining high code quality, performance, and security standards. Primary Responsibilities Understand high-level product and technical requirements and convert them into scalable full-stack software designs. Develop server-side applications using C#, ASP.NET Core/Framework, Web API, and Entity Framework. Build intuitive and responsive front-end interfaces using Angular, JavaScript, HTML, and CSS. Design, develop, and maintain RESTful APIs, including OData APIs, ensuring proper versioning and security. Integrate authentication and authorization mechanisms using industry standards. Work with Microsoft SQL Server for designing schemas, writing queries, and optimizing performance. Build microservices and modular web components adhering to best practices. Develop and deploy Azure Functions, utilize Azure Service Bus, and manage data using Azure Storage. Integrate with messaging systems such as Apache Kafka for distributed event processing. Contribute to CI/CD workflows, manage source control using Git, and participate in code reviews and team development activities. Write and maintain clean, well-documented, and testable code with unit and integration test coverage. Troubleshoot and resolve performance, scalability, and maintainability issues across the stack. Support production deployments and maintain operational excellence for released features. 
Stay current with evolving technologies and development practices to improve team efficiency and product quality. Skills and Background Strong proficiency in C# and .NET Framework 4.x / .NET Core Solid experience in ASP.NET MVC / ASP.NET Core, Web API, and Entity Framework / EF Core Knowledge of OData APIs, REST principles, and secure web communication practices Front-end development experience using JavaScript, Angular (preferred), HTML5, CSS3 Proficient with Microsoft SQL Server including query tuning, indexing, and stored procedures Experience with Authentication & Authorization (OAuth, JWT, Claims-based Security) Experience building microservices and using Web Services Hands-on with Azure Functions, Azure Service Bus, and Azure Storage Experience integrating and processing messages using Apache Kafka Knowledge of source control systems like Git, and experience in Agile development environments Exposure to unit testing frameworks, integration testing, and DevOps practices Ability to write clean, maintainable, and well-structured code Excellent problem-solving, debugging, and troubleshooting skills Strong communication and collaboration skills Work Experience 5–8 years of experience as a Full Stack Engineer or Software Developer Proven experience delivering scalable web applications and services in a production environment Experience in Agile/Scrum teams and cross-cultural collaboration Tier-1 or Tier-2 product company or equivalent high-performance team experience preferred Minimum Required Qualification Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related discipline from a Tier-1 or Tier-2 institute. Reporting The individual selected for this role will report to a Technical Project Manager, Engineering Manager, Engineering Director, or another designated leader within the division. 
About Trimble Dedicated to the world’s tomorrow, Trimble is a technology company delivering solutions that enable our customers to work in new ways to measure, build, grow and move goods for a better quality of life. Core technologies in positioning, modeling, connectivity, and data analytics connect the digital and physical worlds to improve productivity, quality, safety, transparency, and sustainability. From purpose-built products and enterprise lifecycle solutions to industry cloud services, Trimble is transforming critical industries such as construction, geospatial, agriculture, and transportation to power an interconnected world of work. For more information, visit: www.trimble.com Trimble’s Inclusiveness Commitment We believe in celebrating our differences. That is why our diversity is our strength. To us, that means actively participating in opportunities to be inclusive. Diversity, Equity, and Inclusion have guided our current success while also moving our desire to improve. We actively seek to add members to our community who represent our customers and the places we live and work. We have programs in place to ensure our people are seen, heard, and welcomed—and most importantly, that they know they belong, no matter who they are or where they come from.

Posted 1 week ago

Apply

3.0 years

4 - 10 Lacs

Chennai

Remote

Your work days are brighter here. At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture. A culture which was driven by our value of putting our people first. And ever since, the happiness, development, and contribution of every Workmate is central to who we are. Our Workmates believe a healthy employee-centric, collaborative culture is the essential mix of ingredients for success in business. That’s why we look after our people, communities and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don’t need to hide who you are. You can feel the energy and the passion, it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here. At Workday, we value our candidates’ privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask for you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or pay for consulting or coaching services, in order to apply for a job at Workday. About the Team If you thrive on tackling significant technical challenges, delivering scalable solutions for mission-critical platforms, and collaborating closely with world-class engineers, you will love being a part of our Technology Product Management team! You'll help to build the foundational services that power Workday's enterprise cloud, impacting millions of users globally. 
About the Role We’re looking for a Technical Product Manager who is deeply curious about complex distributed systems with a track record of driving innovation within established platforms. Above all, we are seeking a Product Manager who excels at driving technical strategy, making astute trade-offs that balance innovation with system stability, and translating complex technical requirements into actionable, engineering-ready roadmaps. Experience with AWS and data storage and retrieval technologies like Apache Parquet and Apache Iceberg is a plus! If you are a natural collaborator and a great storyteller, capable of working seamlessly with senior engineering leaders and architects around the world, and love diving deep into the intricate details of distributed system design and implementation, we strongly encourage you to apply! About You Basic Qualifications 3+ years of experience in technical product management. A college degree in Computer Science or an equivalent technical degree; or at least 5 years of proven experience at a software company in product management or a similar role Other Qualifications Always brings data-informed arguments to the forefront with SQL and Python-based data analysis Can get software developers to enthusiastically build on top of your product Flexible and adaptable to change Can design scalable, reliable, business-critical systems for large customers Experience with distributed processing and scheduling; indexing and search technologies; devops-related initiatives to improve developer experience, automation, and operational stability; or system health and monitoring Our Approach to Flexible Work With Flex Work, we’re combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work.
We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!


Posted 1 week ago

Apply

4.0 years

4 - 20 Lacs

Chennai

On-site

Job Summary: We are seeking an experienced and results-driven GCP Data Engineer with over 4 years of hands-on experience in building and optimizing data pipelines and architectures using Google Cloud Platform (GCP). The ideal candidate will have strong expertise in data integration, transformation, and modeling, with a focus on delivering scalable, efficient, and secure data solutions. This role requires a deep understanding of GCP services, big data processing frameworks, and modern data engineering practices.

Key Responsibilities: Design, develop, and deploy scalable and reliable data pipelines on Google Cloud Platform. Build data ingestion processes from various structured and unstructured sources using Cloud Dataflow, Pub/Sub, BigQuery, and other GCP tools. Optimize data workflows for performance, reliability, and cost-effectiveness. Implement data transformations, cleansing, and validation using Apache Beam, Spark, or Dataflow. Work closely with data analysts, data scientists, and business stakeholders to understand data needs and translate them into technical solutions. Ensure data security and compliance with company and regulatory standards. Monitor, troubleshoot, and enhance data systems to ensure high availability and accuracy. Participate in code reviews, design discussions, and continuous integration/deployment processes. Document data processes, workflows, and technical specifications.

Required Skills: Minimum 4 years of experience in data engineering with at least 2 years working on GCP. Strong proficiency in GCP services such as BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, Cloud Functions, and Vertex AI (preferred). Hands-on experience in SQL, Python, and Java/Scala for data processing and transformation. Experience with ETL/ELT development, data modeling, and data warehousing concepts. Familiarity with CI/CD pipelines, version control (Git), and DevOps practices.
Solid understanding of data security, IAM, encryption, and compliance within cloud environments. Experience with performance tuning, workload management, and cost optimization in GCP.

Preferred Qualifications: GCP Professional Data Engineer Certification. Experience with real-time data processing using Kafka, Dataflow, or Pub/Sub. Familiarity with Terraform, Cloud Build, or infrastructure-as-code tools. Exposure to data quality frameworks and observability tools. Previous experience in an agile development environment.

Job Types: Full-time, Permanent Pay: ₹473,247.51 - ₹2,000,000.00 per year Schedule: Monday to Friday Application Question(s): Mention Your Last Working Date Experience: Google Cloud Platform: 4 years (Preferred) Python: 4 years (Preferred) ETL: 4 years (Preferred) Work Location: In person
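The ingestion, cleansing, and validation responsibilities this posting describes follow a common parse → validate → load shape. Here is a minimal pure-Python sketch of the middle two steps (the event fields and validation rules are invented for illustration; in a real pipeline each step would be an Apache Beam transform running on Dataflow, fed by Pub/Sub and writing to BigQuery):

```python
import json

# Hypothetical raw events as they might arrive off a message bus.
RAW = [
    '{"id": "1", "amount": "42.50", "country": "IN"}',
    '{"id": "2", "amount": "oops",  "country": "IN"}',   # bad amount
    '{"id": "3", "amount": "10.00"}',                    # missing country
]

def parse(line):
    return json.loads(line)

def validate(rec):
    """Coerce types and enforce required fields; return None to reject."""
    try:
        rec["amount"] = float(rec["amount"])
    except (KeyError, ValueError):
        return None
    if "country" not in rec:
        return None
    return rec

clean = [r for r in (validate(parse(line)) for line in RAW) if r is not None]
print(len(clean), clean[0]["amount"])  # 1 42.5
```

In Beam terms the same flow reads roughly as `ReadFromPubSub | Map(parse) | Map(validate) | Filter(not-None) | WriteToBigQuery`, with rejected records typically diverted to a dead-letter table instead of dropped.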

Posted 1 week ago

Apply

5.0 years

6 - 8 Lacs

Chennai

On-site

As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of Current State Receivables and Originations data in our data warehouse, performing impact analysis related to Ford Credit North America's modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for Ford Credit. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas and who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and 3rd party technologies for deployment on Google Cloud Platform. GCP certified Professional Data Engineer Successfully designed and implemented data warehouses and ETL processes for over five years, delivering high-quality data solutions. 5+ years of complex SQL development experience 2+ years of experience with programming languages such as Python, Java, or Apache Beam. Experienced cloud engineer with 3+ years of GCP expertise, specializing in managing cloud infrastructure and applications to production-scale solutions.
In-depth understanding of GCP’s underlying architecture and hands-on experience with crucial GCP services, especially those related to data processing (batch/real-time), leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage including Cloud Storage, and DevOps tools such as Tekton, GitHub, Terraform, and Docker. Expert in designing, optimizing, and troubleshooting complex data pipelines. Experience developing and deploying microservices architectures leveraging container orchestration frameworks Experience in designing pipelines and architectures for data processing. Passion and self-motivation to develop/experiment/implement state-of-the-art data engineering methods/techniques. Self-directed, work independently with minimal supervision, and adapts to ambiguous environments. Evidence of a proactive problem-solving mindset and willingness to take the initiative. Strong prioritization, collaboration & coordination skills, and ability to simplify and communicate complex ideas with cross-functional teams and all levels of management. Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity. Master’s degree in computer science, software engineering, information systems, Data Engineering, or a related field. Data engineering or development experience gained in a regulated financial environment. Experience in coaching and mentoring Data Engineers Project management tools like Atlassian JIRA Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment. Experience with data security, governance, and compliance best practices in the cloud.
Experience using data science concepts on production datasets to generate insights Design and build production data engineering solutions on Google Cloud Platform (GCP) using services such as BigQuery, Dataflow, Dataform, Astronomer, Data Fusion, Dataproc, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Artifact Registry, GCP APIs, Cloud Build, App Engine, and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub. Design new solutions to better serve AI/ML needs. Lead teams to expand our AI-enabled services. Partner with governance teams to tackle key business needs. Collaborate with stakeholders and cross-functional teams to gather and define data requirements and ensure alignment with business objectives. Partner with analytics teams to understand how value is created using data. Partner with central teams to leverage existing solutions to drive future products. Design and implement batch, real-time streaming, scalable, and fault-tolerant solutions for data ingestion, processing, and storage. Create insights into existing data to fuel the creation of new data products. Perform necessary data mapping, impact analysis for changes, root cause analysis, and data lineage activities, documenting information flows. Implement and champion an enterprise data governance model. Actively promote data protection, sharing, reuse, quality, and standards to ensure data integrity and confidentiality. Develop and maintain documentation for data engineering processes, standards, and best practices. Ensure knowledge transfer and ease of system maintenance. Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures. Provide production support by addressing production issues as per SLAs. Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure. Work within an agile product team.
Deliver code frequently using Test-Driven Development (TDD), continuous integration, and continuous deployment (CI/CD). Continuously enhance your domain knowledge. Stay current on the latest data engineering practices. Contribute to the company's technical direction while maintaining a customer-centric approach.
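To make the "merging historical data from legacy platforms with data ingested from new platforms" responsibility concrete, here is a minimal self-contained sketch of a keep-latest-per-key merge (the field names, dates, and balances are invented; a production version would run as a Dataflow or BigQuery `MERGE` job over full tables rather than in-process Python):

```python
def merge_latest(*sources):
    """Merge record sources, keeping the newest version of each account."""
    latest = {}
    for source in sources:
        for rec in source:
            key = rec["account_id"]
            # Later as_of wins; ISO date strings compare correctly as text.
            if key not in latest or rec["as_of"] > latest[key]["as_of"]:
                latest[key] = rec
    return sorted(latest.values(), key=lambda r: r["account_id"])

legacy = [{"account_id": "A1", "as_of": "2023-01-31", "balance": 500}]
modern = [{"account_id": "A1", "as_of": "2024-06-30", "balance": 350},
          {"account_id": "B2", "as_of": "2024-06-30", "balance": 900}]

merged = merge_latest(legacy, modern)
print([r["balance"] for r in merged])  # [350, 900]
```

The TDD workflow the posting asks for fits naturally here: the expected merged output is written as a failing test first, then the merge logic is implemented against it.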

Posted 1 week ago

Apply

3.0 - 5.0 years

7 - 10 Lacs

Chennai

On-site

TransUnion's Job Applicant Privacy Notice What We'll Bring: TransUnion is a global information and insights company that makes trust possible in the modern economy. We do this by providing a comprehensive picture of each person so they can be reliably and safely represented in the marketplace. As a result, businesses and consumers can transact with confidence and achieve great things. We call this Information for Good.® A leading presence in more than 30 countries across five continents, TransUnion provides solutions that help create economic opportunity, great experiences and personal empowerment for hundreds of millions of people. What You'll Bring: As a consultant on our team, you will join a global group of statisticians, data scientists, and industry experts on a mission to extract insights from data and put them to good use. You will have an opportunity to be a part of a variety of analytical projects in a collaborative environment and be recognized for the work you deliver. TransUnion offers a culture of lifelong learning and as an associate here, your growth potential is limitless. The consultant role within the Research and Consulting team is responsible for delivering market-level business intelligence both to TransUnion’s senior management and to Financial Services customers. You will work on projects across international markets, including Canada, Hong Kong, UK, South Africa, Philippines, and Colombia. To be successful in this position, you must have good organizational skills, a strategic mindset, and a flexible predisposition. You will also be expected to operate independently and be able to lead and present projects with minimal supervision.
How You'll Contribute:
You will develop a strong understanding of consumer credit data and how it applies to industry trends and research across different international markets
You will dig in by extracting data and performing segmentation and statistical analyses on large population datasets (using languages such as R, SQL, and Python on Linux and PC computing platforms)
You will conduct analyses and quantitative research studies designed to understand complex industry trends and dynamics, leveraging a variety of statistical techniques
You will deliver analytic insights and recommendations in succinct and compelling presentations for internal and external customers at various levels, including an executive audience; you may lead key presentations to clients
You will perform multiple tasks simultaneously and deal with changing requirements and deadlines
You will develop strong consulting skills to be able to help external customers by understanding their business needs and aligning them with TransUnion's product offerings and capabilities
You will help to cultivate an environment that promotes excellence, innovation, and a collegial spirit
Through all these efforts, you will be a key contributor to driving the perception of TransUnion as an authority on lending dynamics and a worthwhile, trusted partner to our clients and prospects

Impact You'll Make:
A Bachelor's or Master's degree in Statistics, Applied Mathematics, Operations Research, Economics, or an equivalent discipline
Minimum 3-5 years of experience in a relevant field, such as data analytics, lending, or risk strategy
Advanced proficiency with one or more statistical programming languages such as R
Advanced proficiency writing SQL queries for data extraction
Experience with big data platforms (e.g., Apache Hadoop, Apache Spark) preferred
Advanced experience with the MS Office suite, particularly Word, Excel, and PowerPoint
Strong time management skills with the ability to prioritize and contribute to multiple assignments simultaneously
Excellent verbal and written communication skills; you must be able to clearly articulate ideas to both technical and non-technical audiences
Highly analytical mindset with the curiosity to dig deeper into data, trends, and consumer behavior
A strong interest in the areas of banking, consumer lending, and finance is paramount, with a curiosity as to why consumers act the way they do with their credit
Strong work ethic with a passion for team success

This is a hybrid position and involves regular performance of job responsibilities virtually as well as in person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title: Consultant, Research & Consulting
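The segmentation and statistical analysis work described above can be sketched in Python with pandas. The score bands, column names, and synthetic data below are illustrative assumptions only, not TransUnion's actual schema or segmentation scheme.

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for a consumer credit extract (illustrative columns).
rng = np.random.default_rng(42)
n = 10_000
df = pd.DataFrame({
    "credit_score": rng.integers(300, 851, n),
    "balance": rng.gamma(2.0, 1500.0, n).round(2),
})

# Segment consumers into score bands, then summarise each band.
bands = [300, 580, 670, 740, 800, 851]
labels = ["subprime", "near-prime", "prime", "prime-plus", "super-prime"]
df["score_band"] = pd.cut(df["credit_score"], bins=bands, labels=labels, right=False)

summary = (
    df.groupby("score_band", observed=True)["balance"]
      .agg(["count", "mean", "median"])
      .round(2)
)
print(summary)
```

In practice the input frame would come from a SQL extract rather than synthetic data, but the band-then-aggregate pattern is the same.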

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Chennai

On-site

Mandatory Skills:
4-6 years of experience with basic proficiency in Python and SQL, and familiarity with libraries like NumPy or Pandas
Understanding of fundamental programming concepts (data structures, algorithms, etc.)
Eagerness to learn new tools and frameworks, including Generative AI technologies
Familiarity with version control systems (e.g., Git)
Strong problem-solving skills and attention to detail
Exposure to data processing tools like Apache Spark or PySpark, and SQL
Basic understanding of APIs and how to integrate them
Interest in AI/ML and willingness to explore frameworks like LangChain
Familiarity with cloud platforms (AWS, Azure, or GCP) is a plus

Job Description:
We are seeking a motivated Python Developer to join our team. The ideal candidate will have a foundational understanding of Python programming and SQL, and a passion for learning and growing in the field of software development. You will work closely with senior developers and contribute to building and maintaining applications, with opportunities to explore Generative AI frameworks and data processing tools.

Key Responsibilities:
Assist in developing and maintaining Python-based applications
Write clean, efficient, and well-documented code
Collaborate with senior developers to integrate APIs and frameworks
Support data processing tasks using libraries like Pandas or PySpark
Learn and work with Generative AI frameworks (e.g., LangChain, LangGraph) under guidance
Debug and troubleshoot issues in existing applications

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa.
We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
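A minimal sketch of the kind of Pandas data-processing task this role describes (cleaning incomplete records, then aggregating). The table and column names are made up for illustration.

```python
import pandas as pd

# Toy order records; names and values are illustrative only.
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4, 5],
    "region":   ["north", "south", "north", None, "south"],
    "amount":   [120.0, 80.5, None, 200.0, 45.0],
})

# Typical cleaning task: drop rows with missing fields, then aggregate.
clean = orders.dropna(subset=["region", "amount"])
totals = clean.groupby("region")["amount"].sum()
print(totals.to_dict())  # → {'north': 120.0, 'south': 125.5}
```

The same dropna-then-groupby pattern carries over to PySpark DataFrames at larger scale.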

Posted 1 week ago

Apply

8.0 - 10.0 years

4 - 5 Lacs

Chennai

On-site

Mandatory Skills:
8-10 years of experience
Strong proficiency in Python and SQL, and experience with data processing libraries (e.g., Pandas, PySpark)
Familiarity with Generative AI frameworks like LangChain, LangGraph, or similar tools
Experience integrating APIs from pre-trained AI models (e.g., OpenAI, Cohere, Hugging Face)
Solid understanding of data structures, algorithms, and distributed systems
Experience with vector databases (e.g., Pinecone, Postgres)
Familiarity with prompt engineering and chaining AI workflows
Understanding of MLOps practices for deploying and monitoring AI applications
Strong problem-solving skills and ability to work in a collaborative environment
Good to have: experience with Streamlit to build application front-ends

Job Description:
We are looking for an experienced Python Developer with expertise in Spark, SQL, data processing, and building Generative AI applications. The ideal candidate will focus on leveraging existing AI models and frameworks (e.g., LangChain, LangGraph) to create innovative, data-driven solutions. This role does not involve designing new AI models, but rather integrating and utilizing pre-trained models to solve real-world problems.

Key Responsibilities:
Develop and deploy Generative AI applications using Python and frameworks like LangChain or LangGraph
Work with large-scale data processing frameworks like Apache Spark and SQL to prepare and manage data pipelines
Integrate pre-trained AI models (e.g., OpenAI, Hugging Face, Llama) into scalable applications
Understand ML and NLP concepts and algorithms, with exposure to Scikit-learn and PyTorch
Collaborate with data engineers and product teams to design AI-driven solutions
Optimize application performance and ensure scalability in production environments

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody.
When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
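The "chaining AI workflows" idea this posting mentions can be illustrated framework-free. The sketch below stubs out the model call (a real application would call a hosted API such as OpenAI or Hugging Face) and does not reproduce LangChain's actual API; it only shows the pattern of composing prompt steps.

```python
from typing import Callable

# Stand-in for a call to a hosted pre-trained model; a real implementation
# would issue an HTTP request to the provider here.
def fake_llm(prompt: str) -> str:
    return f"[model answer to: {prompt}]"

def make_step(template: str, llm: Callable[[str], str]) -> Callable[[str], str]:
    """Wrap a prompt template and a model call into one chainable step."""
    return lambda text: llm(template.format(input=text))

# Chain two steps: summarise the input, then translate the summary.
summarise = make_step("Summarise this text: {input}", fake_llm)
translate = make_step("Translate to French: {input}", fake_llm)

result = translate(summarise("Quarterly revenue grew 12% year over year."))
print(result)
```

Frameworks like LangChain package this same composition (plus retries, tracing, and provider integrations) behind their own abstractions.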

Posted 1 week ago

Apply

3.0 years

1 - 4 Lacs

Noida

On-site

Job Title: CloudStack Administrator
Location: Noida
Department: IT Infrastructure / Cloud Services
Reports To: IT Manager / Cloud Operations Head
Experience: 3–7 years
Salary: 20k–40k per month

Job Summary:
We are seeking a skilled CloudStack Administrator to manage and support our private/public/hybrid cloud infrastructure using Apache CloudStack. The ideal candidate will have hands-on experience in deploying, maintaining, and optimizing CloudStack-based environments and ensuring system performance, availability, and scalability.

Key Responsibilities:
Install, configure, and manage Apache CloudStack environments
Maintain virtual machines, templates, volumes, snapshots, and zones
Monitor system performance and proactively address potential issues
Apply updates, patches, and security configurations to infrastructure components
Manage cloud storage (Ceph, NFS, etc.), hypervisors (KVM/Xen/VMware), and networking (VLAN, SDN)
Troubleshoot system issues and provide root cause analysis
Integrate with authentication systems (LDAP/Active Directory) and automation tools (Ansible, Terraform)
Coordinate with development and DevOps teams to support deployment requirements
Implement backup, disaster recovery, and high-availability strategies
Document infrastructure, processes, and procedures
Ensure compliance with internal and external security standards and policies

Required Skills & Qualifications:
3+ years of hands-on experience with Apache CloudStack
Strong knowledge of Linux (RHEL/CentOS/Ubuntu) system administration
Experience with hypervisors (KVM preferred, Xen or VMware optional)
Good understanding of cloud networking concepts (VLANs, firewalls, load balancers)
Familiarity with storage systems (Ceph, NFS, iSCSI)
Knowledge of scripting (Bash, Python, etc.) for automation
Experience with monitoring tools (Zabbix, Nagios, Prometheus, etc.)
Familiarity with APIs and integration methods for cloud platforms
Excellent troubleshooting, communication, and documentation skills

Interested candidates can share their resume at bhumika.kukreti@netforchoice.com or 9310405492.

Job Type: Full-time
Pay: ₹10,143.83 - ₹40,000.00 per month
Benefits: Health insurance, Provident Fund
Schedule: Day shift
Work Location: In person
Expected Start Date: 06/08/2025
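Programmatic work against the CloudStack API, which this role's automation duties would involve, uses signed HTTP requests. A minimal sketch of the signing scheme described in the CloudStack API documentation (sort the parameters, lowercase the URL-encoded query string, HMAC-SHA1 with the secret key, base64-encode) is shown below; the keys and endpoint are placeholders, not real credentials.

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def sign_request(params: dict, secret_key: str) -> str:
    """Compute a CloudStack API signature: sort params by key, build the
    URL-encoded query string, lowercase it, HMAC-SHA1 with the secret key,
    then base64-encode the digest."""
    query = "&".join(
        f"{k}={quote(str(v), safe='')}" for k, v in sorted(params.items())
    )
    digest = hmac.new(
        secret_key.encode(), query.lower().encode(), hashlib.sha1
    ).digest()
    return base64.b64encode(digest).decode()

# Placeholder credentials; a real call would append signature=<urlencoded sig>
# to a request against http(s)://<management-server>:8080/client/api.
params = {"command": "listVirtualMachines", "apikey": "MY_API_KEY", "response": "json"}
signature = sign_request(params, "MY_SECRET_KEY")
print(signature)
```

In day-to-day administration this signing is usually handled by CloudMonkey or client SDKs, but understanding it helps when debugging 401 responses from the management server.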

Posted 1 week ago

Apply
