5.0 years
10 Lacs
Noida
On-site
We at Beesolver Technologies are looking for an experienced Senior Laravel/PHP Developer with a strong foundation in backend development, MySQL, and e-commerce systems. The ideal candidate will design scalable applications, integrate data pipelines, and drive feature delivery in a collaborative Agile team.

Job Title: PHP Laravel Developer – Ecommerce & Data Integration
Job Type: Full-Time
Experience: 5+ years
Location: Noida / Chd.-Mohali
Work Timing: 10 AM – 7 PM
Industry: IT Services / E-commerce / SaaS
Functional Area: Software Development

Roles and Responsibilities:
- Develop secure and scalable backend applications using Laravel (PHP)
- Design, optimize, and manage MySQL schemas, including performance tuning
- Build and maintain ETL/ESB pipelines for data synchronization across systems
- Work with Laravel queues, events, jobs, and the scheduler
- Develop REST APIs and manage third-party integrations (shipping, CRM, payments)
- Collaborate in cross-functional Agile teams with developers, testers, and product owners
- Implement and follow best practices in code quality, testing, and CI/CD

Technical Skills:
- 5+ years of PHP development experience (3+ years in Laravel)
- Strong experience with MySQL (joins, indexes, stored procedures, tuning)
- ETL/ESB knowledge using Laravel Jobs, Talend, or Apache NiFi
- Skilled in REST API design, integration, and OAuth/webhooks

Ecommerce Domain Knowledge:
- Hands-on experience with major ecommerce platforms
- Strong understanding of ecommerce business processes, including:
  - Product catalogs, variants, and SKU management
  - Order lifecycle management (cart, checkout, order placement, payment, fulfilment, return/refund)
  - Inventory management, stock sync, and warehouse integration
  - Shipping and logistics API integrations
  - ERP and CRM system integration for unified data flow
- Knowledge of ecommerce KPIs and data reporting (RFM, CLTV, conversion rate)

Preferred Skills (Good to Have):
- Experience with RabbitMQ, Kafka, or any messaging system
- Exposure to Talend, Apache NiFi, or Pentaho
- Familiarity with DDD and clean/hexagonal architecture patterns
- Basic experience with cloud platforms: AWS, Azure, or GCP

Education:
UG: B.Tech/B.E. in Computer Science or IT, or BCA
PG: MCA or M.Tech (preferred)

Job Types: Full-time, Permanent
Pay: Up to ₹89,785.33 per month
Benefits: Paid time off, Provident Fund
Schedule: Day shift
Supplemental Pay: Yearly bonus
Work Location: In person
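The KPI reporting mentioned above (RFM, frequency/monetary aggregation, conversion rate) is language-agnostic; here is a minimal stdlib-only sketch in Python. The field names (`customer`, `date`, `amount`) are illustrative assumptions, not taken from the posting.

```python
from collections import defaultdict
from datetime import date

def rfm_summary(orders, today):
    """Aggregate per-customer Recency (days since most recent order),
    Frequency (order count), and Monetary (total spend)."""
    summary = defaultdict(lambda: {"recency": None, "frequency": 0, "monetary": 0.0})
    for o in orders:  # each order: {"customer", "date", "amount"}
        s = summary[o["customer"]]
        s["frequency"] += 1
        s["monetary"] += o["amount"]
        days = (today - o["date"]).days
        if s["recency"] is None or days < s["recency"]:
            s["recency"] = days
    return dict(summary)

def conversion_rate(sessions, orders_placed):
    """Fraction of sessions that resulted in a placed order."""
    return orders_placed / sessions if sessions else 0.0

orders = [
    {"customer": "c1", "date": date(2024, 1, 10), "amount": 50.0},
    {"customer": "c1", "date": date(2024, 1, 20), "amount": 30.0},
    {"customer": "c2", "date": date(2024, 1, 5), "amount": 100.0},
]
report = rfm_summary(orders, today=date(2024, 1, 31))
```

In a Laravel application the same aggregation would typically live in a scheduled job or a SQL report; the sketch only shows the arithmetic behind the KPI names.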
Posted 13 hours ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Overview: TekWissen is a global workforce management provider operating throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities, and the planet.

Job Title: Software Engineer Senior
Location: Chennai
Work Type: Hybrid

Position Description: This fast-paced position is intended for people who like to build analytics platforms and tooling that deliver real value to the business. Applicants should have a strong desire to learn new technologies and an interest in providing guidance that will help drive the adoption of these tools. The Analytics Data Management (ADM) Product Engineer will assist with the engineering of strategic data management platforms from Informatica, primarily Enterprise Data Catalog, and Apache NiFi. Other technologies include Informatica IICS/IDMC, PowerCenter, Data Catalog, and Master Data Management; IBM Information Server and Cloud Pak for Data (CP4D); and Google Cloud Data Fusion. This person will also collaborate with Infrastructure Architects to design and implement environments based on these technologies for use in the client's enterprise data centers. Platforms may be based on-premises or hosted in a Google Cloud offering.

Skills Required: Informatica
Skills Preferred: Cloud Infrastructure

Experience Required:
- Informatica: IICS/IDMC, PowerCenter, Data Catalog, Master Data Management; Apache NiFi
- Informatica products: installation, configuration, administration, and troubleshooting. Specific experience with Informatica Data Catalog is essential.
- Apache NiFi: strong Java development experience to create custom NiFi processors, and expertise in deploying and managing NiFi applications on Red Hat OS environments.
- Google Cloud Platform (GCP): provisioning, administration, and troubleshooting of products. Specific experience with Dataplex or Google Cloud Data Fusion (CDF) is highly preferred.

Experience Range: 5-8 years

Summary of Responsibilities:
- Engineer, test, and modernize data management platforms, primarily Informatica Enterprise Data Catalog and Apache NiFi
- Enable cloud migrations for analytics platforms
- Define, document, and monitor global (follow-the-sun) support procedures (Incident Management, Request Management, Event Management, etc.)
- Provide Asia-Pacific (IST) 2nd-level support for these products

Responsibilities Detail: Installing and configuring products; working with platform support teams to resolve issues; working with vendor support to resolve issues; thoroughly testing product functionality on the platform; developing custom installation guides, configurations, and scripts consistent with the client's IT security policy; providing 2nd-level support for product-related issues; developing new tools and processes to ensure effective implementation and use of the technologies; implementing monitoring/alerting and analyzing usage data to ensure optimal performance of the infrastructure; and maintaining a SharePoint site with relevant documentation, FAQs, processes, etc., to promote and support the use of these technologies.

Required Skills:
- Ability to collect and clearly document requirements
- Ability to prioritize work and manage multiple assignments
- Ability to create and execute detailed project plans and test plans

Education Required: Bachelor's degree
Education Preferred: Bachelor's degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
Posted 1 day ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Us: Paytm is India's leading mobile payments and financial services distribution company. A pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm's mission is to serve half a billion Indians and bring them into the mainstream economy with the help of technology.

Job Summary:
- Build systems for collection and transformation of complex data sets for use in production systems
- Collaborate with engineers on building and maintaining back-end services
- Implement data schema and data management improvements for scale and performance
- Provide insights into key performance indicators for the product and customer usage
- Serve as the team's authority on data infrastructure, privacy controls, and data security
- Collaborate with appropriate stakeholders to understand user requirements
- Support efforts for continuous improvement, metrics, and test automation
- Maintain operations of the live service as issues arise, on a rotational on-call basis
- Verify that the data architecture meets security and compliance requirements and expectations
- Learn fast and adapt quickly

Minimum Qualifications:
- Bachelor's degree in computer science, computer engineering, or a related field, or equivalent experience
- 3+ years of progressive experience demonstrating strong architecture, programming, and engineering skills
- Firm grasp of data structures and algorithms, with fluency in programming languages such as Java, Python, and Scala
- Strong SQL skills, with the ability to write complex queries
- Strong experience with orchestration tools like Airflow
- Demonstrated ability to lead, partner, and collaborate cross-functionally across many engineering organizations
- Experience with streaming technologies such as Apache Spark, Kafka, and Flink
- Backend experience including Apache Cassandra, MongoDB, and relational databases such as Oracle and PostgreSQL
- 4+ years of solid hands-on experience with AWS/GCP
- Strong communication and soft skills
- Knowledge of and/or experience with containerized environments, Kubernetes, and Docker
- Experience implementing and maintaining highly scalable microservices with REST, Spring Boot, and gRPC
- Appetite for trying new things and building rapid POCs

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines to support data ingestion, processing, and storage
- Implement data integration solutions to consolidate data from multiple sources into a centralized data warehouse or data lake
- Collaborate with data scientists and analysts to understand data requirements and translate them into technical specifications
- Ensure data quality and integrity by implementing robust data validation and cleansing processes
- Optimize data pipelines for performance, scalability, and reliability
- Develop and maintain ETL (Extract, Transform, Load) processes using tools such as Apache Spark, Apache NiFi, or similar technologies
- Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal downtime
- Implement best practices for data management, security, and compliance
- Document data engineering processes, workflows, and technical specifications
- Stay up to date with industry trends and emerging technologies in data engineering and big data

Compensation: If you are the right fit, we believe in creating wealth for you. With 500 mn+ registered users, 25 mn+ merchants, and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants – and we are committed to it. India's largest digital lending story is brewing here. It's your opportunity to be a part of the story!
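The data validation and cleansing responsibility described above can be sketched as a small stdlib-only Python pass. The schema and rules (required fields `id` and `amount`, de-duplication on `id`) are illustrative assumptions; a production pipeline would run equivalent logic inside Spark or NiFi.

```python
def clean_records(records, required=("id", "amount")):
    """Drop records missing required fields, coerce amount to float,
    and de-duplicate on id, keeping the first occurrence."""
    seen, accepted, rejected = set(), [], []
    for r in records:
        if any(r.get(f) in (None, "") for f in required):
            rejected.append(r)          # missing a required field
            continue
        if r["id"] in seen:
            rejected.append(r)          # duplicate key
            continue
        seen.add(r["id"])
        accepted.append({**r, "amount": float(r["amount"])})
    return accepted, rejected

raw = [
    {"id": "a1", "amount": "10.5"},
    {"id": "a1", "amount": "10.5"},   # duplicate
    {"id": None, "amount": "3.0"},    # missing key
    {"id": "a2", "amount": 7},
]
good, bad = clean_records(raw)
```

Returning the rejected records alongside the accepted ones is what makes data-quality metrics (reject rate, duplicate rate) cheap to report downstream.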
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Role: We are seeking a highly skilled and experienced Data Architect with expertise in designing and building data platforms in cloud environments. The ideal candidate will have a strong background in either AWS Data Engineering or Azure Data Engineering, along with proficiency in distributed data processing systems like Spark. Additionally, proficiency in SQL, data modeling, and building data warehouses, plus knowledge of ingestion tools and data governance, are essential for this role. The Data Architect will also need experience with orchestration tools such as Airflow or Dagster and proficiency in Python, with knowledge of Pandas being beneficial.

Why Choose Ideas2IT? Ideas2IT has all the good attributes of both a product startup and a services company. Because we launch our own products, you will have ample opportunities to learn and contribute. However, single-product companies can stagnate in the technologies they use; in our multiple product initiatives and customer-facing projects, you will have the opportunity to work on a variety of technologies. AGI is going to change the world. Big companies like Microsoft are betting heavily on this, and we are following suit.

What's in it for you?
- Work on impactful products instead of back-office applications, for customers like Facebook, Siemens, Roche, and more
- Work on interesting projects like the Cloud AI platform for personalized cancer treatment
- Continuously learn newer technologies
- Freedom to bring your ideas to the table and make a difference, instead of being a small cog in a big wheel
- Showcase your talent in Shark Tanks and hackathons conducted in the company

Here's what you'll bring:
- Experience in designing and building data platforms in any cloud
- Strong expertise in either AWS Data Engineering or Azure Data Engineering
- Ability to develop and optimize data processing pipelines using distributed systems like Spark
- Ability to create and maintain data models to support efficient storage and retrieval
- Ability to build and optimize data warehouses for analytical and reporting purposes, using technologies such as Postgres, Redshift, and Snowflake
- Knowledge of ingestion tools such as Apache Kafka, Apache NiFi, AWS Glue, or Azure Data Factory
- Ability to establish and enforce data governance policies and procedures to ensure data quality and security
- Experience using orchestration tools like Airflow or Dagster to schedule and manage data workflows
- Ability to develop scripts and applications in Python to automate tasks and processes
- Ability to collaborate with stakeholders to gather requirements and translate them into technical specifications
- Ability to communicate technical solutions effectively to clients and stakeholders
- Familiarity with multiple cloud ecosystems such as AWS, Azure, and Google Cloud Platform (GCP)
- Experience with containerization and orchestration technologies like Docker and Kubernetes
- Knowledge of machine learning and data science concepts
- Experience with data visualization tools such as Tableau or Power BI
- Understanding of DevOps principles and practices

About Us: Ideas2IT stands at the intersection of Technology, Business, and Product Engineering, offering high-caliber product development services. Initially conceived as a CTO consulting firm, we've evolved into thought leaders in cutting-edge technologies such as Generative AI, assisting our clients in embracing innovation. Our forte lies in applying technology to address business needs, demonstrated by our track record of developing AI-driven solutions for industry giants like Facebook, Bloomberg, Siemens, Roche, and others. Harnessing our product-centric approach, we've incubated several AI-based startups – including Pipecandy, Element5, IdeaRx, and Carefi.in – that have flourished into successful ventures backed by venture capital. With fourteen years of remarkable growth behind us, we're steadfast in pursuing ambitious objectives.
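At their core, orchestration tools like Airflow and Dagster run tasks in dependency order. A dependency-resolving sketch in plain Python (the task names are hypothetical, and the standard-library `graphlib` module stands in for a real scheduler):

```python
from graphlib import TopologicalSorter

# Upstream dependencies per task, Airflow-style: extract -> transform -> validate -> load
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform", "validate"},
}

def run_pipeline(dag, task_fns):
    """Execute tasks in an order that respects the declared dependencies."""
    executed = []
    for task in TopologicalSorter(dag).static_order():
        task_fns.get(task, lambda: None)()  # no-op for tasks without a callable
        executed.append(task)
    return executed

order = run_pipeline(dag, {})
```

Real orchestrators add retries, scheduling, backfills, and observability on top of this ordering; the sketch only shows the dependency-resolution step that makes "schedule and manage data workflows" tractable.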
Posted 2 days ago
5.0 - 7.0 years
4 - 10 Lacs
Hyderābād
On-site
Description

The U.S. Pharmacopeial Convention (USP) is an independent scientific organization that collaborates with the world's top authorities in health and science to develop quality standards for medicines, dietary supplements, and food ingredients. USP's fundamental belief that Equity = Excellence manifests in our core value of Passion for Quality through our more than 1,300 hard-working professionals across twenty global locations to deliver the mission to strengthen the supply of safe, quality medicines and supplements worldwide. At USP, we value inclusivity for all. We recognize the importance of building an organizational culture with meaningful opportunities for mentorship and professional growth. From the standards we create, the partnerships we build, and the conversations we foster, we affirm the value of Diversity, Equity, Inclusion, and Belonging in building a world where everyone can be confident of quality in health and healthcare. USP is proud to be an equal employment opportunity employer (EEOE) and affirmative action employer. We are committed to creating an inclusive environment in all aspects of our work – an environment where every employee feels fully empowered and valued irrespective of, but not limited to, race, ethnicity, physical and mental abilities, education, religion, gender identity and expression, life experience, sexual orientation, country of origin, regional differences, work experience, and family status. We are committed to working with and providing reasonable accommodation to individuals with disabilities.

Brief Job Overview: The Digital & Innovation group at USP is seeking Full Stack Developers with programming skills in cloud technologies to build innovative digital products. We are seeking someone who understands the power of digitization and can help drive an amazing digital experience for our customers.

How will YOU create impact here at USP? In this role at USP, you contribute to USP's public health mission of increasing equitable access to high-quality, safe medicine and improving global health through public standards and related programs. In addition, as part of our commitment to our employees, Global, People, and Culture, in partnership with the Equity Office, regularly invests in the professional development of all people managers. This includes training in inclusive management styles and other competencies necessary to ensure engaged and productive work environments.

The Sr. Software Engineer/Software Engineer has the following responsibilities:
- Build scalable applications and platforms using cutting-edge cloud technologies
- Constantly review and upgrade systems based on governance principles and security policies
- Participate in code reviews, architecture discussions, and agile development processes to ensure high-quality, maintainable, and scalable code
- Document and communicate technical designs, processes, and solutions to both technical and non-technical stakeholders

Who is USP Looking For? The successful candidate will have a demonstrated understanding of our mission, a commitment to excellence through inclusive and equitable behaviors and practices, and the ability to quickly build credibility with stakeholders, along with the following competencies and experience:

Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field

Experience:
- Sr. Software Engineer: 5-7 years of experience in software development, with a focus on cloud computing
- Software Engineer: 2-4 years of experience in software development, with a focus on cloud computing
- Strong knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud) and services, including compute, storage, networking, and security
- Extensive knowledge of Java Spring Boot applications and design principles
- Strong programming skills in languages such as Python
- Good experience with AWS/Azure services such as EC2, S3, IAM, Lambda, RDS, DynamoDB, API Gateway, and CloudFormation
- Knowledge of cloud architecture patterns, best practices, and security principles
- Familiarity with data pipeline/ETL/orchestration tools such as Apache NiFi, AWS Glue, or Apache Airflow
- Good experience with front-end technologies like React.js and Node.js
- Strong experience in microservices and automated testing practices
- Experience leading initiatives related to continuous improvement or implementation of new technologies
- Works independently on most deliverables
- Strong analytical and problem-solving skills, with the ability to develop creative solutions to complex problems
- Ability to manage multiple projects and priorities in a fast-paced, dynamic environment

Additional Desired Preferences:
- Experience with scientific chemistry nomenclature, prior work experience in life sciences, chemistry, or hard sciences, or a degree in the sciences
- Experience with pharmaceutical datasets and nomenclature
- Experience with containerization technologies such as Docker and Kubernetes is a plus
- Experience working with knowledge graphs
- Ability to explain complex technical issues to a non-technical audience
- Self-directed and able to handle multiple concurrent projects and prioritize tasks independently
- Able to make tough decisions when trade-offs are required to deliver results
- Strong communication skills required: verbal, written, and interpersonal

Supervisory Responsibilities: No

Benefits: USP provides benefits to protect you and your family today and tomorrow. From company-paid time off and comprehensive healthcare options to retirement savings, you can have peace of mind that your personal and financial well-being is protected.
Posted 2 days ago
2.0 years
4 - 7 Lacs
Chennai
On-site
Information Developer - OSS Assurance

This role has been designed as 'Onsite', with an expectation that you will primarily work from an HPE office.

Who We Are: Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.

Job Description: Aruba is an HPE company and a leading provider of next-generation network access solutions for the mobile enterprise. Helping some of the largest companies in the world modernize their networks to meet the demands of a digital future, Aruba is redefining the "Intelligent Edge" – and creating new customer experiences across intelligent spaces and digital workspaces. Join us to redefine what's next for you.

Job Family Definition: Applies specialized knowledge to conceptualize, design, develop, unit-test, configure, and implement portions of new or enhanced (upgrades or conversions) business and technical software solutions through application of appropriate standard software development life cycle methodologies and processes. Interacts with the client and project roles (e.g., Project Manager, Business Analyst, Data Engineer) as required to gain an understanding of the business environment, technical context, and organizational strategic direction. Defines scope, plans, and deliverables for assigned components. Understands and uses appropriate tools to analyze, identify, and resolve business and/or technical problems. Applies metrics to monitor performance and measure key project parameters. Prepares system documentation. Conforms to security and quality standards. Stays current on emerging tools, techniques, and technologies.

Management Level Definition: Contributions include applying an intermediate level of subject matter expertise to solve common technical problems. Acts as an informed team member providing analysis of information and recommendations for appropriate action. Works independently within an established framework and with moderate supervision.

What you'll do:
Responsibilities:
- Participates as a member of the development team
- Completes development of units with designs prepared by more senior developers
- Participates in code reviews; prepares and executes unit tests
- Applies growing technical knowledge to maintain a technology area (e.g., website development); may perform unit design
- Applies company and third-party technologies to software solutions of moderate complexity
- Configures end-user or enterprise systems designed by more senior technologists

Education and Experience Required: Typically a technical bachelor's degree or equivalent experience, and a minimum of 2 years of related experience.

What you need to bring:
Knowledge and Skills:
- A minimum of 2 years of relevant experience in the telecom domain, preferably in Operations Support Systems (OSS)
- Experience with NiFi, Telegraf, Java, Python, Kubernetes (K8s), and REST APIs
- Hands-on experience with Linux and tools such as grep, sed, and awk
- Good understanding of infrastructure (VMware, RHV) and databases (Oracle, Postgres, etc.)
- Experience with an OSS product portfolio such as HPE vTEMIP, UCA, UTM, IBM Tivoli, Netcool, etc., will be an added advantage
- Good communication skills (written and verbal) are mandatory
Additional Skills: Accountability, Action Planning, Active Learning, Active Listening, Bias, Business, Business Growth, Business Planning, Coaching, Commercial Acumen, Creativity, Critical Thinking, Cross-Functional Teamwork, Customer Experience Strategy, Data Analysis Management, Data Collection Management, Data Controls, Deliverables Management, Design Thinking, Development Methodologies, Empathy, Follow-Through, Growth Mindset, Intellectual Curiosity, and 8 more

What We Can Offer You:
Health & Wellbeing: We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial, and emotional wellbeing.
Personal & Professional Development: We also invest in your career, because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have – whether you want to become a knowledge expert in your field or apply your skills to another division.
Unconditional Inclusion: We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.

Let's Stay Connected: Follow @HPECareers on Instagram to see the latest on people, culture, and tech at HPE. #india #aruba

Job: Services
Job Level: TCP_02

HPE is an Equal Employment Opportunity / Veterans / Disabled / LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together.
Hewlett Packard Enterprise is EEO Protected Veteran/ Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
Posted 2 days ago
4.0 - 9.0 years
4 - 5 Lacs
Noida, Uttar Pradesh, India
On-site
Application Support Engineer (Python + NiFi), Experience: 3+ years. Location: Chennai/Noida/Mumbai. CTC: 14-18 LPA. Requirements: Python code reading; able to troubleshoot, analyse logs, and support applications; coordinate with development teams for issue resolution.
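The log analysis this role describes usually starts with triage: counting recurring errors so the noisiest failure surfaces first. A minimal stdlib-only Python sketch; the log line format (`timestamp LEVEL message`) is an assumption for illustration.

```python
import re
from collections import Counter

# Assumed format: "YYYY-MM-DD HH:MM:SS LEVEL message..."
LOG_LINE = re.compile(r"^(?P<ts>\S+ \S+) (?P<level>[A-Z]+) (?P<msg>.*)$")

def summarize_errors(lines):
    """Count ERROR messages by their first token so recurring failures stand out."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("msg").split()[0]] += 1
    return counts

logs = [
    "2024-05-01 10:00:01 INFO flow started",
    "2024-05-01 10:00:02 ERROR ConnectTimeout to upstream",
    "2024-05-01 10:00:03 ERROR ConnectTimeout to upstream",
    "2024-05-01 10:00:04 ERROR SchemaMismatch in payload",
]
top = summarize_errors(logs)
```

The same triage is what `grep ERROR app.log | awk '{print $4}' | sort | uniq -c` does on the command line; having it as a function makes it easy to extend (time windows, correlation IDs) when escalating to development teams.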
Posted 2 days ago
10.0 years
0 Lacs
Greater Kolkata Area
Remote
Java Back End Engineer with AWS
Location: Remote
Experience: 10+ years
Employment Type: Full-Time

Job Overview: We are looking for a highly skilled Java Back End Engineer with strong AWS cloud experience to design and implement scalable backend systems and APIs. You will work closely with cross-functional teams to develop robust microservices, optimize database performance, and contribute across the tech stack, including infrastructure automation.

Core Responsibilities:
- Design, develop, and deploy scalable microservices using Java, J2EE, Spring, and Spring Boot
- Build and maintain secure, high-performance APIs and backend services on AWS or GCP
- Use JUnit and Mockito to ensure test-driven development and maintain code quality
- Develop and manage ETL workflows using tools like Pentaho, Talend, or Apache NiFi
- Create High-Level Design (HLD) and architecture documentation for system components
- Collaborate with cross-functional teams (DevOps, Frontend, QA) as a full-stack contributor when needed
- Tune SQL queries and manage performance on MySQL and Amazon Redshift
- Troubleshoot and optimize microservices for performance and scalability
- Use Git for source control and participate in code reviews and architectural discussions
- Automate infrastructure provisioning and CI/CD processes using Terraform, Bash, and pipelines
Primary Skills:
- Languages & Frameworks: Java (8/17/21), Spring Boot, J2EE, Servlets, JSP, JDBC, Struts
- Architecture: Microservices, REST APIs
- Cloud Platforms: AWS (EC2, S3, Lambda, RDS, CloudFormation, SQS, SNS) or GCP
- Databases: MySQL, Redshift

Secondary Skills (Good to Have):
- Infrastructure as Code (IaC): Terraform
- Additional Languages: Python, Node.js
- Frontend Frameworks: React, Angular, JavaScript
- ETL Tools: Pentaho, Talend, Apache NiFi (or equivalent)
- CI/CD & Containers: Jenkins, GitHub Actions, Docker, Kubernetes
- Monitoring/Logging: AWS CloudWatch, Datadog
- Scripting: Bash, shell scripting

Nice to Have:
- Familiarity with agile software development practices
- Experience in a cross-functional engineering environment
- Exposure to DevOps culture and tools
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a senior-level Data Engineer with Machine Learning Analyst capabilities, you will play a crucial role in leading the architecture, development, and management of scalable data solutions. Your expertise in data architecture, big data pipeline development, and data quality enhancement will be key in processing large-scale datasets and supporting machine learning workflows. Your key responsibilities will include designing, developing, and maintaining end-to-end data pipelines for ingestion, transformation, and delivery across various business systems. You will ensure robust data quality, data lineage, data reconciliation, and governance practices. Additionally, you will architect and manage data warehouse and big data solutions supporting both structured and unstructured data. Optimizing and automating ETL/ELT processes for high-volume data environments will be essential, with a focus on processing 5B+ records. Collaborating with data scientists and analysts to support machine learning workflows and implementing streamlined DAAS workflows will also be part of your role. To succeed in this position, you must have at least 10 years of experience in data engineering, including data architecture and pipeline development. Your proven experience with Spark and Hadoop clusters for processing large-scale datasets, along with a strong understanding of ETL frameworks, data quality processes, and automation best practices, will be critical. Experience in data ingestion, lineage, governance, and reconciliation, as well as a solid understanding of data warehouse design principles and data modeling, are must-have skills. Expertise in automated data processing, especially for DAAS platforms, is essential. 
Desirable skills for this role include experience with Apache HBase, Apache NiFi, and other big data tools; knowledge of distributed computing principles and real-time data streaming; familiarity with machine learning pipelines and supporting data structures; and exposure to data cataloging and metadata management tools. Proficiency in Python, Scala, or Java for data engineering tasks is also beneficial. In addition to technical skills, soft skills are required: a strong analytical and problem-solving mindset, excellent communication skills for collaboration across technical and business teams, and the ability to work independently, manage multiple priorities, and lead data initiatives. If you are excited about the opportunity to work as a Data Engineer with Machine Learning Analyst capabilities and possess the necessary skills and experience, we look forward to receiving your application.
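The data reconciliation practice this role calls out often reduces to comparing counts plus checksums between a source and a target system. A simplified stdlib-only sketch (the row keys are hypothetical, and the additive hash is a heuristic, not a collision-proof guarantee):

```python
import hashlib

def fingerprint(rows):
    """Order-insensitive (count, checksum) over canonicalized rows:
    each row is serialized with sorted keys, hashed, and the hashes summed."""
    digest = 0
    for row in rows:
        canon = "|".join(f"{k}={row[k]}" for k in sorted(row))
        digest += int(hashlib.sha256(canon.encode()).hexdigest(), 16)
    return len(rows), digest

def reconcile(source_rows, target_rows):
    """True when both sides agree on row count and checksum."""
    return fingerprint(source_rows) == fingerprint(target_rows)

src = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
tgt = [{"id": 2, "qty": 3}, {"id": 1, "qty": 5}]  # same rows, different order
ok = reconcile(src, tgt)
```

At 5B+ record scale the same idea is applied per partition (e.g., per day or per key range) so a mismatch pinpoints where lineage broke rather than just that it broke.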
Posted 2 days ago
10.0 - 13.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Aeris: For more than three decades, Aeris has been a trusted cellular IoT leader enabling the biggest IoT programs and opportunities across Automotive, Utilities and Energy, Fleet Management and Logistics, Medical Devices, and Manufacturing. Our IoT technology expertise serves a global ecosystem of 7,000 enterprise customers and 30 mobile network operator partners, and 80 million IoT devices across the world. Aeris powers today’s connected smart world with innovative technologies and borderless connectivity that simplify management, enhance security, optimize performance, and drive growth. Built from the ground up for IoT and road-tested at scale, Aeris IoT Services are based on the broadest technology stack in the industry, spanning connectivity up to vertical solutions. As veterans of the industry, we know that implementing an IoT solution can be complex, and we pride ourselves on making it simpler. Our company is in an enviable spot. We’re profitable, and both our bottom line and our global reach are growing rapidly. We’re playing in an exploding market where technology evolves daily and new IoT solutions and platforms are being created at a fast pace. A few things to know about us: We put our customers first. When making decisions, we always seek to do what is right for our customer first, our company second, our teams third, and individual selves last. We do things differently. As a pioneer in a highly competitive industry that is poised to reshape every sector of the global economy, we cannot fall back on old models. Rather, we must chart our own path and strive to out-innovate, out-learn, out-maneuver and out-pace the competition on the way. We walk the walk on diversity. We’re a brilliant and eclectic mix of ethnicities, religions, industry experiences, sexual orientations, generations and more – and that’s by design. We see diverse perspectives as a core competitive advantage. Integrity is essential.
We believe in doing things well – and doing them right. Integrity is a core value here: you’ll see it embodied in our staff, our management approach and growing social impact work (we have a VP devoted to it). You’ll also see it embodied in the way we manage people and our HR issues: we expect employees and managers to deal with issues directly, immediately and with the utmost respect for each other and for the Company. We are owners. Strong managers enable and empower their teams to figure out how to solve problems. You will be no exception, and will have the ownership, accountability and autonomy needed to be truly creative. Position Title: Senior Tech Lead Experience Required: 10-13 years Location: Noida Job Description – Essential Duties & Responsibilities: Research, design and development of next-generation applications supporting billions of transactions and large volumes of data. Design and development of cloud-based solutions with extensive hands-on experience on Big Data, distributed programming, ETL workflows and orchestration tools. Design and development of microservices in Java / J2EE, Node JS, with experience in containerization using Docker and Kubernetes. Focus should be on developing cloud-native applications utilizing cloud services. Work with product managers/owners & internal as well as external customers following Agile methodology. Practice rapid iterative product development to mature promising concepts into successful products. Execute with a sense of urgency to drive ideas into products through the innovation life-cycle, demo, and evangelize. Should be experienced in using GenAI for faster development. Skills Required: Proven experience of developing high-performing, scalable cloud applications using various cloud development stacks & services. Proven experience of Containers, GCP, AWS Cloud platforms. Deep skills in Java / Python / Node JS / SQL / PLSQL. Working experience with Spring Boot, ORM, JPA, Transaction Management, Concurrency, Design Patterns.
Good understanding of NoSQL databases like MongoDB. Experience with workflow and orchestration tools like NiFi and Airflow would be a big plus. Deep understanding of software engineering best practices: design principles and patterns, unit testing, performance engineering. Good understanding of distributed architecture, plug-ins and APIs. Prior experience with security, cloud, and container security is a great advantage. Hands-on experience in building applications on various platforms, with deep focus on usability, performance and integration with downstream REST web services. Exposure to Generative AI models, prompt engineering, or integration with APIs like OpenAI, Cohere, or Google Gemini. Qualifications/Requirements: B.Tech/Masters in Computer Science/Engineering, Electrical/Electronic Engineering. Aeris may conduct background checks to verify the information provided in your application and assess your suitability for the role. The scope and type of checks will comply with the applicable laws and regulations of the country where the position is based. Additional detail will be provided via the formal application process. Aeris walks the walk on diversity. We’re a brilliant mix of varying ethnicities, religions, cultures, sexual orientations, gender identities, ages and professional/personal/military experiences – and that’s by design. Diverse perspectives are essential to our culture, innovative process and competitive edge. Aeris is proud to be an equal opportunity employer.
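As a small illustration of the resilience expected when integrating with downstream REST services, here is a generic retry-with-backoff sketch (not Aeris code; `flaky_call` is an invented stand-in for a real HTTP client call):

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn(), retrying with exponential backoff on failure.
    Re-raises the last error once attempts are exhausted."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, ...

# Stand-in for a flaky downstream REST call: fails twice, then succeeds.
calls = {"count": 0}
def flaky_call():
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("transient downstream failure")
    return {"status": "ok"}
```

Wrapping the call as `with_retries(flaky_call)` absorbs the two transient failures and returns the third attempt's response; in a real microservice the same pattern would typically sit behind a circuit breaker as well.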
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Aeris: For more than three decades, Aeris has been a trusted cellular IoT leader enabling the biggest IoT programs and opportunities across Automotive, Utilities and Energy, Fleet Management and Logistics, Medical Devices, and Manufacturing. Our IoT technology expertise serves a global ecosystem of 7,000 enterprise customers and 30 mobile network operator partners, and 80 million IoT devices across the world. Aeris powers today’s connected smart world with innovative technologies and borderless connectivity that simplify management, enhance security, optimize performance, and drive growth. Built from the ground up for IoT and road-tested at scale, Aeris IoT Services are based on the broadest technology stack in the industry, spanning connectivity up to vertical solutions. As veterans of the industry, we know that implementing an IoT solution can be complex, and we pride ourselves on making it simpler. Our company is in an enviable spot. We’re profitable, and both our bottom line and our global reach are growing rapidly. We’re playing in an exploding market where technology evolves daily and new IoT solutions and platforms are being created at a fast pace. A few things to know about us: We put our customers first. When making decisions, we always seek to do what is right for our customer first, our company second, our teams third, and individual selves last. We do things differently. As a pioneer in a highly competitive industry that is poised to reshape every sector of the global economy, we cannot fall back on old models. Rather, we must chart our own path and strive to out-innovate, out-learn, out-maneuver and out-pace the competition on the way. We walk the walk on diversity. We’re a brilliant and eclectic mix of ethnicities, religions, industry experiences, sexual orientations, generations and more – and that’s by design. We see diverse perspectives as a core competitive advantage. Integrity is essential.
We believe in doing things well – and doing them right. Integrity is a core value here: you’ll see it embodied in our staff, our management approach and growing social impact work (we have a VP devoted to it). You’ll also see it embodied in the way we manage people and our HR issues: we expect employees and managers to deal with issues directly, immediately and with the utmost respect for each other and for the Company. We are owners. Strong managers enable and empower their teams to figure out how to solve problems. You will be no exception, and will have the ownership, accountability and autonomy needed to be truly creative. Position Title: Tech Lead Experience Required: 6-10 years Job Description – Essential Duties & Responsibilities: Research, design and development of next-generation applications supporting billions of transactions and large volumes of data. Design and development of cloud-based solutions with extensive hands-on experience on Big Data, distributed programming, ETL workflows and orchestration tools. Design and development of microservices in Java / J2EE, Node JS, with experience in containerization using Docker and Kubernetes. Focus should be on developing cloud-native applications utilizing cloud services. Work with product managers/owners & internal as well as external customers following Agile methodology. Practice rapid iterative product development to mature promising concepts into successful products. Execute with a sense of urgency to drive ideas into products through the innovation life-cycle, demo, and evangelize. Should be experienced in using GenAI for faster development. Skills Required: Proven experience of developing high-performing, scalable cloud applications using various cloud development stacks & services. Proven experience of Containers, GCP, AWS Cloud platforms. Deep skills in Java / Python / Node JS / SQL / PLSQL. Working experience with Spring Boot, ORM, JPA, Transaction Management, Concurrency, Design Patterns.
Good understanding of NoSQL databases like MongoDB. Experience with workflow and orchestration tools like NiFi and Airflow would be a big plus. Deep understanding of software engineering best practices: design principles and patterns, unit testing, performance engineering. Good understanding of distributed architecture, plug-ins and APIs. Prior experience with security, cloud, and container security is a great advantage. Hands-on experience in building applications on various platforms, with deep focus on usability, performance and integration with downstream REST web services. Exposure to Generative AI models, prompt engineering, or integration with APIs like OpenAI, Cohere, or Google Gemini. Qualifications/Requirements: B.Tech/Masters in Computer Science/Engineering, Electrical/Electronic Engineering. Aeris may conduct background checks to verify the information provided in your application and assess your suitability for the role. The scope and type of checks will comply with the applicable laws and regulations of the country where the position is based. Additional detail will be provided via the formal application process. Aeris walks the walk on diversity. We’re a brilliant mix of varying ethnicities, religions, cultures, sexual orientations, gender identities, ages and professional/personal/military experiences – and that’s by design. Diverse perspectives are essential to our culture, innovative process and competitive edge. Aeris is proud to be an equal opportunity employer.
Posted 3 days ago
6.0 - 15.0 years
0 Lacs
karnataka
On-site
You have a unique opportunity to join as an Integration Architect specializing in Apache NiFi and Kubernetes. Your primary responsibilities will involve developing and performing automated builds, testing, and deployments in conjunction with Apache NiFi v2. This role requires a minimum of 15 years of relevant experience and an in-depth understanding of various technologies and tools. Your expertise should include proficiency in the Linux CLI, extensive knowledge of SQL, and familiarity with ServiceNow CMDB. You must possess a good grasp of security principles, particularly OAuth basic/2.0, and IP networking concepts such as TCP, UDP, DNS, DHCP, firewalls, and IP routing. Additionally, experience with web services using SOAP/REST APIs, scripting languages like Bash/RegEx/Python/Groovy, and Java programming for custom code creation is essential. As an Integration Architect, you will be expected to build data integration workflows using NiFi, NiFi Registry, and custom NiFi processors. Performance tuning of NiFi processing, working with Apache Kafka, and following Agile methodology are also crucial aspects of the role. Your responsibilities will extend to designing, deploying, and managing Kubernetes clusters, infrastructure-as-code tools like Crossplane, and container orchestration. Proficiency in GitOps practices, container monitoring/logging tools, networking principles, and identity/access management tools is highly desirable. You will play a pivotal role in maintaining Kubernetes clusters for open-source applications, implementing GitOps continuous delivery with ArgoCD, managing cloud resources with the Crossplane API, and ensuring secure access with Keycloak. Your expertise in secrets management, API gateway management, persistent storage solutions, and certificate management will be invaluable for the organization.
Furthermore, implementing security best practices, documenting procedures, and contributing to open-source projects are key elements of this dynamic role. The preferred qualifications for this position include a Bachelor's degree in computer science or a related field, Kubernetes certification, and knowledge of software-defined networking solutions for Kubernetes. Your soft skills, such as effective communication, stakeholder management, taking ownership, and autonomy, will be essential in leading technical discussions and resolving issues effectively. If you are passionate about integration architecture, possess a strong technical background, and are eager to work in a collaborative environment, this role offers a challenging yet rewarding opportunity to showcase your skills and contribute to cutting-edge projects in a fast-paced setting.
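To make the NiFi-flavored scripting above concrete, here is a dependency-free sketch in the spirit of an ExtractText + RouteOnAttribute pair. The log format, pattern, and relationship names are invented for illustration; this is not the real NiFi processor API:

```python
import re

# Illustrative pattern for a syslog-like line: timestamp, host, level, message.
LINE_RE = re.compile(r"(?P<ts>\S+) (?P<host>\S+) (?P<level>INFO|WARN|ERROR) (?P<msg>.*)")

def extract_attributes(line):
    """ExtractText-style step: promote parts of the content to attributes."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

def route(attributes):
    """RouteOnAttribute-style step: choose a relationship from the attributes."""
    return "failure" if attributes["level"] == "ERROR" else "success"
```

In a real flow, a custom Java processor (or an InvokeScriptedProcessor) would apply the same extract-then-route logic to each flowfile, with unparseable lines sent to their own relationship instead of returning `None`.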
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You will be responsible for developing scalable web applications using Python (FastAPI), React.js, and cloud-native technologies. Specifically, you will work on building a low-code/no-code AI agent platform, designing an intuitive workflow UI, and integrating with LLMs, enterprise connectors, and role-based access controls. As a Full-Stack Developer, your responsibilities will include developing and optimizing APIs using FastAPI, integrating with LangChain, Pinecone/Weaviate vector databases, and enterprise connectors like Airbyte/NiFi for backend development. For frontend development, you will build an interactive drag-and-drop workflow UI using React.js along with supporting libraries like React Flow, D3.js, and TailwindCSS. You will also be tasked with implementing authentication mechanisms such as OAuth2, Keycloak, and role-based access controls for multi-tenant environments. Database design will involve working with PostgreSQL for structured data, MongoDB for unstructured data, and Neo4j for knowledge graphs. Your role will extend to DevOps and deployment using Docker, Kubernetes, and Terraform across various cloud platforms like Azure, AWS, and GCP. Performance optimization will be crucial as you strive to enhance API performance and frontend responsiveness for an improved user experience. Collaboration with AI and Data Engineers will be essential to ensure seamless integration of AI models. To excel in this role, you should have at least 5 years of experience in FastAPI, React.js, and cloud-native applications. A strong understanding of REST APIs, GraphQL, and WebSockets is required. Experience with JWT authentication, OAuth2, and multi-tenant security is essential. Proficiency in databases such as PostgreSQL, MongoDB, Neo4j, and Redis is expected. Knowledge of workflow automation tools like n8n, Node-RED, and Temporal.io will be beneficial. Familiarity with containerization tools (Docker, Kubernetes) and CI/CD pipelines is preferred.
Any experience with Apache Kafka, WebSockets, or AI-driven chatbots would be considered a bonus.
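The role-based access controls called out above can be sketched framework-free. In a real FastAPI service this check would live in a dependency that decodes the caller's JWT; the role names, `Forbidden` type, and `publish_workflow` endpoint below are all invented for illustration:

```python
import functools

class Forbidden(Exception):
    """Raised when the caller's role is not allowed to use an endpoint."""

def require_role(*allowed_roles):
    """Decorator enforcing that the calling user carries an allowed role."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(user, *args, **kwargs):
            if user.get("role") not in allowed_roles:
                raise Forbidden(f"role {user.get('role')!r} may not call {fn.__name__}")
            return fn(user, *args, **kwargs)
        return wrapper
    return decorator

@require_role("admin", "editor")
def publish_workflow(user, workflow_id):
    """Stand-in for an endpoint that mutates a tenant's workflow."""
    return f"workflow {workflow_id} published by {user['name']}"
```

For multi-tenant environments, the same wrapper would additionally compare the user's tenant claim against the resource's tenant before allowing the call through.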
Posted 3 days ago
0 years
0 Lacs
India
On-site
Job Summary: We are looking for a skilled Senior Data Engineer with strong expertise in Spark and Scala on the AWS platform. The ideal candidate should possess excellent problem-solving skills and hands-on experience in Spark-based data processing within a cloud-based ecosystem. This role offers the opportunity to independently execute diverse and complex engineering tasks, demonstrate a solid understanding of the end-to-end software development lifecycle, and collaborate effectively with stakeholders to deliver high-quality technical solutions. Key Responsibilities: Develop, analyze, debug, and enhance Spark-Scala programs. Work on Spark batch processing jobs, with the ability to analyze/debug using Spark UI and logs. Optimize performance of Spark applications and ensure scalability and reliability. Manage data processing tasks using AWS S3, AWS EMR clusters, and other AWS services. Leverage Hadoop ecosystem tools including HDFS, HBase, Hive, and MapReduce. Write efficient and optimized SQL queries; experience with PostgreSQL and Couchbase or similar databases is preferred. Utilize orchestration tools such as Kafka, NiFi, and Oozie. Work with monitoring tools like Dynatrace and CloudWatch. Contribute to the creation of High-Level Design (HLD) and Low-Level Design (LLD) documents and participate in reviews with architects. Support development and lower environments setup, including local IDE configuration. Follow defined coding standards, best practices, and quality processes. Collaborate using Agile methodologies for development, review, and delivery. Use supplementary programming languages like Python as needed. Required Skills (Mandatory): Apache Spark; Scala; Big Data / Hadoop Ecosystem; Spark SQL. Additional Preferred Skills: Spring Core Framework; Core Java, Hibernate, Multithreading; AWS EMR, S3, CloudWatch; HDFS, HBase, Hive, MapReduce; PostgreSQL, Couchbase; Kafka, NiFi, Oozie; Dynatrace or other monitoring tools; Python (as a supplementary language); Agile Methodology.
Posted 3 days ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together. Description: United's Digital Technology team designs, develops, and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions. Find your future at United! We’re reinventing what our industry looks like, and what an airline can be – from the planes we fly to the people who fly them. When you join us, you’re joining a global team of 100,000+ connected by a shared passion with a wide spectrum of experience and skills to lead the way forward. Achieving our ambitions starts with supporting yours. Evolve your career and find your next opportunity. Get the care you need with industry-leading health plans and best-in-class programs to support your emotional, physical, and financial wellness. Expand your horizons with travel across the world’s biggest route network. Connect outside your team through employee-led Business Resource Groups. Create what’s next with us. Let’s define tomorrow together. Job Overview and Responsibilities: The Data Engineering organization is responsible for driving data-driven insights & innovation to support the data needs for commercial and operational projects with a digital focus.
The Data Engineer will be responsible for partnering with various teams to define and execute data acquisition, transformation and processing, and to make data actionable for operational and analytics initiatives that create sustainable revenue and share growth. Design, develop, and implement streaming and near-real-time data pipelines that feed systems that are the operational backbone of our business. Execute unit tests and validate expected results to ensure accuracy & integrity of data and applications through analysis, coding, writing clear documentation and problem resolution. This role will also drive the adoption of data processing and analysis within the Hadoop environment and help cross-train other members of the team. Leverage strategic and analytical skills to understand and solve customer and business-centric questions. Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies and partners. Leverage data from a variety of sources to develop data marts and insights that provide a comprehensive understanding of the business. Develop and implement innovative solutions leading to automation. Use Agile methodologies to manage projects. Mentor and train junior engineers. This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded.
Qualifications What’s needed to succeed (Minimum Qualifications): BS/BA in computer science or related STEM field; 2+ years of IT experience in software development; 2+ years of development experience using Java, Python, Scala; 2+ years of experience with Big Data technologies like PySpark, Hadoop, Hive, HBase, Kafka, NiFi; 2+ years of experience with relational database systems like MS SQL Server, Oracle, Teradata. Creative, driven, detail-oriented individuals who enjoy tackling tough problems with data and insights, and individuals who have a natural curiosity and desire to solve problems, are encouraged to apply. Must be legally authorized to work in India for any employer without sponsorship. Must be fluent in English and Hindi (written and spoken). Successful completion of interview required to meet job qualification. Reliable, punctual attendance is an essential function of the position. What will help you propel from the pack (Preferred Qualifications): Masters in computer science or related STEM field; experience with cloud-based systems like AWS, Azure or Google Cloud; Certified Developer / Architect on AWS; strong experience with continuous integration & delivery using Agile methodologies; data engineering experience with the transportation/airline industry; strong problem-solving skills; strong knowledge in Big Data.
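The duty of executing unit tests and validating expected results, listed in the responsibilities above, can be made concrete with a tiny transform-plus-check pair. The record shape and rules are invented for illustration, airline-flavored to match the domain:

```python
def normalize_flight(record):
    """Illustrative transform: trim and upper-case codes, derive a route key."""
    carrier = record["carrier"].strip().upper()
    origin = record["origin"].strip().upper()
    dest = record["dest"].strip().upper()
    return {"carrier": carrier, "route": f"{origin}-{dest}"}

def validate_flight(record):
    """Expected-result checks of the kind a pipeline unit test would assert
    before a transform is promoted to production."""
    assert len(record["carrier"]) == 2, "carrier must be a 2-letter code"
    origin, _, dest = record["route"].partition("-")
    assert origin and dest and origin != dest, "route must join two distinct airports"
    return True
```

In a streaming pipeline, the same checks typically run twice: once as unit tests in CI against fixture records, and again at runtime as data-quality gates that divert failing records to a quarantine topic.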
Posted 3 days ago
10.0 years
0 Lacs
Telangana
On-site
About Chubb: Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com. About Chubb India: At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning. Reporting to the VP COG ECM enterprise Forms Portfolio Delivery Manager, this role will be responsible for managing and supporting implementation of a new document solution for identified applications within the CCM landscape in APAC. OpenText xPression and Duck Creek have been the corporate document generation tools of choice within Chubb.
However, xPression is going end of life and will be unsupported from 2025. A new Customer Communications Management (CCM) platform – Quadient Inspire – has been selected by a global working group to replace xPression, and implementation of this new tool (including migration of existing forms/templates from xPression where applicable) is underway. Apart from migrating from xPression, there are multiple existing applications to be replaced with Quadient Inspire. The role is based in Hyderabad/India with some travel to other Chubb offices. Although there are no direct line management responsibilities within this role, the successful applicant will be responsible for task management of Business Analysts and an Onshore/Offshore development team. The role will require the ability to manage multiple project/enhancement streams with a variety of levels of technical/functional scope and across a number of different technologies. In this role, you will: Lead the design and development of comprehensive data engineering frameworks and patterns. Establish engineering design standards and guidelines for the creation, usage, and maintenance of data across COG (Chubb Overseas General). Drive innovation and build highly scalable real-time data pipelines and data platforms to support the business needs. Act as mentor and lead for the data engineering organization that is business-focused, proactive, and resilient. Promote data governance and master/reference data management as a strategic discipline. Implement strategies to monitor the effectiveness of data management. Be an engineering leader and coach data engineers and be an active member of the data leadership team. Evaluate emerging data technologies and determine their business benefits and impact on the future-state data platform.
Develop and promote a strong data management framework, emphasizing data quality, governance, and compliance with regulatory requirements. Collaborate with Data Modelers to create data models (conceptual, logical, and physical). Architect metadata management processes to ensure data lineage, data definitions, and ownership are well-documented and understood. Collaborate closely with business leaders, IT teams, and external partners to understand data requirements and ensure alignment with strategic goals. Act as a primary point of contact for data engineering discussions and inquiries from various stakeholders. Lead the implementation of data architectures on cloud platforms (AWS, Azure, Google Cloud) to improve efficiency and scalability. Qualifications: Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field; Master’s degree preferred. Minimum of 10 years’ experience in data architecture or data engineering roles, with a significant focus in P&C insurance domains preferred. Proven track record of successful implementation of data architecture within large-scale transformation programs or projects. Comprehensive knowledge of data modelling techniques and methodologies, including data normalization and denormalization practices. Hands-on expertise across a wide variety of database (Azure SQL, MongoDB, Cosmos), data transformation (Informatica IICS, Databricks), and change data capture and data streaming (Apache Kafka, Apache Flink) technologies. Proven expertise with data warehousing concepts, ETL processes, and data integration tools (e.g., Informatica, Databricks, Talend, Apache NiFi). Experience with cloud-based data architectures and platforms (e.g., AWS Redshift, Google BigQuery, Snowflake, Azure SQL Database). Expertise in ensuring data security patterns (e.g.
tokenization, encryption, obfuscation) Knowledge of insurance policy operations, regulations, and compliance frameworks specific to Consumer lines Familiarity with Agile methodologies and experience working in Agile project environments Understanding of advanced analytics, AI, and machine learning concepts as they pertain to data architecture Why Chubb? Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results. Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence A Great Place to work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026 Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results Start-Up Culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment Employee Benefits Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. 
Our benefits include: Savings and Investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), Retiral Benefits and Car Lease that help employees optimally plan their finances Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling like Education Reimbursement Programs, Certification programs and access to global learning programs. Health and Welfare Benefits: We care about our employees’ well-being in and out of work and have benefits like Employee Assistance Program (EAP), Yearly Free Health campaigns and comprehensive Insurance benefits. Application Process Our recruitment process is designed to be transparent, and inclusive. Step 1: Submit your application via the Chubb Careers Portal. Step 2: Engage with our recruitment team for an initial discussion. Step 3: Participate in HackerRank assessments/technical/functional interviews and assessments (if applicable). Step 4: Final interaction with Chubb leadership. Join Us With you Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey. Apply Now: Chubb External Careers
Posted 3 days ago
10.0 years
0 Lacs
Telangana
On-site
About Chubb Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com. About Chubb India At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning. Reporting to the VP COG ECM enterprise Forms Portfolio Delivery Manager, this role will be responsible for managing and supporting the implementation of a new document solution for identified applications within the CCM landscape in APAC. OpenText xPression and Duckcreek have been the corporate document generation tools of choice within Chubb. 
However, xPression is going end of life and will be unsupported from 2025. A new Customer Communications Management (CCM) platform, Quadient Inspire, has been selected by a global working group to replace xPression; implementation of this new tool includes migration of existing forms/templates from xPression where applicable. Apart from the migration from xPression, there are multiple existing applications to be replaced with Quadient Inspire. The role is based in Hyderabad, India, with some travel to other Chubb offices. Although there are no direct line management responsibilities within this role, the successful applicant will be responsible for task management of Business Analysts and an onshore/offshore development team. The role will require the ability to manage multiple project/enhancement streams with a variety of levels of technical/functional scope and across a number of different technologies. In this role, you will: Lead the design and development of comprehensive data engineering frameworks and patterns. Establish engineering design standards and guidelines for the creation, usage, and maintenance of data across COG (Chubb Overseas General). Drive innovation and build highly scalable real-time data pipelines and data platforms to support the business needs. Act as mentor and lead for a data engineering organization that is business-focused, proactive, and resilient. Promote data governance and master/reference data management as a strategic discipline. Implement strategies to monitor the effectiveness of data management. Be an engineering leader, coach data engineers, and be an active member of the data leadership team. Evaluate emerging data technologies and determine their business benefits and impact on the future-state data platform. 
Develop and promote a strong data management framework, emphasizing data quality, governance, and compliance with regulatory requirements. Collaborate with Data Modelers to create data models (conceptual, logical, and physical). Architect metadata management processes to ensure data lineage, data definitions, and ownership are well-documented and understood. Collaborate closely with business leaders, IT teams, and external partners to understand data requirements and ensure alignment with strategic goals. Act as a primary point of contact for data engineering discussions and inquiries from various stakeholders. Lead the implementation of data architectures on cloud platforms (AWS, Azure, Google Cloud) to improve efficiency and scalability. Qualifications Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field; Master’s degree preferred. Minimum of 10 years’ experience in data architecture or data engineering roles, with a significant focus on P&C insurance domains preferred. Proven track record of successful implementation of data architecture within large-scale transformation programs or projects. Comprehensive knowledge of data modelling techniques and methodologies, including data normalization and denormalization practices. Hands-on expertise across a wide variety of database (Azure SQL, MongoDB, Cosmos DB), data transformation (Informatica IICS, Databricks), and change data capture and data streaming (Apache Kafka, Apache Flink) technologies. Proven expertise with data warehousing concepts, ETL processes, and data integration tools (e.g., Informatica, Databricks, Talend, Apache NiFi). Experience with cloud-based data architectures and platforms (e.g., AWS Redshift, Google BigQuery, Snowflake, Azure SQL Database). Expertise in ensuring data security patterns (e.g. 
tokenization, encryption, obfuscation). Knowledge of insurance policy operations, regulations, and compliance frameworks specific to Consumer lines. Familiarity with Agile methodologies and experience working in Agile project environments. Understanding of advanced analytics, AI, and machine learning concepts as they pertain to data architecture. Why Chubb? Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience, along with a start-up-like culture, empowers you to achieve impactful results. Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence. A Great Place to Work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026. Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results. Start-up culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter. Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment. Employee Benefits Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision-related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. 
Our benefits include: Savings and Investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), Retiral Benefits and Car Lease that help employees optimally plan their finances. Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, like Education Reimbursement Programs, Certification programs and access to global learning programs. Health and Welfare Benefits: We care about our employees’ well-being in and out of work and have benefits like an Employee Assistance Program (EAP), yearly free health campaigns and comprehensive insurance benefits. Application Process Our recruitment process is designed to be transparent and inclusive. Step 1: Submit your application via the Chubb Careers Portal. Step 2: Engage with our recruitment team for an initial discussion. Step 3: Participate in HackerRank assessments and technical/functional interviews (if applicable). Step 4: Final interaction with Chubb leadership. Join Us With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey. Apply Now: Chubb External Careers
Posted 3 days ago
0 years
0 Lacs
India
Remote
Role: NiFi Developer Notice Period: Candidates serving notice or immediate joiners preferred Client: Marriott Payroll: Dminds Work Mode: Remote Interview Mode: Virtual We’re looking for someone who has built, deployed, and maintained NiFi clusters. Roles & Responsibilities:
· Implemented solutions utilizing advanced AWS components (EMR, EC2, etc.) integrated with Big Data/Hadoop distribution frameworks: Zookeeper, YARN, Spark, Scala, NiFi, etc.
· Designed and implemented Spark jobs to be deployed and run on existing active clusters.
· Configured Postgres databases on EC2 instances, ensured the application created was up and running, and troubleshot issues to reach the desired application state.
· Experience in creating and configuring secure VPCs, subnets, and security groups across private and public networks.
· Created alarms, alerts, and notifications for Spark jobs, sending job status to email and Slack group messages and logging to CloudWatch.
· Built NiFi data pipelines to process large data sets, configuring lookups for data validation and integrity.
· Generated large sets of test data with data integrity using Java, used in the development and QA phases.
· Used Spark with Scala to improve the performance of and optimize existing applications running on the EMR cluster.
· Built a Spark job to convert CSV data to custom HL7/FHIR objects using FHIR APIs.
· Deployed SNS, SQS, Lambda functions, IAM roles, custom policies, and EMR with Spark and Hadoop, including bootstrap scripts to set up the additional software needed for the job, in QA and production environments using Terraform scripts.
· Built a Spark job to perform Change Data Capture (CDC) on Postgres tables and update target tables using JDBC properties.
· Integrated a Kafka publisher into the Spark job to capture errors from the Spark application and push them into a Postgres table.
· Worked extensively on building NiFi data pipelines in a Docker container environment during the development phase.
· Worked with the DevOps team to clusterize the NiFi pipeline on EC2 nodes, integrated with Spark, Kafka, and Postgres running on other instances using SSL handshakes in QA and production environments.
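The Change Data Capture bullet above describes a common merge pattern: change records captured from the source are applied to a target table keyed on a primary key. A minimal sketch of that merge logic in plain Python (the record shapes and field names are illustrative assumptions, not the actual schema; in the posting this logic runs as a Spark job over JDBC):

```python
# Sketch of the CDC upsert pattern: apply insert/update/delete change
# records to a target table keyed by a primary key. Schema is hypothetical.

def apply_cdc(target, changes, key="id"):
    """Merge a batch of change records into the target rows."""
    table = {row[key]: dict(row) for row in target}
    for change in changes:
        op, row = change["op"], change["row"]
        if op in ("insert", "update"):
            # Upsert: merge the changed columns over the existing row, if any.
            table[row[key]] = {**table.get(row[key], {}), **row}
        elif op == "delete":
            table.pop(row[key], None)
    return sorted(table.values(), key=lambda r: r[key])

target = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
changes = [
    {"op": "update", "row": {"id": 1, "name": "a2"}},
    {"op": "delete", "row": {"id": 2}},
    {"op": "insert", "row": {"id": 3, "name": "c"}},
]
result = apply_cdc(target, changes)
print(result)
```

In a real Spark job the same merge would typically be expressed as a DataFrame join or a `MERGE INTO` against the JDBC target rather than in-memory dictionaries.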
Posted 3 days ago
2.0 - 9.0 years
0 Lacs
karnataka
On-site
We are seeking a Data Architect / Sr. Data Architect / Pr. Data Architect to join our team. In this role, you will be involved in a combination of hands-on contribution, customer engagement, and technical team management. As a Data Architect, your responsibilities will include designing, architecting, deploying, and maintaining solutions on the MS Azure platform using various Cloud & Big Data Technologies. You will be managing the full life-cycle of Data Lake / Big Data solutions, starting from requirement gathering and analysis to platform selection, architecture design, and deployment. It will be your responsibility to implement scalable solutions on the Cloud and collaborate with a team of business domain experts, data scientists, and application developers to develop Big Data solutions. Moreover, you will be expected to explore and learn new technologies for creative problem solving and mentor a team of Data Engineers. The ideal candidate should possess strong hands-on experience in implementing Data Lake with technologies such as Data Factory (ADF), ADLS, Databricks, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview. Additionally, experience with big data technologies like Hadoop (CDH or HDP), Spark, Airflow, NiFi, Kafka, Hive, HBase, MongoDB, Neo4J, Elastic Search, Impala, Sqoop, etc., is required. Proficiency in programming and debugging skills in Python and Scala/Java is essential, with experience in building REST services considered beneficial. Candidates should also have experience in supporting BI and Data Science teams in consuming data in a secure and governed manner, along with a good understanding of using CI/CD with Git, Jenkins / Azure DevOps. Experience in setting up cloud-computing infrastructure solutions, hands-on experience/exposure to NoSQL Databases, and Data Modelling in Hive are all highly valued. 
Applicants should have a minimum of 9 years of technical experience, with at least 5 years on MS Azure and 2 years on Hadoop (CDH/HDP).
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As a Snowflake Data Engineer at our organization, you will play a vital role in designing, developing, and maintaining our data infrastructure. Your responsibilities will include ingesting, transforming, and distributing data using Snowflake and AWS technologies. You will collaborate with various stakeholders to ensure efficient data pipelines and secure data operations. Your key responsibilities will involve designing and implementing data pipelines using Snowflake and AWS technologies. You will leverage tools like SnowSQL, Snowpipe, NiFi, Matillion, and DBT to ingest, transform, and automate data integration processes. Implementing role-based access controls and managing AWS resources will be crucial for ensuring data security and supporting Snowflake operations. Additionally, you will be responsible for optimizing Snowflake queries and data models for performance and scalability. To excel in this role, you should have strong proficiency in SQL and Python, along with hands-on experience with Snowflake and AWS services. Understanding ETL/ELT tools, data warehousing concepts, and data quality techniques will be essential. Your analytical skills, problem-solving abilities, and excellent communication skills will enable you to collaborate effectively with data analysts, data scientists, and other team members. Preferred skills include experience with data virtualization, machine learning, AI concepts, data governance, and data security best practices. Staying updated with the latest advancements in Snowflake and AWS technologies will be essential for this role. If you are a passionate and experienced Snowflake Data Engineer with 5 to 7 years of experience, we invite you to apply and be a part of our team. This is a full-time position based in Gurgaon, with a hybrid work mode accommodating India, UK, and US work shifts.
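Incremental ingestion pipelines like the Snowpipe/automation work described above commonly use a high-watermark: each run loads only rows newer than the last-loaded timestamp. A minimal sketch in plain Python, assuming rows carry an `updated_at` field (the names are illustrative; in practice Snowpipe, streams/tasks, or an orchestration tool would drive this rather than hand-rolled code):

```python
# Sketch of high-watermark incremental loading: select only rows newer
# than the stored watermark, then advance the watermark. Hypothetical schema.

def incremental_batch(rows, watermark, ts_key="updated_at"):
    """Return rows newer than the watermark and the new watermark value."""
    fresh = [r for r in rows if r[ts_key] > watermark]
    # If nothing is new, keep the old watermark unchanged.
    new_watermark = max((r[ts_key] for r in fresh), default=watermark)
    return fresh, new_watermark

rows = [
    {"id": 1, "updated_at": 10},
    {"id": 2, "updated_at": 25},
    {"id": 3, "updated_at": 40},
]
batch, wm = incremental_batch(rows, watermark=20)
print([r["id"] for r in batch], wm)  # rows 2 and 3 pass the watermark
```

The same idea maps directly to a warehouse query of the form `WHERE updated_at > :last_watermark`, with the watermark persisted in a control table between runs.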
Posted 4 days ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Big Data Engineer (AWS-Scala Specialist) Location: Greater Noida/Hyderabad Experience: 5-10 Years About the Role- We are seeking a highly skilled Senior Big Data Engineer with deep expertise in Big Data technologies and AWS Cloud Services. The ideal candidate will bring strong hands-on experience in designing, architecting, and implementing scalable data engineering solutions while driving innovation within the team. Key Responsibilities- Design, develop, and optimize Big Data architectures leveraging AWS services for large-scale, complex data processing. Build and maintain data pipelines using Spark (Scala) for both structured and unstructured datasets. Architect and operationalize data engineering and analytics platforms (AWS preferred; Hortonworks, Cloudera, or MapR experience a plus). Implement and manage AWS services including EMR, Glue, Kinesis, DynamoDB, Athena, CloudFormation, API Gateway, and S3. Work on real-time streaming solutions using Kafka and AWS Kinesis. Support ML model operationalization on AWS (deployment, scheduling, and monitoring). Analyze source system data and data flows to ensure high-quality, reliable data delivery for business needs. Write highly efficient SQL queries and support data warehouse initiatives using Apache NiFi, Airflow, and Kylo. Collaborate with cross-functional teams to provide technical leadership, mentor team members, and strengthen the data engineering capability. Troubleshoot and resolve complex technical issues, ensuring scalability, performance, and security of data solutions. Mandatory Skills & Qualifications- ✅ 5+ years of solid hands-on experience in Big Data technologies (AWS, Scala, Hadoop, and Spark mandatory) ✅ Proven expertise in Spark with Scala ✅ Hands-on experience with: AWS services (EMR, Glue, Lambda, S3, CloudFormation, API Gateway, Athena, Lake Formation) Share your resume at Aarushi.Shukla@coforge.com if you have experience with the mandatory skills and are an early joiner.
Posted 4 days ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Manager Software Engineer Overview We are the global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities. Our Team Within Mastercard – Data & Services The Data & Services team is a key differentiator for Mastercard, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services. Targeting Analytics Program Within the D&S Technology Team, the Targeting Analytics program is a relatively new program that is comprised of a rich set of products that provide accurate perspectives on Credit Risk, Portfolio Optimization, and Ad Insights. 
Currently, we are enhancing our customer experience with new user interfaces, moving to API-based data publishing to allow for seamless integration in other Mastercard products and externally, utilizing new data sets and algorithms to further analytic capabilities, and generating scalable big data processes. We are seeking an innovative Lead Software Engineer to lead our team in designing and building a full stack web application and data pipelines. The goal is to deliver custom analytics efficiently, leveraging machine learning and AI solutions. This individual will thrive in a fast-paced, agile environment and partner closely with other areas of the business to build and enhance solutions that drive value for our customers. Engineers work in small, flexible teams. Every team member contributes to designing, building, and testing features. The range of work you will encounter varies from building intuitive, responsive UIs to designing backend data models, architecting data flows, and beyond. There are no rigid organizational structures, and each team uses processes that work best for its members and projects. Here are a few examples of products in our space: Portfolio Optimizer (PO) is a solution that leverages Mastercard’s data assets and analytics to allow issuers to identify and increase revenue opportunities within their credit and debit portfolios. Audiences uses anonymized and aggregated transaction insights to offer targeting segments that have high likelihood to make purchases within a category to allow for more effective campaign planning and activation. Credit Risk products are a new suite of APIs and tooling to provide lenders real-time access to KPIs and insights serving thousands of clients to make smarter risk decisions using Mastercard data. Help found a new, fast-growing engineering team! 
Position Responsibilities As a Lead Software Engineer, you will: Lead the scoping, design and implementation of complex features Lead and push the boundaries of analytics and powerful, scalable applications Design and implement intuitive, responsive UIs that allow issuers to better understand data and analytics Build and maintain analytics and data models to enable performant and scalable products Ensure a high-quality code base by writing and reviewing performant, well-tested code Mentor junior software engineers and teammates Drive innovative improvements to team development processes Partner with Product Managers and Customer Experience Designers to develop a deep understanding of users and use cases and apply that knowledge to scoping and building new modules and features Collaborate across teams with exceptional peers who are passionate about what they do Ideal Candidate Qualifications 10+ years of engineering experience in an agile production environment. Experience leading the design and implementation of complex features in full-stack applications. Proficiency with object-oriented languages, preferably Java/Spring. Proficiency with modern front-end frameworks, preferably React with Redux, TypeScript. High proficiency in using Python or Scala, Spark, and Hadoop platforms & tools (Hive, Impala, Airflow, NiFi, Sqoop). Fluent in the use of Git and Jenkins. Solid experience with RESTful APIs and JSON/SOAP-based APIs. Solid experience with SQL, multi-threading, and message queuing. Experience in building and deploying production-level data-driven applications and data processing workflows/pipelines and/or implementing machine learning systems at scale in Java, Scala, or Python, delivering analytics across all phases. Desirable Capabilities Hands-on experience with cloud-native development using microservices. Hands-on experience with Kafka and Zookeeper. Knowledge of security concepts and protocols in enterprise applications. 
Expertise with automated E2E and unit testing frameworks. Knowledge of Splunk or other alerting and monitoring solutions. Core Competencies Strong technologist eager to learn new technologies and frameworks. Experience coaching and mentoring junior teammates. Customer-centric development approach Passion for analytical / quantitative problem solving Ability to identify and implement improvements to team development processes Strong collaboration skills with experience collaborating across many people, roles, and geographies Motivation, creativity, self-direction, and desire to thrive on small project teams Superior academic record with a degree in Computer Science or related technical field Strong written and verbal English communication skills #AI3 Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach; and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
Posted 4 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description and Requirements "At BMC trust is not just a word - it's a way of life!" Hybrid We are an award-winning, equal opportunity, culturally diverse, fun place to be. Giving back to the community drives us to be better every single day. Our work environment allows you to balance your priorities, because we know you will bring your best every day. We will champion your wins and shout them from the rooftops. Your peers will inspire, drive, support you, and make you laugh out loud! We help our customers free up time and space to become an Autonomous Digital Enterprise that conquers the opportunities ahead - and are relentless in the pursuit of innovation! The IZOT product line includes BMC’s Intelligent Z Optimization & Transformation products, which help the world’s largest companies to monitor and manage their mainframe systems. The modernization of the mainframe is the beating heart of our product line, and we achieve this goal by developing products that improve the developer experience, mainframe integration, the speed of application development, the quality of the code, and application security, while reducing operational costs and risks. We acquired several companies along the way, and we continue to grow, innovate, and perfect our solutions on an ongoing basis. BMC is looking for a Product Owner to join our amazing team! The BMC AMI Cloud Analytics product can quickly transfer, transform, and integrate mainframe data so it can be shared with the organizational data lake and used by artificial intelligence, machine learning (AI/ML), and analytics solutions. In this role, you will lead the transformation of this cutting-edge product, originally developed by Model9, a startup acquired by BMC, into a solution designed to meet the rigorous demands of enterprise customers. 
This exciting opportunity combines innovation, scalability, and leadership, giving you a chance to shape the product’s evolution as it reaches new heights in enterprise markets. You’ll analyze business opportunities, specify and prioritize customer requirements, and guide product development teams to deliver cutting-edge solutions that resonate with global B2B customers. As a product owner, you will be or become an expert on the product, market, and related business domains. Here is how, through this exciting role, YOU will contribute to BMC's and your own success: Lead the transformation of a startup-level solution from Model9 into a robust enterprise-grade product, addressing the complex needs of global organizations. Collaborate with engineering and QA teams to ensure technical feasibility, resolve roadblocks, and deliver solutions that align with customer needs. Help plan product deliveries, including documenting detailed requirements, scheduling releases, and publishing roadmaps. Maintain a strategic backlog of prioritized features. Drive cross-functional collaboration across development, QA, product management, and support teams to ensure seamless product delivery and customer satisfaction. Distil complex business and technical requirements into clear, concise PRDs and prioritized feature backlogs. To ensure you’re set up for success, you will bring the following skillset & experience: 3+ years of software product owner experience in an enterprise/B2B software company, including experience working with global B2B customers Solid technical background (preferably previous experience as a developer or QA) Deep familiarity with public cloud services and storage services (AWS EC2/FSx/EFS/EBS/S3, RDS, Aurora, etc.) Strong understanding of ETL/ELT solutions and data transformation techniques Knowledge of modern data Lakehouse architectures (e.g., Databricks, Snowflake). B.Sc. 
in a related field (preferably Software Engineering or similar) or equivalent Experience leading new products and product features through ideation, research, planning, development, go-to-market and feedback cycles Fluent English, spoken and written. Willingness to travel, typically 1-2 times a quarter Whilst these are nice to have, our team can help you develop in the following skills: Background as DBA or system engineer with hands-on experience with commercial and open-source databases like MSSQL, Oracle, PostgreSQL, etc. Knowledge / experience of agile methods (especially lean); familiarity with Aha!, Jira, Confluence. Experience with ETL/ELT tools (e.g., Apache NiFi, Qlik, Precisely, Informatica, Talend, AWS Glue, Azure Data Factory). Understanding of programming languages commonly used on z/OS, such as COBOL, PL/I, REXX, and assembler. Understanding of z/OS subsystems such as JES2/JES3, RACF, DB2, CICS, MQ, and IMS. Experience in Cloud-based products and technologies (containerization, serverless approaches, vendor-specific cloud services, cloud security) CA-DNP Our commitment to you! BMC’s culture is built around its people. We have 6000+ brilliant minds working together across the globe. You won’t be known just by your employee number, but for your true authentic self. BMC lets you be YOU! If after reading the above, you’re unsure if you meet the qualifications of this role but are deeply excited about BMC and this team, we still encourage you to apply! We want to attract talents from diverse backgrounds and experience to ensure we face the world together with the best ideas! BMC is committed to equal opportunity employment regardless of race, age, sex, creed, color, religion, citizenship status, sexual orientation, gender, gender expression, gender identity, national origin, disability, marital status, pregnancy, disabled veteran or status as a protected veteran. 
If you need a reasonable accommodation for any part of the application and hiring process, visit the accommodation request page. BMC Software maintains a strict policy of not requesting any form of payment in exchange for employment opportunities, upholding a fair and ethical hiring process.
At BMC we believe in pay transparency and have set the midpoint of the salary band for this role at 2,790,000 INR. Actual salaries depend on a wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The salary listed is just one component of BMC's employee compensation package. Other rewards may include a variable plan and country-specific benefits. We are committed to ensuring that our employees are paid fairly and equitably, and that we are transparent about our compensation practices.
(Returnship@BMC) Had a break in your career? No worries. This role is eligible for candidates who have taken a break in their career and want to re-enter the workforce. If your expertise matches the above job, visit https://bmcrecruit.avature.net/returnship to learn more and find out how to apply.
Posted 4 days ago
5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
EY-Consulting - Data and Analytics – Senior - Clinical Integration Developer
EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.
The opportunity
We’re looking for Clinical Trials Integration Developers with 5+ years of experience in software development within the life sciences domain to support the integration of Medidata’s clinical trial systems across the Client R&D environment. This role offers the chance to build robust, compliant integration solutions, contribute to the design of clinical data workflows, and ensure interoperability across critical clinical applications. You will collaborate closely with business and IT teams, playing a key role in enhancing data flow, supporting trial operations, and driving innovation in clinical research.
Your Key Responsibilities
- Design and implement integration solutions to connect Medidata clinical trial systems with other applications within the clinical data landscape.
- Develop and configure system interfaces using programming languages (e.g., Java, Python, C#) or integration middleware tools (e.g., Informatica, AWS, Apache NiFi).
- Collaborate with clinical business stakeholders and IT teams to gather requirements, define technical specifications, and ensure interoperability.
- Create and maintain integration workflows and data mappings that align with clinical trial data standards (e.g., CDISC, SDTM, ADaM).
- Ensure all development and implementation activities comply with GxP regulations and are aligned with validation best practices.
- Participate in agile development processes, including sprint planning, code reviews, testing, and deployment.
- Troubleshoot and resolve integration-related issues, ensuring stable and accurate data flow across systems.
- Document integration designs, workflows, and technical procedures to support long-term maintainability.
- Contribute to team knowledge sharing and continuous improvement initiatives within the integration space.
Skills And Attributes For Success
- Apply a hands-on, solution-driven approach to implement integration workflows using code or middleware tools within clinical data environments.
- Strong communication and problem-solving skills with the ability to collaborate effectively with both technical and clinical teams.
- Ability to understand and apply clinical data standards and validation requirements when developing system integrations.
To qualify for the role, you must have
- Experience: Minimum 5 years in software development within the life sciences domain, preferably in clinical trial management systems.
- Education: Must be a graduate, preferably BE/B.Tech/BCA/B.Sc IT.
- Technical Skills: Proficiency in programming languages such as Java, Python, or C#, and experience with integration middleware like Informatica, AWS, or Apache NiFi; strong background in API-based system integration.
- Domain Knowledge: Solid understanding of clinical trial data standards (e.g., CDISC, SDTM, ADaM) and data management processes; experience with agile methodologies and GxP-compliant development environments.
- Soft Skills: Strong problem-solving abilities, clear communication, and the ability to work collaboratively with clinical and technical stakeholders.
- Additional Attributes: Capable of implementing integration workflows and mappings, with attention to detail and a focus on delivering compliant and scalable solutions.
Ideally, you’ll also have
- Hands-on experience with ETL tools and clinical data pipeline orchestration frameworks relevant to clinical research.
- Hands-on experience with clinical R&D platforms such as Oracle Clinical, Medidata RAVE, or other EDC systems.
- Proven experience leading small integration teams and engaging with cross-functional stakeholders in regulated (GxP) environments.
What We Look For
- A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment
- An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide
- Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries
What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
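To make the "data mappings aligned with clinical trial data standards" part of this role concrete, here is a minimal, hypothetical sketch of mapping a raw EDC export record into an SDTM-style DM (demographics) row. The field names and mapping rules are illustrative assumptions, not taken from any actual Medidata export or a specific CDISC implementation guide:

```python
# Hypothetical sketch: map a raw EDC subject record to an SDTM-like DM row.
# All field names here are illustrative, not from a real study build.

def map_to_dm(raw: dict, study_id: str) -> dict:
    """Map a raw subject record to an SDTM-style DM domain record."""
    return {
        "STUDYID": study_id,
        "DOMAIN": "DM",
        # USUBJID is conventionally unique across the study: study-site-subject.
        "USUBJID": f"{study_id}-{raw['site']}-{raw['subject']}",
        # Normalize free-text sex values to controlled terminology (U = unknown).
        "SEX": {"male": "M", "female": "F"}.get(raw["sex"].lower(), "U"),
        "AGE": int(raw["age"]),
        "COUNTRY": raw.get("country", "").upper(),
    }

raw_record = {"site": "101", "subject": "0007", "sex": "Female", "age": "34", "country": "in"}
dm_row = map_to_dm(raw_record, "ABC-123")
```

In a real GxP setting this kind of transform would also carry validation, audit logging, and traceability back to the source record; the sketch shows only the mapping step itself.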
Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you
EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 4 days ago
Apache NiFi is a robust data ingestion and integration tool that is widely used in the tech industry. The job market for NiFi professionals in India is currently thriving, with a high demand for skilled individuals who can work with this powerful tool. If you are a job seeker looking to explore opportunities in the NiFi space, this article is for you.
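NiFi flows are assembled from processors in its web UI rather than written as application code, but the core pattern interviewers probe for, ingest then transform then route, can be sketched in plain Python. This is an illustrative analogy only; the processor names in the comments (GetFile, UpdateRecord, RouteOnAttribute) are real NiFi processors, while the functions themselves are not NiFi code:

```python
# Plain-Python analogy of a simple NiFi flow: ingest -> transform -> route.

def ingest(lines):
    """Parse raw CSV-like lines into records (roughly GetFile + a record reader)."""
    return [dict(zip(["id", "status"], line.split(","))) for line in lines]

def transform(records):
    """Normalize a field on each record (roughly an UpdateRecord processor)."""
    return [{**r, "status": r["status"].strip().upper()} for r in records]

def route(records):
    """Split records into success/failure relationships (roughly RouteOnAttribute)."""
    return {
        "success": [r for r in records if r["status"] == "OK"],
        "failure": [r for r in records if r["status"] != "OK"],
    }

flow_input = ["1,ok", "2, failed ", "3,OK"]
result = route(transform(ingest(flow_input)))
```

In NiFi itself, each stage would be a processor on the canvas connected by queues, with back pressure, provenance, and retry handled by the framework rather than by your code.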
Here are the top 5 major cities in India actively hiring for NiFi roles:
- Bangalore
- Pune
- Hyderabad
- Chennai
- Mumbai
The salary range for NiFi professionals in India varies based on experience levels. On average, entry-level NiFi developers can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.
A typical career path in the NiFi space may look like this:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
- Architect
In addition to NiFi expertise, professionals in this field are often expected to have skills in:
- Apache Kafka
- Hadoop
- Spark
- Java
- SQL
Here are 25 interview questions you may encounter when applying for NiFi roles:
As you embark on your journey to explore NiFi jobs in India, remember to prepare thoroughly and showcase your skills confidently during interviews. With the right expertise and determination, you can build a successful career in this dynamic field. Good luck!