
10565 Apache Jobs - Page 7

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

2.0 years

0 Lacs

Rajkot, Gujarat, India

On-site

Responsibilities:
- Set up LAMP environments; host and configure PHP projects
- Troubleshoot network, process, and disk-related issues
- Install and configure Samba and NFS
- Set up routers/modems and Linux firewalls (iptables)
- Configure MySQL replication and clusters
- Install and configure a PXE server for installing Linux over the network
- Produce clear and unambiguous technical documentation and user stories
- Write shell scripts to automate tasks
- Develop spiders using the Scrapy framework (Python) for crawling and scraping the web
- Troubleshoot PHP script issues and support web developers in building websites
- Performance-tune Apache servers
- Test server configurations and websites using tools such as ab and siege

Positions: 01
Required Experience: 6 months to 2 years
Technical Skills: Installation and configuration of Linux servers; troubleshooting network, process, and disk-related issues.

Apply Now
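The ab/siege testing requirement above usually ends up scripted: capture ApacheBench's summary report and pull out the throughput figure so different server configurations can be compared. A minimal sketch; the report text is ab's standard output format, and the wrapper function is illustrative rather than anything the listing prescribes.

```python
import re

# ApacheBench ("ab") prints a summary report to stdout. A wrapper script can
# capture that output (e.g. via subprocess.run) and extract the headline
# throughput figure for comparison across server configurations.
def parse_requests_per_second(ab_output: str) -> float:
    """Extract the mean requests/sec from an ab summary report."""
    match = re.search(r"Requests per second:\s+([\d.]+)", ab_output)
    if match is None:
        raise ValueError("no 'Requests per second' line found in ab output")
    return float(match.group(1))

# Sample of ab's standard report format, trimmed for brevity.
sample_report = """
Concurrency Level:      10
Time taken for tests:   8.017 seconds
Complete requests:      10000
Requests per second:    1247.32 [#/sec] (mean)
Time per request:       8.017 [ms] (mean)
"""

print(parse_requests_per_second(sample_report))  # 1247.32
```

The same parse can be run over siege output by swapping the regex for siege's "Transaction rate" line.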

Posted 2 days ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Applications Development Programmer Analyst is an intermediate-level position responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements
- Identify and analyze issues, make recommendations, and implement solutions
- Utilize knowledge of business processes, system processes, and industry standards to solve complex issues
- Analyze information and make evaluative judgements to recommend solutions and improvements
- Conduct testing and debugging, utilize script tools, and write basic code for design specifications
- Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures
- Develop working knowledge of Citi's information systems, procedures, standards, client-server application development, network operations, database administration, systems administration, data center operations, and PC-based applications
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations; adhering to Policy; applying sound ethical judgment regarding personal behavior, conduct, and business practices; and escalating, managing, and reporting control issues with transparency

Qualifications:
- 2-4 years of software development experience
- Analyze and extract data from Oracle databases using SQL and PL/SQL scripts
- Support the development and execution of data migration jobs from Oracle to Snowflake
- Write and optimize SQL queries for data transformation, cleansing, and validation
- Work with ETL tools such as Apache Spark, Python, and Ab Initio
- Collaborate with senior developers to ensure data quality, integrity, and reconciliation
- Participate in testing phases, including data validation and post-migration QA
- Document data mapping, transformation logic, and workflows
- Support performance tuning and troubleshooting of migration jobs
- Participate in code reviews, follow established development practices, and support application deployments using CI/CD tools
- Contribute to documentation and system specifications
- Collaborate in Agile rituals, code reviews, and team demos
- Test-driven development and automated testing tools such as JUnit and Cucumber/Jasmine; JIRA, Gradle, Sonar
- Work with cloud platforms for deployment and AI-based engineering tools

Education: Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
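The Oracle-to-Snowflake reconciliation work this role describes can be sketched as a row-count comparison between source and target. sqlite3 stands in for both databases here purely so the example is self-contained; a real job would use the Oracle and Snowflake DB-API drivers, and the table and column names are invented.

```python
import sqlite3

# Hypothetical reconciliation step for a data migration: compare row counts
# between a source table and its migrated copy. sqlite3 stands in for the
# Oracle and Snowflake connections a real job would use.
def reconcile_row_counts(src_conn, tgt_conn, table: str) -> dict:
    src_count = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt_count = tgt_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return {"table": table, "source": src_count, "target": tgt_count,
            "match": src_count == tgt_count}

# Build two toy databases to exercise the check.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for conn in (source, target):
    conn.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
source.executemany("INSERT INTO accounts VALUES (?, ?)",
                   [(1, 10.0), (2, 20.0), (3, 30.0)])
target.executemany("INSERT INTO accounts VALUES (?, ?)",
                   [(1, 10.0), (2, 20.0)])  # one row lost in migration

result = reconcile_row_counts(source, target, "accounts")
print(result)  # {'table': 'accounts', 'source': 3, 'target': 2, 'match': False}
```

Production reconciliation would go beyond counts (checksums, column-level aggregates), but the shape of the check is the same.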

Posted 2 days ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Applications Development Programmer Analyst is an intermediate-level position responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
- Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements
- Identify and analyze issues, make recommendations, and implement solutions
- Utilize knowledge of business processes, system processes, and industry standards to solve complex issues
- Analyze information and make evaluative judgements to recommend solutions and improvements
- Conduct testing and debugging, utilize script tools, and write basic code for design specifications
- Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures
- Develop working knowledge of Citi's information systems, procedures, standards, client-server application development, network operations, database administration, systems administration, data center operations, and PC-based applications
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations; adhering to Policy; applying sound ethical judgment regarding personal behavior, conduct, and business practices; and escalating, managing, and reporting control issues with transparency

Qualifications:
- 2-4 years of software development experience
- Develop and maintain banking applications using Java, Spring Boot, Angular, and JPA frameworks
- Work with databases (Oracle, MongoDB) for transaction processing, or Snowflake for analytical data
- Experience migrating ETL/ELT jobs from Ab Initio to Apache Spark
- Participate in code reviews, follow established development practices, and support application deployments using CI/CD tools
- Strong experience writing and optimizing SQL or PL/SQL queries
- Learn and implement TDD practices under senior developer guidance
- Contribute to documentation and system specifications
- Collaborate in Agile rituals, code reviews, and team demos
- Test-driven development and automated testing tools such as JUnit and Cucumber/Jasmine; JIRA, Gradle, Sonar
- Work with cloud platforms for deployment and AI-based engineering tools

Education: Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
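The data transformation, cleansing, and validation work mentioned in this listing can be illustrated with a minimal cleansing pass in plain Python. The field names and rules are hypothetical; a real migration would express equivalent logic as a Spark job or SQL.

```python
# Hedged sketch of a cleansing/validation step: trim whitespace, require
# non-empty key fields, and split input rows into clean rows and rejects
# so the reject set can be reconciled later. Field names are illustrative.
def cleanse(rows, required=("account_id", "amount")):
    clean, rejects = [], []
    for row in rows:
        # Normalize: strip stray whitespace from string values.
        fixed = {k: v.strip() if isinstance(v, str) else v
                 for k, v in row.items()}
        # Validate: every required field must be present and non-empty.
        if all(fixed.get(field) not in (None, "") for field in required):
            clean.append(fixed)
        else:
            rejects.append(fixed)
    return clean, rejects

raw = [
    {"account_id": " A1 ", "amount": 100},
    {"account_id": "", "amount": 50},      # rejected: empty key
    {"account_id": "A3", "amount": None},  # rejected: missing amount
]
clean, rejects = cleanse(raw)
print(len(clean), len(rejects))  # 1 2
print(clean[0]["account_id"])    # A1
```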

Posted 2 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

hackajob is collaborating with Verisk to connect them with exceptional tech professionals for this role.

Responsibilities:
- Provide leadership, mentorship, and guidance to business analysts and QA team members on manual and automated testing
- Collaborate with product owners and business analysts to ensure user stories are well-defined, testable, and include measurable acceptance criteria
- Identify edge cases, risks, and requirement gaps early in the planning process to strengthen story quality and test coverage
- Participate in Agile ceremonies and sprint planning to advocate for quality throughout the development lifecycle
- Define and execute test strategies, including manual test cases, automated scripts, and scenarios for web and API testing
- Develop and maintain automated test suites and ensure effective integration into the CI/CD pipeline
- Identify, track, and ensure timely resolution of defects, including root cause analysis and process improvement
- Continuously enhance QA processes, tools, and standards to improve efficiency and product quality
- Collaborate across QA, development, and product teams to align on quality goals, timelines, and delivery expectations
- Support User Acceptance Testing (UAT) and incorporate customer feedback to ensure a high-quality release
- Ensure the final product meets user expectations for functionality, performance, and usability

Qualifications:
- Bachelor's degree in computer science or a related field, or equivalent practical experience
- 6+ years of proven experience in the software development industry, working in collaborative team environments
- 6+ years of experience using automation tools such as Selenium WebDriver with programming languages like Python, C#, or Java
- 5+ years of hands-on experience testing and automating web services, including RESTful APIs
- 3+ years of experience in performance testing using tools such as Apache JMeter
- 2+ years of experience in mobile web application testing automation using Appium
- Strong experience with object-oriented programming languages such as Java and C#/.NET

Good to Have:
- Experience working with CI/CD technologies such as Bamboo, Bitbucket, Octopus Deploy, and Maven
- Working knowledge of API testing tools such as RestAssured and JavaScript-based frameworks
- Solid understanding of software engineering best practices across the full software development lifecycle, including coding standards, code reviews, source control, build processes, testing, and operations
- Experience or familiarity with AWS cloud services
- Strong written and verbal communication skills
- Proven ability to learn new technologies and adapt in a dynamic environment
- Familiarity with Atlassian tools, including Jira and Confluence
- Working knowledge of Agile methodologies, particularly Scrum
- Experience operating in a Continuous Integration (CI) environment

About Us
For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed. For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, the fourth consecutive year in the UK, Spain, and India, and the second consecutive year in Poland. We value learning, caring, and results, and make inclusivity and diversity a top priority. In addition to our Great Place to Work® Certification, we've been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World's Best Employer and Best Employer for Women, testaments to the value we place on workplace culture. We're 7,000 people strong. We relentlessly and ethically pursue innovation.
And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations.

Verisk Businesses
- Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision
- Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences
- Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient
- Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events
- Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance
- Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement
- Life Insurance Solutions — offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities, for both individual and group
- Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, business, and societies become stronger

Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age, or disability. Verisk's minimum hiring age is 18, except in countries with a higher age limit, subject to applicable law. Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume.
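The API-testing experience this role asks for (RESTful services, RestAssured-style assertions) can be sketched as a response-body validator. The payload shape and field names are invented for illustration; a real suite would make the HTTP call first and feed the decoded body into a check like this.

```python
import json

# Sketch of an API-level check: decode a JSON response body and assert that
# required fields exist with the expected types. The schema below is a made-up
# example, not any real Verisk API.
EXPECTED_FIELDS = {"id": int, "status": str, "premium": float}

def validate_policy_payload(body: str) -> list:
    """Return a list of violations; an empty list means the body passes."""
    payload = json.loads(body)
    errors = []
    for field, expected_type in EXPECTED_FIELDS.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}, "
                          f"got {type(payload[field]).__name__}")
    return errors

good = json.dumps({"id": 42, "status": "active", "premium": 129.5})
bad = json.dumps({"id": "42", "status": "active"})
print(validate_policy_payload(good))  # []
print(validate_policy_payload(bad))
# ['id: expected int, got str', 'missing field: premium']
```

The same validator drops straight into a pytest or unittest suite as the assertion step after the request is issued.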

Posted 2 days ago

Apply

2.0 - 10.0 years

0 Lacs

India

Remote

Pay Range: ₹400-500/hour
Location: Remote (India)
Mode: One-to-one sessions only (no batch teaching)

We are hiring a part-time PySpark/Databricks tutor who can deliver personalized, one-on-one online sessions to college- and university-level students. The ideal candidate should have hands-on experience in big data technologies, particularly PySpark and Databricks, and should be comfortable teaching tools and techniques commonly used in computer science and data engineering.

Key Responsibilities:
- Deliver engaging one-to-one remote tutoring sessions focused on PySpark, Apache Spark, Databricks, and related tools
- Teach practical use cases, project implementation techniques, and hands-on coding for real-world applications
- Adapt teaching style to individual student levels, from beginner to advanced
- Provide support with assignments, project work, and interview preparation
- Ensure clarity in communication and foster an interactive learning environment

Required Skills & Qualifications:
- Experience: 2 to 10 years in big data, data engineering, or related roles using PySpark and Databricks
- Education: Bachelor's or Master's degree in Computer Science, Data Science, or a relevant field
- Strong English communication skills, both verbal and written
- Familiarity with Spark SQL, Delta Lake, notebooks, and data pipelines
- Ability to teach technical concepts with simplicity and clarity

Job Requirements:
- Freshers with strong knowledge and teaching ability may also apply
- Must have a personal laptop and a stable Wi-Fi connection
- Must be serious and committed to long-term part-time work
- Candidates who have applied before should not reapply

Note: This is a remote, part-time opportunity, and sessions will be conducted one-to-one, not in batch format. This role is ideal for professionals, freelancers, or educators passionate about sharing knowledge. Apply only if you agree with the pay rate (₹400-500/hr) and meet the listed criteria.
Let’s inspire the next generation of data engineers!
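For a sense of what a first one-to-one session might cover, the classic starter exercise is word count, shown here with Python builtins and, in comments, the equivalent PySpark chain (a live Spark session is assumed to be unavailable in this sketch).

```python
from collections import Counter

# Teaching sketch: word count in plain Python, mirroring the flatMap ->
# map -> reduceByKey shape a PySpark version would take.
lines = [
    "spark makes big data simple",
    "databricks runs spark in the cloud",
]

words = (word for line in lines for word in line.split())
counts = Counter(words)
print(counts["spark"])  # 2

# Equivalent PySpark chain (not executed here; assumes a SparkContext `sc`):
# sc.parallelize(lines) \
#   .flatMap(lambda line: line.split()) \
#   .map(lambda w: (w, 1)) \
#   .reduceByKey(lambda a, b: a + b) \
#   .collect()
```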

Posted 2 days ago

Apply

4.0 years

0 Lacs

Andaman and Nicobar Islands, India

On-site

Rockwell Automation is a global technology leader focused on helping the world's manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that's you, we would love to have you join us!

Job Description
Analyze business problems to be solved with automated systems. Provide technical expertise in identifying systems that are cost-effective and meet user requirements. Configure system settings and options; plan and build unit, integration, and acceptance testing; and create specifications for systems to meet our requirements. Design details of automated systems. Provide consultation to users around automated systems. You will report to the Engineering Manager IT and work in a hybrid capacity from our Hinjewadi-Pune, Noida, or Bangalore, India office.

Your Responsibilities
- Analyse existing data from SAP and extract insights to support smart decisions, working with developers on the e-commerce project
- Prepare new products for SAP by establishing linkages to taxonomy, classification systems, images, documentation, and drawings
- Publish new products to the online catalogue
- Monitor SAP data quality and completeness
- Maintain SAP data
- Implement the SAP translation process and SAP enrichment/improvement projects
- Monitor and support data integrations

The Essentials - You Will Have
- Bachelor's degree in computer science, management information systems, engineering, or a related field
- 4+ years of experience with data setup
- Experience with in-memory databases/caches such as Redis/Ehcache
- 1+ years of experience with integration platforms such as MuleSoft
- Experience with DevOps tooling and scripting
- Experience with database modeling for performance and scaling applications
- Experience with front-end UI frameworks such as Angular/React
- Experience with PRODUCT, CUSTOMER, and Commerce frameworks and applications
- Experience with event-driven/stream processing. It need not be with Kafka, but the ability to process data changes in near real time is critical for this capability. Some examples: RabbitMQ, Apache Pulsar, Google Pub/Sub
- From an integration standpoint, experience with event-driven architecture and message queuing

The Preferred - You Might Also Have
- Working knowledge of a broad range of industrial automation products
- Work with new technologies and changing requirements
- Work with multiple partners and influence project decisions

Temperament
- Assist colleagues through change and support change management processes
- Adapt to competing demands

IPC - Information Processing Capability (Factors of Complexity)
- Work on issues of moderate scope where analysis of situations or data requires a review of relevant factors
- Exercise judgement within defined practices to determine appropriate action
- Apply process improvements to facilitate improved outcomes
- Implement processes across business/function to achieve assigned goals
- Distil information from different data sources, tell the "story" behind it, and recommend next steps
- Accept role requirements

What We Offer
Our benefits package includes:
- Comprehensive mindfulness programs with a premium membership to Calm
- Volunteer paid time off, available after 6 months of employment for eligible employees
- Company volunteer and donation matching program: your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation
- Employee Assistance Program
- Personalized wellbeing programs through our OnTrack program
- On-demand digital course library for professional development, and other local benefits!

At Rockwell Automation we are dedicated to building a diverse, inclusive, and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Rockwell Automation's hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
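The event-driven/stream-processing requirement (RabbitMQ, Apache Pulsar, Google Pub/Sub) boils down to producers and consumers decoupled by a queue. A minimal in-process sketch, with `queue.Queue` standing in for the broker and invented SAP-style product-change events:

```python
import queue
import threading

# Minimal message-queuing sketch: a producer publishes product-change events,
# a consumer processes them asynchronously. queue.Queue stands in for a real
# broker such as RabbitMQ or Pulsar; the event fields are illustrative.
events = queue.Queue()
processed = []

def consumer():
    while True:
        event = events.get()
        if event is None:  # sentinel: shut down the worker
            break
        processed.append(f"indexed {event['sku']}")
        events.task_done()

worker = threading.Thread(target=consumer)
worker.start()

# Producer side: publish change events in near real time.
for sku in ("750-AB", "750-CD"):
    events.put({"sku": sku, "change": "price_update"})
events.put(None)
worker.join()
print(processed)  # ['indexed 750-AB', 'indexed 750-CD']
```

The design point is the decoupling: the producer never waits on the catalogue indexer, which is what lets downstream consumers scale independently.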

Posted 2 days ago

Apply

4.0 years

0 Lacs

Delhi, India

On-site

Rockwell Automation is a global technology leader focused on helping the world's manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that's you, we would love to have you join us!

Job Description
Analyze business problems to be solved with automated systems. Provide technical expertise in identifying systems that are cost-effective and meet user requirements. Configure system settings and options; plan and build unit, integration, and acceptance testing; and create specifications for systems to meet our requirements. Design details of automated systems. Provide consultation to users around automated systems. You will report to the Engineering Manager IT and work in a hybrid capacity from our Hinjewadi-Pune, Noida, or Bangalore, India office.

Your Responsibilities
- Analyse existing data from SAP and extract insights to support smart decisions, working with developers on the e-commerce project
- Prepare new products for SAP by establishing linkages to taxonomy, classification systems, images, documentation, and drawings
- Publish new products to the online catalogue
- Monitor SAP data quality and completeness
- Maintain SAP data
- Implement the SAP translation process and SAP enrichment/improvement projects
- Monitor and support data integrations

The Essentials - You Will Have
- Bachelor's degree in computer science, management information systems, engineering, or a related field
- 4+ years of experience with data setup
- Experience with in-memory databases/caches such as Redis/Ehcache
- 1+ years of experience with integration platforms such as MuleSoft
- Experience with DevOps tooling and scripting
- Experience with database modeling for performance and scaling applications
- Experience with front-end UI frameworks such as Angular/React
- Experience with PRODUCT, CUSTOMER, and Commerce frameworks and applications
- Experience with event-driven/stream processing. It need not be with Kafka, but the ability to process data changes in near real time is critical for this capability. Some examples: RabbitMQ, Apache Pulsar, Google Pub/Sub
- From an integration standpoint, experience with event-driven architecture and message queuing

The Preferred - You Might Also Have
- Working knowledge of a broad range of industrial automation products
- Work with new technologies and changing requirements
- Work with multiple partners and influence project decisions

Temperament
- Assist colleagues through change and support change management processes
- Adapt to competing demands

IPC - Information Processing Capability (Factors of Complexity)
- Work on issues of moderate scope where analysis of situations or data requires a review of relevant factors
- Exercise judgement within defined practices to determine appropriate action
- Apply process improvements to facilitate improved outcomes
- Implement processes across business/function to achieve assigned goals
- Distil information from different data sources, tell the "story" behind it, and recommend next steps
- Accept role requirements

What We Offer
Our benefits package includes:
- Comprehensive mindfulness programs with a premium membership to Calm
- Volunteer paid time off, available after 6 months of employment for eligible employees
- Company volunteer and donation matching program: your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation
- Employee Assistance Program
- Personalized wellbeing programs through our OnTrack program
- On-demand digital course library for professional development, and other local benefits!

At Rockwell Automation we are dedicated to building a diverse, inclusive, and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Rockwell Automation's hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission, to serve patients living with serious illnesses, drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Data Engineer

What You Will Do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member that assists in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications and Experience:
- Master's degree / Bachelor's degree and 5 to 9 years of Computer Science, IT, or related field experience

Must-Have Skills:
- Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, Spark SQL); workflow orchestration; performance tuning of big data processing
- Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Good-to-Have Skills:
- Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 2 days ago

Apply

6.0 years

0 Lacs

India

Remote

Real people. Real service. At SupplyHouse.com, we value every individual team member and cultivate a community where people come first. Led by our core values of Generosity, Respect, Innovation, Teamwork, and GRIT, we’re dedicated to maintaining a supportive work environment that celebrates diversity and empowers everyone to reach their full potential. As an industry-leading e-commerce company specializing in HVAC, plumbing, heating, and electrical supplies since 2004, we strive to foster growth while providing the best possible experience for our customers.

Through an Employer of Record (EOR), we are looking for a new Sr. Software Development Engineer in India to join our growing IT Team. This individual will report into our Full Stack Team Lead and have the opportunity to work on impactful projects that enhance our e-commerce platform and internal operations, while honing their skills in backend and full stack development. If you’re passionate about creating user-friendly interfaces, building scalable systems, and contributing to innovative solutions in a collaborative and fun environment, we’d love to hear from you!

Role Type: Full-Time
Location: Remote from India
Schedule: Monday through Friday with a minimum schedule overlap of 4-5 hours per day with 8:00 a.m. to 5:00 p.m. U.S. 
Eastern Time to ensure effective collaboration
Base Salary: $35,000 - $40,000 USD

Responsibilities:
- Participate in all phases of software development: requirements, design, construction, testing, deployment, and maintenance
- Design and develop reliable and scalable distributed systems
- Ensure system reliability, optimized performance, and compliance with security policies
- Ensure industry-standard development best practices are observed, including accessibility and privacy compliance
- Build reusable code and libraries for future use
- Assess the technical feasibility of UI/UX designs and partner with business analysts to refine project requirements
- Collaborate with project owners and development teams to ensure implementation and designs are in sync, and to deliver client-facing products
- Evaluate technical designs and conduct code reviews
- Serve as a mentor to junior team members
- Review requests to address features/issues submitted by various internal departments, and provide solutions and estimates for such requests
- Maintain current technical knowledge to support rapidly changing technology, constantly looking for modern technologies and working with the team to introduce them

Requirements:
- Bachelor’s degree or foreign equivalent in Computer Science, Engineering, Information Technology, or a related field and 6+ years of progressive experience. Alternatively, a Master's degree or foreign equivalent in Computer Science, Engineering, Information Technology, or a related field and 4+ years of progressive experience. 
- 4+ years of professional experience with Java frameworks such as Spring, Struts, and Hibernate
- 4+ years of professional experience with relational databases (MySQL and/or Oracle)
- Deep understanding of data structures, algorithms, and system design
- Experience making complex backend architecture design choices
- Passion for web technologies and keeping up to date with new tools and techniques
- Experience with the full Software Development Lifecycle: frontend and backend web application development, implementing business logic, and developing user interfaces
- Ability to work with minimal technical supervision and supplemental engineering support, while responding efficiently to multiple program priorities
- Linux, Shell, and Perl scripting; writing and executing UNIX commands; and utilizing Linux servers to debug, deploy code, and install and monitor software used in testing and production environments
- Experience with web servers, including Apache and Nginx

Why work with us:
We have awesome benefits – We offer a wide variety of benefits to help support you and your loved ones. These include: comprehensive and affordable medical, dental, vision, and life insurance options; competitive Provident Fund contributions; paid time off and holidays; mental health support and wellbeing program; company-provided equipment and a one-time $250 USD work-from-home stipend; $750 USD annual professional development budget; company rewards and recognition program; and more!
We promote work-life balance – We value your time and encourage a healthy separation between your professional and personal life to feel refreshed and recharged. Look out for our wellness initiatives!
We support growth – We strive to innovate every day. In an exciting and evolving industry, we provide potential for career growth through our hands-on training, diversity and inclusion initiatives, opportunities for internal mobility, and professional development budget. 
We give back – We live and breathe our core value, Generosity, by giving back to the trades and organizations around the world. We make a difference through donation drives, employee-nominated contributions, support for DE&I organizations, and more.
We listen – We value hearing from our employees. Everyone has a voice, and we encourage you to use it! We actively elicit feedback through our monthly town halls, regular 1:1 check-ins, and company-wide ideas form to incorporate suggestions and ensure our team enjoys coming to work every day.

Check us out and learn more at https://www.supplyhouse.com/our-company!

Additional Details: Remote employees are expected to work in a distraction-free environment. Personal devices, background noise, and other distractions should be kept to a minimum to avoid disrupting virtual meetings or business operations. SupplyHouse.com is an Equal Opportunity Employer, strongly values inclusion, and encourages individuals of all backgrounds and experiences to apply for this position. To ensure fairness, all application materials, assessments, and interview responses must reflect your own original work. The use of AI tools, plagiarism, or any uncredited assistance is not permitted at any stage of the hiring process and may result in disqualification. We appreciate your honesty and look forward to seeing your skills. We are committed to providing a safe and secure work environment and conduct thorough background checks on all potential employees in accordance with applicable laws and regulations. All emails from the SupplyHouse team will only be sent from an @supplyhouse.com email address. Please exercise caution if you receive an email from an alternate domain.

Posted 2 days ago

Apply

8.0 years

20 - 40 Lacs

India

On-site

Role: Senior Graph Data Engineer (Neo4j & AI Knowledge Graphs)
Experience: 8+ years
Type: Contract

We’re hiring a Graph Data Engineer to design and implement advanced Neo4j-powered knowledge graph systems for our next-gen AI platform. You'll work at the intersection of data engineering, AI/ML, and financial services, helping build the graph infrastructure that powers semantic search, investment intelligence, and automated compliance for venture capital and private equity clients. This role is ideal for engineers who are passionate about graph data modeling, Neo4j performance, and enabling AI-enhanced analytics through structured relationships.

What You'll Do
- Design Knowledge Graphs: Build and maintain Neo4j graph schemas modeling complex fund administration relationships — investors, funds, companies, transactions, legal docs, etc.
- Graph-AI Integration: Work with GenAI teams to power RAG systems, semantic search, and graph-enhanced NLP pipelines.
- ETL & Data Pipelines: Develop scalable ingestion pipelines from sources like FundPanel.io, legal documents, and external market feeds using Python, Spark, or Kafka.
- Optimize Graph Performance: Craft high-performance Cypher queries, leverage APOC procedures, and tune for real-time analytics.
- Graph Algorithms & Analytics: Implement algorithms for fraud detection, relationship scoring, compliance, and investment pattern analysis.
- Secure & Scalable Deployment: Implement clustering, backups, and role-based access on Neo4j Aura or containerized environments.
- Collaborate Deeply: Partner with AI/ML, DevOps, data architects, and business stakeholders to translate use cases into scalable graph solutions.

What You Bring
- 7+ years in software/data engineering; 2+ years in Neo4j and Cypher.
- Strong experience in graph modeling, knowledge graphs, and ontologies.
- Proficiency in Python, Java, or Scala for graph integrations.
- Experience with graph algorithms (PageRank, community detection, etc.). 
- Hands-on with ETL pipelines, Kafka/Spark, and real-time data ingestion.
- Cloud-native experience (Neo4j Aura, Azure, Docker/K8s).
- Familiarity with fund structures, LP/GP models, or financial/legal data a plus.
- Strong understanding of AI/ML pipelines, especially graph-RAG and embeddings.

Use Cases You'll Help Build
- AI Semantic Search over fund documents and investment entities.
- Investment Network Analysis for GPs, LPs, and portfolio companies.
- Compliance Graphs modeling fund terms and regulatory checks.
- Document Graphs linking LPAs, contracts, and agreements.
- Predictive Investment Models enhanced by graph relationships.

Skills: Java, machine learning, Apache Spark, Neo4j (Cypher, APOC procedures), Neo4j Aura, AI knowledge graphs, Azure, cloud-native technologies, AI/ML pipelines, Scala, Python, graph data modeling, semantic search, ETL pipelines, data engineering, Kafka, Kafka Streams, graph algorithms
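To give candidates a feel for the graph-algorithm work this role mentions, here is a toy power-iteration PageRank in plain Python. A production system would run this through Neo4j's Graph Data Science library at scale; the fund/LP entity names below are invented for illustration.

```python
# Illustrative only: tiny power-iteration PageRank over an adjacency dict.
def pagerank(graph, damping=0.85, iterations=50):
    """graph: {node: [out_neighbors]}; returns {node: score}."""
    nodes = set(graph) | {n for outs in graph.values() for n in outs}
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, outs in graph.items():
            if not outs:
                continue
            share = damping * rank[node] / len(outs)
            for neighbor in outs:
                new_rank[neighbor] += share
        rank = new_rank
    return rank

# Hypothetical fund network: LPs invest in funds, funds back a company.
edges = {
    "lp_alpha": ["fund_1"],
    "lp_beta": ["fund_1", "fund_2"],
    "fund_1": ["acme_co"],
    "fund_2": ["acme_co"],
}
scores = pagerank(edges)
print(max(scores, key=scores.get))  # the most-invested-in node ranks highest
```

The same relationship-scoring idea underlies the "Investment Network Analysis" use case: rank flows along investment edges, so heavily backed entities surface first.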

Posted 2 days ago

Apply

40.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Amgen
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what's known today.

About The Role
Role Description: We are looking for an Associate Data Engineer with deep expertise in writing data pipelines to build scalable, high-performance data solutions. The ideal candidate will be responsible for developing, optimizing and maintaining complex data pipelines, integration frameworks, and metadata-driven architectures that enable seamless access and analytics. This role requires a deep understanding of big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management. 
Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member that assists in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation

What We Expect From You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Bachelor’s degree and 2 to 4 years of Computer Science, IT or related field experience OR Diploma and 4 to 7 years of Computer Science, IT or related field experience

Preferred Qualifications:
Functional Skills:
Must-Have Skills:
- Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), AWS, Redshift, Snowflake, workflow orchestration, and performance tuning on big data processing
- Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools
- Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores
- Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development

Good-to-Have Skills:
- Experience with data modeling and performance tuning on relational and graph databases (e.g., MarkLogic, AllegroGraph, Stardog, RDF triplestores)
- Understanding of data modeling, data warehousing, and data integration concepts
- Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
- Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing

Professional Certifications:
- AWS Certified Data Engineer preferred
- Databricks certification preferred

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting

As an Associate Data Engineer at Amgen, you will be involved in the development and maintenance of data infrastructure and solutions. You will collaborate with a team of data engineers to design and implement data pipelines, perform data analysis, and ensure data quality. Your strong technical skills, problem-solving abilities, and attention to detail will contribute to the effective management and utilization of data for insights and decision-making.
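As an illustration of the SQL extraction-and-analysis skills this posting asks for, here is a minimal sketch using Python's built-in sqlite3 module; the table and column names are invented for the example, and a real pipeline would target an enterprise store such as Redshift or Snowflake.

```python
# Minimal extract-transform-analyze sketch against an in-memory SQLite DB.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lab_results (patient_id TEXT, assay TEXT, value REAL)")
conn.executemany(
    "INSERT INTO lab_results VALUES (?, ?, ?)",
    [("p1", "glucose", 5.4), ("p1", "glucose", 6.1), ("p2", "glucose", 4.9)],
)
# Aggregate per patient -- the kind of transformation a pipeline stage performs.
rows = conn.execute(
    """SELECT patient_id, COUNT(*) AS n, ROUND(AVG(value), 2) AS avg_value
       FROM lab_results GROUP BY patient_id ORDER BY patient_id"""
).fetchall()
print(rows)  # [('p1', 2, 5.75), ('p2', 1, 4.9)]
conn.close()
```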

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About The Role Grade Level (for internal use): 09

The Team
The current team is composed of highly skilled engineers with solid development backgrounds who build and manage tier-0 platforms in AWS cloud environments. In this role, you will play a pivotal part in shaping the platform architecture and engineering. Additional tasks include exploring innovative new tools that benefit the organization's needs, developing services and tools around the platform, establishing standards, creating reference implementations, and providing support for application integrations on a need basis.

The Impact
This role is instrumental in constructing and maintaining dependable production systems within cloud environments. The team bears the crucial responsibility for ensuring high availability, minimizing latency, optimizing performance, enhancing efficiency, overseeing change management, implementing robust monitoring practices, responding to emergencies, and strategically planning for capacity. The impact of this team is pivotal for the organization, given its extensive application portfolio, necessitating a steadfast commitment to achieving and maintaining 99.9% uptime, thus ensuring the reliability and stability of the firm's digital infrastructure.

What’s In It For You
S&P Global is an employee-friendly company with various benefits and a primary focus on skill development. The technology division has a wide variety of yearly goals that help employees train and certify in niche technologies like: Generative AI, transformation of applications to CaaS, CI/CD/CD gold transformation, cloud modernization, and leadership skills and business knowledge training.

Essential Duties & Responsibilities
- As part of a global team of engineers, deliver highly reliable technology products.
- Strong focus on developing robust solutions meeting high security standards.
- Build and maintain new applications/platforms for growing business needs. 
- Design and build future-state architecture to support new use cases.
- Ensure scalable and reusable architecture as well as code quality.
- Integrate new use cases and work with global teams.
- Work with/support users to understand issues, develop root cause analysis, and work with the product team on the development of enhancements/fixes.
- Become an integral part of a high-performing global network of engineers/developers working from Colorado, New York, and India to help ensure 24x7 reliability for critical business applications.
- As part of a global team of engineers/developers, deliver continuous high reliability to our technology services.
- Strong focus on developing permanent fixes to issues and heavy automation of manual tasks.
- Provide technical guidance to junior-level resources.
- Work on analyzing/researching alternative solutions and developing/implementing recommendations accordingly.

Qualifications
Required:
- Bachelor / MS degree in Computer Science, Engineering or a related subject
- Good written and oral communication skills
- Must have 3+ years of working experience in Java with Spring technology
- Must have API development experience
- Work experience with asynchronous/synchronous messaging using MQ, etc.
- Ability to use CI/CD flows and distribution pipelines to deploy applications
- Working experience with DevOps tools such as Git, Azure DevOps, Jenkins, Maven
- Solid understanding of Cloud technologies and managing infrastructures
- Experience in developing, deploying & debugging cloud applications
- Strong knowledge of functional programming, Linux, etc.

Nice To Have
- Experience in building single-page applications with Angular or ReactJS in conjunction with Python scripting. 
- Working experience with API Gateway, Apache and Tomcat servers, Helm, Ansible, Terraform, CI/CD, Azure DevOps, Jenkins, Git, Splunk, Grafana, Prometheus, Jaeger (or other OTEL products), Flux, LDAP, OKTA, Confluent Platform, ActiveMQ, AWS, Kubernetes

Location: Hyderabad, India
Hybrid model: twice-a-week work from office is mandatory.
Shift time: 12 pm to 9 pm IST.

About S&P Global Ratings
At S&P Global Ratings, our analyst-driven credit ratings, research, and sustainable finance opinions provide critical insights that are essential to translating complexity into clarity so market participants can uncover opportunities and make decisions with conviction. By bringing transparency to the market through high-quality independent opinions on creditworthiness, we enable growth across a wide variety of organizations, including businesses, governments, and institutions. S&P Global Ratings is a division of S&P Global (NYSE: SPGI). S&P Global is the world’s foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world’s leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit www.spglobal.com/ratings

What’s In It For You?
Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. 
Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. 
S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. S&P Global has a Securities Disclosure and Trading Policy (“the Policy”) that seeks to mitigate conflicts of interest by monitoring and placing restrictions on personal securities holding and trading. The Policy is designed to promote compliance with global regulations. In some Divisions, pursuant to the Policy’s requirements, candidates at S&P Global may be asked to disclose securities holdings. Some roles may include a trading prohibition and remediation of positions when there is an effective or potential conflict of interest. Employment at S&P Global is contingent upon compliance with the Policy. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. 
Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 311026 Posted On: 2025-07-30 Location: Hyderabad, Telangana, India
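The messaging and reliability requirements in this posting (MQ consumption, 99.9% uptime) often come down to patterns like retry with exponential backoff. Below is a hedged, language-agnostic sketch of that pattern in Python; the flaky handler is simulated and the delay values are arbitrary, so the example runs instantly.

```python
# Retry-with-exponential-backoff sketch for message processing.
import time

def process_with_retry(handler, message, max_attempts=4, base_delay=0.5, sleep=time.sleep):
    """Call handler(message); on failure wait base_delay * 2**attempt and retry."""
    for attempt in range(max_attempts):
        try:
            return handler(message)
        except Exception:
            if attempt == max_attempts - 1:
                raise  # exhausted: surface the error to the caller / DLQ
            sleep(base_delay * (2 ** attempt))

# Simulated flaky handler: fails twice, then succeeds.
calls = {"n": 0}
def flaky(msg):
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("broker timeout")
    return f"ack:{msg}"

delays = []  # capture the sleeps instead of actually waiting
result = process_with_retry(flaky, "order-42", sleep=delays.append)
print(result, delays)  # ack:order-42 [0.5, 1.0]
```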

Posted 2 days ago

Apply

2.0 - 4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary
As a Data Analyst, you will be responsible for designing, developing, and maintaining efficient and scalable data pipelines for data ingestion, transformation, and storage.

About The Role
Location – Hyderabad (Hybrid)

Key Responsibilities:
- Design, develop, and maintain efficient and scalable data pipelines for data ingestion, transformation, and storage
- Collaborate with cross-functional teams, including data analysts, business analysts, and BI, to understand data requirements and design appropriate solutions
- Build and maintain data infrastructure in the cloud, ensuring high availability, scalability, and security
- Write clean, efficient, and reusable code in scripting languages, such as Python or Scala, to automate data workflows and ETL processes
- Implement real-time and batch data processing solutions using streaming technologies like Apache Kafka, Apache Flink, or Apache Spark
- Perform data quality checks and ensure data integrity across different data sources and systems
- Optimize data pipelines for performance and efficiency, identifying and resolving bottlenecks and performance issues
- Collaborate with DevOps teams to deploy, automate, and maintain data platforms and tools
- Stay up to date with industry trends, best practices, and emerging technologies in data engineering, scripting, streaming data, and cloud technologies

Essential Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, with 2-4 years of overall experience
- Proven experience as a Data Engineer or similar role, with a focus on scripting, streaming data pipelines, and cloud technologies like AWS, GCP, or Azure
- Strong programming and scripting skills in languages like Python, Scala, or SQL 
- Experience with cloud-based data technologies, such as AWS, Azure, or Google Cloud Platform
- Hands-on experience with streaming technologies, such as AWS StreamSets, Apache Kafka, Apache Flink, or Apache Spark Streaming
- Strong experience with Snowflake (required)
- Proficiency in working with big data frameworks and tools, such as Hadoop, Hive, or HBase
- Knowledge of SQL and experience with relational and NoSQL databases
- Familiarity with data modelling and schema design principles
- Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment
- Excellent communication and teamwork skills

Commitment To Diversity And Inclusion:
Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.

Accessibility And Accommodation:
Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message.

Why Novartis:
Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network:
Not the right Novartis role for you? 
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
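The streaming responsibilities in the Novartis posting center on windowed processing. As a rough illustration of the tumbling-window idea behind Kafka/Flink/Spark Streaming aggregations, here is a pure-Python micro-batch sketch; the sensor events and timestamps are invented.

```python
# Tumbling-window count sketch: bucket events by fixed, non-overlapping windows.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """events: iterable of (epoch_seconds, key); returns {(window_start, key): count}."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

stream = [(100, "sensor_a"), (104, "sensor_a"), (112, "sensor_b"), (121, "sensor_a")]
result = tumbling_window_counts(stream, window_seconds=10)
print(result)  # {(100, 'sensor_a'): 2, (110, 'sensor_b'): 1, (120, 'sensor_a'): 1}
```

A real streaming engine adds what this sketch omits: event-time watermarks, late-data handling, and incremental state stores.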

Posted 2 days ago

Apply

0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site

Req ID: 332013
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an AWS DevOps Engineer to join our team in Pune, Maharashtra (IN-MH), India (IN).

Key Responsibilities
- Development & Build: Build and maintain a robust, scalable real-time data streaming platform leveraging AWS and Confluent Cloud infrastructure.
- AWS Services: Apply strong knowledge of AWS services, particularly those relevant to stream processing and serverless components like Lambda functions.
- Performance Monitoring: Continuously monitor and troubleshoot streaming platform performance issues to ensure optimal functionality.
- Collaboration: Work closely with cross-functional teams to onboard various data products onto the streaming platform and support existing implementations.
- Version Control: Manage code using Git, ensuring best practices in version control are followed.
- Infrastructure as Code (IaC): Apply expertise in Terraform for efficient infrastructure management.
- CI/CD Practices: Implement robust CI/CD pipelines using GitHub Actions to automate deployment workflows. Monitor expiration of service principal secrets or certificates, use the Azure DevOps REST API to automate renewal, and implement alerts and documentation for debugging failed connections.

Mandatory Skillsets
The candidate must have:
- Strong proficiency in AWS services, including IAM roles, RBAC access control, S3, containerized Lambda functions, VPC, Security Groups, RDS, MemoryDB, NACLs, CloudWatch, DNS, Network Load Balancer, Directory Services and identity federation, AWS tagging configuration, certificate management, etc.
- Hands-on experience in Kubernetes (EKS), with expertise in imperative and declarative approaches for managing resources/services like Pods, Deployments, Secrets, ConfigMaps, DaemonSets, Services, IRSA, Helm charts, and deployment tools like ArgoCD. 
- Expertise in Datadog, including integration, monitoring key metrics and logs, and creating meaningful dashboards and alerts.
- Strong understanding of Docker, including containerisation and image creation.
- Excellent programming skills in Python and Go, capable of writing efficient scripts.
- Familiarity with Git concepts for version control.
- Deep knowledge of Infrastructure as Code principles, particularly Terraform.
- Experience with CI/CD tools, specifically GitHub Actions.
- Understanding of security best practices, including knowledge of Snyk, SonarCloud, and CodeScene.

Nice-to-Have Skillsets
- Prior experience with streaming platforms, particularly Apache Kafka (including producer and consumer applications).
- Knowledge of unit testing around Kafka topics, consumers, and producers.
- Experience with Splunk integration for logging and monitoring.
- Familiarity with Software Development Life Cycle (SDLC) principles.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. 
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
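The stream-processing and serverless responsibilities above can be sketched with a minimal containerized-Lambda-style handler. The event shape below mirrors the payload AWS delivers for a self-managed Kafka event source mapping (base64-encoded record values grouped by topic-partition); the topic and field names are illustrative only:

```python
import base64
import json

def handler(event, context=None):
    """Minimal Lambda handler for a Kafka event source mapping.

    Record values arrive base64-encoded, grouped by topic-partition;
    this sketch decodes each one and returns a simple batch summary.
    """
    processed = []
    for partition_records in event.get("records", {}).values():
        for record in partition_records:
            payload = json.loads(base64.b64decode(record["value"]))
            processed.append(payload)
    return {"batch_size": len(processed), "items": processed}

# Example event shaped like a Kafka -> Lambda trigger payload
sample = {
    "records": {
        "orders-0": [
            {"value": base64.b64encode(json.dumps({"id": 1}).encode()).decode()},
            {"value": base64.b64encode(json.dumps({"id": 2}).encode()).decode()},
        ]
    }
}
result = handler(sample)
```

In a real deployment the handler would forward the decoded records to a downstream sink (RDS, MemoryDB, another topic) rather than returning them.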

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Nice to meet you! We’re a leader in data and AI. Through our software and services, we inspire customers around the world to transform data into intelligence - and questions into answers. We’re also a debt-free multi-billion-dollar organization on our path to IPO-readiness. If you're looking for a dynamic, fulfilling career coupled with flexibility and world-class employee experience, you'll find it here. About The Job Role : DevOps Engineer Responsibilities Enable stream-aligned DevOps teams to deliver rapid value by delivering platforms they can build on Enable Continuous Integration and Delivery by providing a standardized pipeline experience for teams Design novel solutions using both public and private cloud platforms to solve business needs Work with Information Security and others to understand what needs to be handled by the platforms we support Construct Infrastructure as Code routines that are leveraged to ensure cloud services have configuration needed for ongoing support Support legacy environments while we work with teams in migrating to DevOps practices and cloud adoption What We’re Looking For You’re curious, passionate, authentic, and accountable. These are our values and influence everything we do. You have a Bachelor's degree in computer science, information technology, or a similar quantitative field. You have a passion for automation and empowering others through self-help You have experience writing scripts and/or APIs in a modern language (Python, Go, PowerShell, etc.) You have experience delivering solutions in one or more public clouds, i.e., Microsoft Azure, Amazon AWS, etc. 
You have familiarity with Continuous Integration and Continuous Delivery (CI/CD) You have familiarity with fundamental cloud, security, networking, and distributed computing environment concepts You have familiarity administering one of the following platforms: Apache, Atlassian Bamboo, Boomi, Cloud Foundry, Harbor, RabbitMQ, or Tomcat The nice to haves Experience providing services as a platform to enable other teams to build from Experience with monitoring and writing self-healing routines to ensure platform uptime Experience with Python or PowerShell: writing scripts, applications, or APIs Experience with Ansible, Terraform, or other Infrastructure as Code/Configuration Management applications Experience with developing/managing applications in Microsoft’s Azure Cloud Experience with git, GitHub, and GitHub Actions for source control and Continuous Integration/Delivery Experience with supporting applications in an enterprise environment on both Linux and Windows Experience working in an Agile sprint-based environment Knowledge with the use and/or administration in Kubernetes Other Knowledge, Skills, And Abilities Strong oral and written communication skills Strong prioritization and analytical skills Ability to work independently and as part of a global team Ability to manage time across multiple projects Ability to communicate designs and decisions to peers and internal customers Ability to produce clear and concise system and process documentation Ability and willingness to participate in an afterhours on-call rotation Required Skills : Apache, Atlassian Bamboo, Boomi, Cloud Foundry, Harbor, RabbitMQ, Tomcat , Python , Powershell , CI\CD Cloud : Azure Experience : 3 to 7 years Diverse and Inclusive At SAS, it’s not about fitting into our culture – it’s about adding to it. We believe our people make the difference. 
Our diverse workforce brings together unique talents and inspires teams to create amazing software that reflects the diversity of our users and customers. Our commitment to diversity is a priority to our leadership, all the way up to the top; and it’s essential to who we are. To put it plainly: you are welcome here.
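As a rough illustration of the "self-healing routines to ensure platform uptime" this listing mentions, a minimal sketch (all names hypothetical) might separate the health probe and restart action from the retry policy:

```python
import time

def ensure_healthy(check, restart, retries=3, delay=0.0):
    """Self-healing loop: probe a service, restart it on failure, and
    give up after `retries` attempts.  `check` and `restart` are injected
    callables so platform specifics stay out of the routine."""
    for attempt in range(1, retries + 1):
        if check():
            return attempt            # healthy on this attempt
        restart()                     # attempt a remediation
        time.sleep(delay)
    raise RuntimeError("service unhealthy after %d restarts" % retries)

# Simulated flaky service: unhealthy until restarted once.
state = {"up": False}
attempts = ensure_healthy(
    check=lambda: state["up"],
    restart=lambda: state.update(up=True),
)
```

A production version would wire `check` to an HTTP health endpoint or systemd status and `restart` to the relevant platform API, with alerting on the give-up path.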

Posted 2 days ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us! Summary Job Description Analyze business problems to be solved with automated systems. Provide technical expertise in identifying systems that are cost-effective and meet user requirements. Configure system settings and options; plan and build unit, integration and acceptance testing; and create specifications for systems to meet our requirements. Design details of automated systems. Provide consultation to users around automated systems. You will report to the Engineering Manager IT and work in a hybrid capacity from our Hinjewadi-Pune, Noida, or Bangalore, India office. Your Responsibilities Analyse the existing data from SAP and extract insights to support smart decisions, working with developers on the e-commerce project Prepare new products for the SAP by establishing linkages to taxonomy, classification system, images, documentation, drawings Publish new products to the online catalogue Monitor SAP data quality and completeness. 
Maintain SAP data Implement the SAP translation process and Implement SAP enrichment/improvement projects Monitor and support data integrations The Essentials - You Will Have Bachelor's Degree in computer science, management information systems, engineering, or related field 4+ years of experience with Data Setup Experience with in-memory database/cache like Redis/Ehcache 1+ years of experience with Integration platforms like MuleSoft Experience with DevOps tooling and scripting Experience with database modeling for performance and scaling applications Experience with front end UI frameworks like Angular/React Experience with PRODUCT, CUSTOMER and Commerce frameworks and applications Experience with event driven/stream processing. It may/may not be with Kafka, but the ability to process data changes in near real time will be critical for this capability. Some examples- RabbitMQ, Apache Pulsar, Google Pub/Sub From an integration standpoint, experience with event-driven architecture and message queuing. The Preferred - You Might Also Have Working knowledge of a broad range of industrial automation products . From an integration standpoint, experience with any event-driven architecture and message queuing. 
Work with new technologies and changing our requirements Work with multiple partners and influence project decisions Temperament Assist colleagues through change and support change management processes Adapt to competing demands and IPC - Information Processing Capability (Factors of Complexity) Work on issues of moderate scope where analysis of situations or data requires a review of relevant factors Exercise judgement within defined practices to determine appropriate action Apply process improvements to facilitate improved outcomes Implement processes across business/function to achieve assigned goals Distil information from different data sources and the capability to tell the "story" behind it, and recommendations for next steps Accepts Role Requirements What We Offer Our benefits package includes … Comprehensive mindfulness programs with a premium membership to Calm Volunteer Paid Time off available after 6 months of employment for eligible employees. Company volunteer and donation matching program – Your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation. Employee Assistance Program Personalized wellbeing programs through our OnTrack program On-demand digital course library for professional development and other local benefits! At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. Rockwell Automation’s hybrid policy aligns that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
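To illustrate the in-memory cache requirement above (Redis/Ehcache), here is a minimal cache-aside sketch in plain Python; the SKU loader and TTL value are assumptions for demonstration, and a production version would back this with Redis rather than a local dict:

```python
import time

class TTLCache:
    """Tiny cache-aside helper in the spirit of Redis/Ehcache usage:
    look up a key, fall back to the loader on a miss, and expire entries
    after `ttl` seconds.  Purely in-process for illustration."""
    def __init__(self, ttl, clock=time.monotonic):
        self.ttl, self.clock, self.store = ttl, clock, {}

    def get(self, key, loader):
        hit = self.store.get(key)
        if hit and self.clock() - hit[1] < self.ttl:
            return hit[0], True                 # fresh entry: cache hit
        value = loader(key)                     # miss: load from the source
        self.store[key] = (value, self.clock())
        return value, False

calls = []
def load_product(sku):
    calls.append(sku)                           # stands in for a slow SAP lookup
    return {"sku": sku, "name": "demo"}

cache = TTLCache(ttl=60)
first, hit1 = cache.get("SKU-1", load_product)   # miss: loader runs
second, hit2 = cache.get("SKU-1", load_product)  # hit: loader skipped
```

The design point is that the cache never owns the data; expiry plus the loader fallback keeps it a disposable read-through layer in front of the system of record.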

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Role : DevOps Engineer Responsibilities Enable stream-aligned DevOps teams to deliver rapid value by delivering platforms they can build on Enable Continuous Integration and Delivery by providing a standardized pipeline experience for teams Design novel solutions using both public and private cloud platforms to solve business needs Work with Information Security and others to understand what needs to be handled by the platforms we support Construct Infrastructure as Code routines that are leveraged to ensure cloud services have configuration needed for ongoing support Support legacy environments while we work with teams in migrating to DevOps practices and cloud adoption What We’re Looking For You’re curious, passionate, authentic, and accountable. These are our values and influence everything we do. You have a Bachelor's degree in computer science, information technology, or a similar quantitative field. You have a passion for automation and empowering others through self-help You have experience writing scripts and/or APIs in a modern language (Python, Go, PowerShell, etc.) You have experience delivering solutions in one or more public clouds, i.e., Microsoft Azure, Amazon AWS, etc. 
You have familiarity with Continuous Integration and Continuous Delivery (CI/CD) You have familiarity with fundamental cloud, security, networking, and distributed computing environment concepts You have familiarity administering one of the following platforms: Apache, Atlassian Bamboo, Boomi, Cloud Foundry, Harbor, RabbitMQ, or Tomcat Experience with Python or PowerShell: writing scripts, applications, or APIs Experience with Ansible, Terraform, or other Infrastructure as Code/Configuration Management applications Experience with developing/managing applications in Microsoft’s Azure Cloud Experience with git, GitHub, and GitHub Actions for source control and Continuous Integration/Delivery The nice to haves Experience providing services as a platform to enable other teams to build from Experience with monitoring and writing self-healing routines to ensure platform uptime Experience with supporting applications in an enterprise environment on both Linux and Windows Experience working in an Agile sprint-based environment Knowledge with the use and/or administration in Kubernetes Other Knowledge, Skills, And Abilities Strong oral and written communication skills Strong prioritization and analytical skills Ability to work independently and as part of a global team Ability to manage time across multiple projects Ability to communicate designs and decisions to peers and internal customers Ability to produce clear and concise system and process documentation Ability and willingness to participate in an afterhours on-call rotation Required Skills : Apache, Atlassian Bamboo, Boomi, Cloud Foundry, Harbor, RabbitMQ, Tomcat , Python , Powershell , CI\CD ,Python ,Docker , FastAPI Cloud : Azure , AWS Experience : 3 to 7 years Why SAS We love living the #SASlife and believe that happy, healthy people have a passion for life, and bring that energy to work. No matter what your specialty or where you are in the world, your unique contributions will make a difference. 
Our multi-dimensional culture blends our different backgrounds, experiences, and perspectives. Here, it isn’t about fitting into our culture, it’s about adding to it - and we can’t wait to see what you’ll bring.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

Remote

Who we are We are Fluxon, a product development team founded by ex-Googlers and startup founders. We offer full-cycle software development from ideation and design to build and go-to-market. We partner with visionary companies, ranging from fast-growing startups to tech leaders like Google and Stripe, to turn bold ideas into products with the power to transform the world. The role is open to candidates based in Gurugram, India. About the role As a Senior Software Engineer at Fluxon, you’ll have the opportunity to bring products to market while learning, contributing, and growing with our team. You'll be responsible for: Driving end-to-end implementations all the way to the user, collaborating with your team to build and iterate in a dynamic environment Engaging directly with clients to understand business goals, give demos, and debug production issues Informing product requirements, identifying appropriate technical designs in partnership with our Product and Design teams Proactively communicating progress and challenges in your work and seeking help when you need it Performing code reviews and cross-feature validations Providing mentorship in your areas of expertise You'll work with a diversity of technologies, including: Languages: TypeScript/JavaScript, Java, .Net, Python, Golang, Rust, Ruby on Rails, Kotlin, Swift Frameworks: Next.js, React, Angular, Spring, Expo, FastAPI, Django, SwiftUI Cloud Service Providers: Google Cloud Platform, Amazon Web Services, Microsoft Azure Cloud Services: Compute Engine, AWS Amplify, Fargate, Cloud Run, Apache Kafka, SQS, GCP CMS, S3, GCS Technologies: AI/ML, LLMs, Crypto, SPA, Mobile apps, Architecture redesign, Google Gemini, OpenAI ChatGPT, Vertex AI, Anthropic Claude, Huggingface Databases: Firestore (Firebase), PostgreSQL, MariaDB, BigQuery, Supabase, Redis, Memcache Qualifications 3+ years of industry experience in software development Experienced with the full product lifecycle, including CI/CD, testing, release management, 
deployment, monitoring and incident response Fluent in software design patterns, scalable system architectures, tooling, fundamentals of data structures and algorithms What we offer Exposure to high-profile SV startups and enterprise companies Competitive salary Fully remote work with flexible hours Flexible paid time off Profit-sharing program Healthcare Parental leave, including adoption and fostering Gym membership and tuition reimbursement Hands-on career development

Posted 2 days ago

Apply

0 years

0 Lacs

India

On-site

Job Summary: We are looking for a skilled Senior Data Engineer with strong expertise in Spark and Scala on the AWS platform. The ideal candidate should possess excellent problem-solving skills and hands-on experience in Spark-based data processing within a cloud-based ecosystem. This role offers the opportunity to independently execute diverse and complex engineering tasks, demonstrate a solid understanding of the end-to-end software development lifecycle, and collaborate effectively with stakeholders to deliver high-quality technical solutions. Key Responsibilities: Develop, analyze, debug, and enhance Spark-Scala programs. Work on Spark batch processing jobs, with the ability to analyze/debug using Spark UI and logs. Optimize performance of Spark applications and ensure scalability and reliability. Manage data processing tasks using AWS S3, AWS EMR clusters, and other AWS services. Leverage Hadoop ecosystem tools including HDFS, HBase, Hive, and MapReduce. Write efficient and optimized SQL queries; experience with PostgreSQL and Couchbase or similar databases is preferred. Utilize orchestration tools such as Kafka, NiFi, and Oozie. Work with monitoring tools like Dynatrace and CloudWatch. Contribute to the creation of High-Level Design (HLD) and Low-Level Design (LLD) documents and participate in reviews with architects. Support development and lower environments setup, including local IDE configuration. Follow defined coding standards, best practices, and quality processes. Collaborate using Agile methodologies for development, review, and delivery. Use supplementary programming languages like Python as needed. Required Skills: Mandatory: Apache Spark Scala Big Data Hadoop Ecosystem Spark SQL Additional Preferred Skills: Spring Core Framework Core Java, Hibernate, Multithreading AWS EMR, S3, CloudWatch HDFS, HBase, Hive, MapReduce PostgreSQL, Couchbase Kafka, NiFi, Oozie Dynatrace or other monitoring tools Python (as supplementary language) Agile Methodology
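Spark itself is not needed to show the shape of the batch aggregations this role centers on; a plain-Python sketch of a group-by-and-sum (roughly `df.groupBy(key).agg(sum(value))` in Spark terms, with hypothetical column names) looks like this:

```python
from collections import defaultdict

def aggregate_by_key(rows, key, value):
    """Group-by-and-sum, the core pattern of many Spark batch jobs.
    Pure Python here; in a real job the same logic is distributed
    across partitions by the Spark engine."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

events = [
    {"region": "east", "amount": 10.0},
    {"region": "west", "amount": 5.0},
    {"region": "east", "amount": 2.5},
]
by_region = aggregate_by_key(events, key="region", value="amount")
```

Much of the Spark UI/log debugging the listing describes comes down to seeing how such a shuffle-and-aggregate step skews across partitions when one key dominates.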

Posted 2 days ago

Apply

0.0 - 4.0 years

0 Lacs

Kolkata, West Bengal

On-site

Job Title: Software Developer Location: Sector 5, Salt Lake, Kolkata Shift Timings: Flexible Day Shift or Afternoon shift Week Offs: Saturday and Sunday Employment Type: Full Time On-Site or Hybrid Industry: Telecommunication, IT and Security Salary: Up to 18 LPA Who We Are: Salescom Services Private Limited is a one hundred percent subsidiary of a British Technology business. We provide IT, security and Telecommunication products and services to Enterprise and SMEs. We as an organization value people who bring forth a combination of talent, proactiveness, and a never-say-never attitude! We enable you with the right kind of knowledge and skills that will help you develop into a productive and outstanding professional. Our expertise lies in 360-degree project management, customer success, revenue assurance, account management, billing & analytics, quality and compliance, web security and IT Helpdesk in the space of technology and telecommunications. We are backed by a combined experience of over two decades that the board members have in this space, operating successful ventures, and acquisitions over the years. The founding members of Salescom have operated in Australia and the United Kingdom, running successful and widely known technology and telecommunication ventures, and in Dec-2019 decided to launch their first captive unit in the heart of the IT workforce space - Sector V, Kolkata, West Bengal. Job Overview: We are looking for an experienced Software Developer specializing in ASP.NET to build software using languages and technologies of the .NET framework. You should be a pro with third-party API integrations and user application programming journeys. In this role, you should be able to write smooth & functional code with a sharp eye for spotting defects. You should be a team player and an excellent communicator. If you are also passionate about the .NET framework and software design/architecture, we’d like to meet you. 
Your goal will be to work with internal teams to design, develop and maintain functional software of all kinds. Key Responsibilities: Design and develop web applications: Build robust and scalable web-based solutions using ASP.NET and C#. Optimize database interactions using SQL and NoSQL technologies like Microsoft SQL Server, PostgreSQL, and SQLite. Front End Implementation: Develop interactive user interfaces using modern frameworks (Blazor, React). Implement responsive design using Bootstrap, HTML, CSS, JavaScript, and jQuery. API Integration & Management: Integrate and maintain third-party SOAP and RESTful APIs. Ensure secure and efficient data exchanges across external systems. Testing & Quality Assurance: Use tools such as Jenkins to automate testing processes. Write and maintain unit and integration tests for consistent performance. Troubleshooting & Optimization: Identify and resolve software bugs and performance bottlenecks. Analyse prototype feedback and iterate quickly to improve solutions. Collaboration & Communication: Work closely with cross-functional teams to understand requirements. Document development progress and articulate technical solutions effectively. Continuous Improvement: Stay up to date with emerging technologies and coding practices. Contribute to code reviews and mentor junior developers. Prerequisites: At least 4 years of software development using ASP.NET, C#, SQL/NoSQL (Microsoft SQL, PostgreSQL, SQLite, etc.) Experience with Modern Front-End Frameworks (Blazor, React, etc.) Hands-on experience with third-party SOAP and REST API integrations. Experienced in Bootstrap, jQuery, HTML, CSS and JavaScript. Knowledge of standard unit testing tools such as Jenkins. Excellent troubleshooting skills in software prototypes. Excellent verbal and written communication skills. 
BSc/BTech/BCA in Computer Science, Engineering, or a related field Good-to-have skill set: Knowledge of .NET MVC Knowledge of .NET MAUI (Xamarin) Experience with CRM development Experience working in ISP, Telephony and MSP Experience with Apache HTTP & Nginx Experience with Debian & Debian-based Linux Server Distributions (e.g., Ubuntu) What's in it for you: Competitive salary, periodic reviews and performance-based bonuses. Comprehensive health insurance coverage for self and chosen family dependents. Professional development opportunities, including training and company-funded certifications Collaborative and inclusive work environment that values diversity and creativity Café facilities Free drop services back home Businesses We Own & Operate: https://v4consumer.co.uk https://v4one.co.uk How to Apply: Interested candidates are invited to submit their resume and cover letter to hr@salescom.in in confidence. Please label “Senior Software Developer Application” in the email subject line. All candidates will be treated equally, and we will base appointment decisions on the merits of the candidates. We welcome applications from all candidates, regardless of any protected characteristic, and are an equal opportunity employer.

Posted 2 days ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

We are looking for an experienced and highly skilled GCP Data Engineer to join our team. The ideal candidate will be responsible for designing, developing, and maintaining robust data pipelines and cloud-based solutions on Google Cloud Platform to support data integration, transformation, and analytics. You will work closely with data analysts, architects, and other stakeholders to build scalable and efficient data systems. Key Responsibilities: Design, build, and maintain scalable ETL/ELT pipelines using GCP tools such as Cloud Dataflow , BigQuery , Cloud Composer , and Cloud Storage . Develop and optimize data models for analytics and reporting using BigQuery. Implement data quality, data governance, and metadata management best practices. Collaborate with data scientists, analysts, and other engineers to ensure data availability and reliability. Work with streaming data and batch processing frameworks to handle real-time data ingestion and processing. Monitor and troubleshoot data pipeline performance and ensure high availability. Develop automation and orchestration solutions using Cloud Functions , Cloud Composer (Apache Airflow) , or other tools. Ensure security, privacy, and compliance of data solutions in line with organizational and regulatory requirements. Required Skills And Qualifications: Bachelor's degree in Computer Science, Information Technology, Engineering, or related field. 6+ years of experience in data engineering or similar role, with 4+ years specifically on GCP. Strong hands-on experience with BigQuery , Cloud Dataflow , Cloud Pub/Sub , Cloud Storage , and Cloud Functions . Proficient in SQL, Python, and/or Java for data manipulation and automation. Experience with data modeling, warehousing concepts, and data architecture. Familiarity with DevOps practices, CI/CD pipelines, and Infrastructure as Code (e.g., Terraform). Strong problem-solving skills and the ability to work in an agile, collaborative environment. 
Preferred Qualification: GCP Professional Data Engineer Certification. Experience with Apache Beam, Airflow, Kafka, or similar tools. Understanding of machine learning pipelines and integration with AI/ML tools on GCP. Experience in multi-cloud or hybrid cloud environments. What we offer: Competitive salary and benefits package Flexible working hours and remote work opportunities Opportunity to work with cutting-edge technologies and cloud solutions Collaborative and supportive work culture Career growth and certification support
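The data-quality responsibilities above can be illustrated with a minimal validation gate of the kind a pipeline stage might apply before loading rows to BigQuery; the field names and the valid/reject (dead-letter) split are illustrative assumptions, not a specific Dataflow API:

```python
def clean_records(records, required=("id", "ts")):
    """Minimal data-quality gate for an ETL stage: split incoming
    records into valid rows and rejects, mirroring the dead-letter
    pattern commonly used in streaming/batch pipelines."""
    valid, rejected = [], []
    for rec in records:
        if all(rec.get(field) is not None for field in required):
            valid.append(rec)
        else:
            rejected.append(rec)      # routed to a dead-letter sink
    return valid, rejected

good, bad = clean_records([
    {"id": 1, "ts": "2024-01-01"},
    {"id": None, "ts": "2024-01-02"},   # fails the null check
])
```

Keeping rejects as first-class output (rather than dropping them) is what makes downstream monitoring and reprocessing of bad data possible.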

Posted 2 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Business Unit: Cubic Corporation Company Details: When you join Cubic, you become part of a company that creates and delivers technology solutions in transportation to make people’s lives easier by simplifying their daily journeys, and defense capabilities to help promote mission success and safety for those who serve their nation. Led by our talented teams around the world, Cubic is committed to solving global issues through innovation and service to our customers and partners. We have a top-tier portfolio of businesses, including Cubic Transportation Systems (CTS) and Cubic Defense (CD). Explore more on Cubic.com. Job Details: Job Summary : We’re looking for a Senior Systems Administrator who can blend strong technical know-how with a deep understanding of Linux and Azure environments to support and scale our growing infrastructure needs. Work Shift: 24x7 Rotational Experience: 5+ Years Primary Skills: Linux, Azure, Production Systems Secondary Skills: Windows Server, VMware Responsibilities : 🔹 Azure Administration Design, deploy, and manage resources in Azure using ARM templates and Terraform. Implement secure, scalable Azure environments with IAM and networking (VNets, NSGs, Load Balancers, Gateways). Monitor and optimize resources using Azure Monitor, Log Analytics, and Application Insights. Drive cost efficiency, disaster recovery planning, and resource utilization. Collaborate on cloud migrations, vulnerability remediation, and system modernization. Certification such as Azure Administrator Associate or Solutions Architect Expert is a huge plus! 🔹 Linux Administration (CentOS/Red Hat) Administer and maintain production-grade Linux servers, perform OS patching, tuning, and secure configurations (SELinux, iptables, etc.). Manage storage using LVM, configure NFS/SMB, and automate tasks via shell scripting or tools like Ansible/Chef. Support open-source tools like Nginx, Apache, SFTP/FTP, and assist in performance tuning and capacity planning. 
Troubleshoot system issues and work with hardware vendors when needed. 🔹 Windows + VMware (Secondary Skillset) Manage Windows Server environments alongside Hyper-V/VMware virtual infrastructure. Ensure backup, recovery, and business continuity measures are in place. Contribute to identity access management across hybrid environments. Minimum Job Requirements: Bachelor's degree, or equivalent years of experience in lieu of a degree. Five (5)+ years of hands-on experience in systems administration, with a focus on Azure and Linux. Proven expertise in enterprise IT environments, system performance, and infrastructure management. Ability to work in a rotational 24x7 support model. You bring a solution-driven mindset, stay updated with tech trends, and thrive in a collaborative, high-stakes environment. Worker Type: Employee
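The capacity-planning and monitoring duties above can be sketched with a small Python helper of the kind an administrator might schedule from cron alongside shell tooling; the threshold and report format are assumptions to tune per environment:

```python
import shutil

def disk_report(path="/", warn_at=80.0):
    """Report usage for a mount point and flag it when it crosses a
    percentage threshold.  shutil.disk_usage returns total/used/free
    bytes for the filesystem containing `path`."""
    usage = shutil.disk_usage(path)
    percent = usage.used / usage.total * 100
    return {
        "path": path,
        "percent_used": round(percent, 1),
        "warning": percent >= warn_at,
    }

report = disk_report("/")
```

A fuller version would iterate over mounts from /proc/mounts and push warnings to the alerting channel instead of returning a dict.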

Posted 2 days ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

No Relocation Assistance Offered Job Number #167816 - Mumbai, Maharashtra, India Who We Are Colgate-Palmolive Company is a global consumer products company operating in over 200 countries specializing in Oral Care, Personal Care, Home Care, Skin Care, and Pet Nutrition. Our products are trusted in more households than any other brand in the world, making us a household name! Join Colgate-Palmolive, a caring, innovative growth company reimagining a healthier future for people, their pets, and our planet. Guided by our core values—Caring, Inclusive, and Courageous—we foster a culture that inspires our people to achieve common goals. Together, let's build a brighter, healthier future for all. Brief introduction - Role Summary/Purpose: We are excited to invite applications for the position of Full Stack Developer within our Global Tech Team. This role will support our team in deploying best in class technology to optimize and expand our digital engagement programs leading to better targeting and engagement with our professionals, customers and consumers. We are looking for a highly motivated individual to join our team to help realize our vision. The ideal candidate is very customer focused and can work well both independently and within a team. Candidate needs to be a self-starter- eager to learn and bring to bear new technologies to build the best digital experience. Responsibilities: Architect, Develop & Support web / full stack applications for different multi-functional Projects. Work with a distributed team and propose the right tech stack for the applications. Develops elite user interfaces and user experiences of applications. Implements the server-side logic and functionality of applications. Designs and interacts with databases, ensuring efficient storage and retrieval of data. Writes unit tests, conducts testing, and debugs code to ensure the reliability and functionality of the application. 
Act as a Full Stack Mentor to other developers in the Team Required Qualifications: Bachelor's Degree or equivalent in Computer Science, Information Technology, Mathematics, Engineering or similar degree At least 3+ years experience designing and deploying end-to-end web applications At least 3+ years experience with full product life cycle releases A deep understanding of web technologies (JavaScript, HTML, CSS), networking, debugging Experience developing frontend web applications in a reactive modern JavaScript framework such as React, Vue or Angular Demonstrable experience applying test-driven development methodologies to sophisticated business problems Relational database technologies Experience in backend languages like Python, NodeJS Optimizing and scaling code in a production environment Handling source code with git Knowledge of and experience applying security standard methodologies and patterns Excellent diagnostic and problem-solving skills Working on Agile/SCRUM development teams Static and dynamic analyzing toolsets Use of user-centric design and applying user experience concepts Excellent verbal and written communication skills as well as customer relationship building skills Adapt to and work reliably with a variety of engagements in high-reaching environments Strong organization and project management skills with the ability to handle sophisticated projects with many partners. GitHub, GitHub Actions, Apache Airflow Preferred Qualifications: Developing applications on cloud platforms (AWS, Azure, GCP) Containerization (Docker or Kubernetes) Experience with Data Flow, Data Pipeline and workflow management tools: Airflow, Airbyte, Cloud Composer, etc. 
Experience with Data Warehousing solutions: Snowflake, BigQuery, etc Our Commitment to Inclusion Our journey begins with our people—developing strong talent with diverse backgrounds and perspectives to best serve our consumers around the world and fostering an inclusive environment where everyone feels a true sense of belonging. We are dedicated to ensuring that each individual can be their authentic self, is treated with respect, and is empowered by leadership to contribute meaningfully to our business. Equal Opportunity Employer Colgate is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity, sexual orientation, national origin, ethnicity, age, disability, marital status, veteran status (United States positions), or any other characteristic protected by law. Reasonable accommodation during the application process is available for persons with disabilities. Please complete this request form should you require accommodation.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description: Senior Data Developer I
Location: Gurugram, India
Employment Type: Full-Time
Experience Level: Mid to Senior-Level
Department: Data & Analytics / IT

Job Summary
We are seeking an experienced Data Developer with expertise in Microsoft Fabric, Azure Synapse Analytics, Databricks, and strong SQL development skills. The ideal candidate will work on end-to-end data solutions supporting analytics initiatives across clinical, regulatory, and commercial domains in the Life Sciences industry. Familiarity with Azure DevOps, and relevant certifications such as DP-700 and Databricks Data Engineer Associate/Professional, are preferred. Power BI knowledge is highly preferable to support integrated analytics and reporting.

Key Responsibilities
Design, develop, and maintain scalable and secure data pipelines using Microsoft Fabric, Azure Synapse Analytics, and Azure Databricks to support critical business processes.
Develop curated datasets for clinical, regulatory, and commercial analytics using SQL and PySpark.
Create and support dashboards and reports using Power BI (highly preferred).
Collaborate with cross-functional stakeholders to understand data needs and translate them into technical solutions.
Work closely with ERP teams such as Salesforce.com and SAP S/4HANA to integrate and transform business-critical data into analytics-ready formats.
Partner with Data Scientists to enable advanced analytics and machine learning initiatives by providing clean, reliable, and well-structured data.
Ensure data quality, lineage, and documentation in accordance with GxP, 21 CFR Part 11, and industry best practices.
Use Azure DevOps to manage code repositories, track tasks, and support agile delivery processes.
Monitor, troubleshoot, and optimize data workflows for reliability and performance.
Contribute to the design of scalable, compliant data models and architecture.

Required Qualifications
Bachelor's or Master's degree in Computer Science.
5+ years of experience in data development or data engineering roles.
Hands-on experience with:
Microsoft Fabric (Lakehouse, Pipelines, Dataflows)
Azure Synapse Analytics (Dedicated/Serverless SQL Pools, Pipelines)
Azure Data Factory and Apache Spark
Azure Databricks (Notebooks, Delta Lake, Unity Catalog)
SQL (complex queries, optimization, transformation logic)
Familiarity with Azure DevOps (Repos, Pipelines, Boards).
Understanding of data governance, security, and compliance in the Life Sciences domain.

Certifications (Preferred)
Microsoft Certified: DP-700 – Fabric Analytics Engineer Associate
Databricks Certified Data Engineer Associate or Professional

Preferred Skills
Strong knowledge of Power BI (highly preferred)
Familiarity with HIPAA, GxP, and 21 CFR Part 11 compliance
Experience working with ERP data from Salesforce.com and SAP S/4HANA
Exposure to clinical trial, regulatory submission, or quality management data
Good understanding of AI and ML concepts
Experience working with APIs
Excellent communication skills and the ability to collaborate across global teams

Location: Gurugram
Mode: Hybrid
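The "curated datasets ... using SQL" responsibility in this listing usually comes down to aggregation and transformation queries like the sketch below. The table, columns, and data are hypothetical, and `sqlite3` stands in for the Synapse/Fabric SQL pools the role actually targets, just to keep the example self-contained and runnable.

```python
import sqlite3

# Hypothetical raw clinical-visit records; in the role above these would
# live in a lakehouse or Synapse SQL pool, not SQLite.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_visits (
    patient_id TEXT,
    site       TEXT,
    visit_date TEXT,
    status     TEXT
);
INSERT INTO raw_visits VALUES
    ('P001', 'DEL', '2024-01-10', 'completed'),
    ('P001', 'DEL', '2024-02-12', 'completed'),
    ('P002', 'MUM', '2024-01-15', 'missed'),
    ('P002', 'MUM', '2024-03-01', 'completed');
""")

# Curated, analytics-ready output: one row per site with visit counts
# and a completion rate, the kind of shape a Power BI report consumes.
curated = conn.execute("""
    SELECT site,
           COUNT(*)                                              AS total_visits,
           SUM(status = 'completed')                             AS completed,
           ROUND(1.0 * SUM(status = 'completed') / COUNT(*), 2)  AS completion_rate
    FROM raw_visits
    GROUP BY site
    ORDER BY site
""").fetchall()

for row in curated:
    print(row)
```

In production this reshaping would be versioned in a pipeline (Fabric Dataflows, ADF, or a Databricks notebook) rather than run ad hoc, so lineage and data-quality checks can be attached to it.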

Posted 2 days ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

🚀 We’re Hiring: Big Data Engineer (4–8 Years Experience)
📍 Location: Kochi | 🏢 Mode: On-site | 💼 Employment Type: Full-time

Are you passionate about building scalable big data solutions? Do you thrive in a high-performance environment where innovation meets impact? We’re looking for an experienced Big Data Engineer to join our team and help drive our data-driven transformation. You'll design and implement robust data pipelines, optimize distributed systems, and contribute to cutting-edge analytics and ML use cases.

🔧 Key Responsibilities
Design and develop scalable big data processing pipelines.
Implement data ingestion, transformation, and validation.
Collaborate across teams to deliver data solutions for analytics and ML.
Optimize systems for performance, reliability, and scalability.
Monitor and troubleshoot performance bottlenecks.
Document workflows, specifications, and technical decisions.

🎓 Required Qualifications
Bachelor’s in Computer Science, Engineering, or a related field (Master’s preferred).
3+ years of experience in Big Data Engineering.
Strong in Python, Java, or Scala.
Hands-on with Apache Spark, Hadoop, Kafka, or Flink.
Solid knowledge of SQL and relational databases (MySQL, PostgreSQL).
Experience with ETL, data modeling, and data warehousing.
Exposure to distributed computing and cloud platforms (AWS/GCP/Azure).
Familiar with Docker, Kubernetes, and DevOps practices.

⚙️ Tools & Technologies
IDEs: IntelliJ, Eclipse
Build: Maven, Gradle
Testing: JUnit, TestNG, Mockito
Monitoring: Prometheus, Grafana, ELK
APIs: Swagger, OpenAPI
Messaging: Kafka
Databases: MySQL, PostgreSQL, MongoDB, Redis
ORM: Hibernate, Spring Data

📩 Ready to build the future with us? Apply now, or tag someone who fits the role! If interested, share your updated resume to vishnu@narrowlabs.in

#BigData #DataEngineer #ApacheSpark #Kafka #Hadoop #ETL #HiringNow #KochiJobs #OnsiteOpportunity #DataEngineeringJobs #TechJobsIndia #WeAreHiring #infopark #infoparkKochi #BigDataEngineer #Kochi
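The ingestion → transformation → validation responsibilities in this listing can be sketched, at the smallest possible scale, as pure-Python pipeline stages. The record schema and validation rules here are hypothetical; in the actual role the same logic would run distributed in Spark or Flink rather than as generators.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class Event:
    user_id: str
    amount: float

def validate(raw: Iterable[dict]) -> Iterator[dict]:
    """Drop records that fail basic schema and range checks."""
    for rec in raw:
        if rec.get("user_id") and isinstance(rec.get("amount"), (int, float)) and rec["amount"] >= 0:
            yield rec

def transform(records: Iterable[dict]) -> Iterator[Event]:
    """Normalize validated dicts into typed events."""
    for rec in records:
        yield Event(user_id=rec["user_id"].strip().upper(), amount=float(rec["amount"]))

# Hypothetical ingested batch (in production: a Kafka topic or HDFS files).
raw = [
    {"user_id": "u1", "amount": 10},
    {"user_id": "", "amount": 5},     # rejected: empty id
    {"user_id": "u2", "amount": -3},  # rejected: negative amount
    {"user_id": " u3 ", "amount": 2.5},
]
events = list(transform(validate(raw)))
print(events)
```

Composing the stages as generators mirrors how Spark chains narrow transformations: each stage is independently testable, and records stream through without the whole batch being materialized between steps.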

Posted 2 days ago

Apply