
3732 Indexing Jobs - Page 3

JobPe aggregates job listings for convenient one-stop access; applications are submitted directly on the original job portal.

12.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We're Hiring | Java Technical Lead (7–12 Years Experience)
Location: [Gurgaon/Pune]
Experience Required: 7 to 12 Years

Are you a hands-on Java expert with strong fundamentals in Core Java and a passion for leading high-performance teams? We're looking for a Java Technical Lead who can architect scalable solutions, guide developers, and own the end-to-end delivery of cloud-native applications.

What You'll Do:
- Lead the design, development, and deployment of robust Java-based applications
- Ensure code quality, architecture standards, and best practices across the team
- Deep dive into Core Java – collections, multithreading, exception handling, performance tuning
- Integrate with MQ systems (IBM MQ, ActiveMQ, RabbitMQ) for reliable messaging
- Deploy and manage applications on AWS using EC2, Lambda, S3, RDS, etc.
- Work with MongoDB for scalable NoSQL data storage and retrieval
- Mentor team members and conduct code reviews
- Collaborate with QA, DevOps, and stakeholders for seamless delivery

What You Bring:
- Strong expertise in Core Java (OOP, data structures, multithreading, exception handling)
- Experience with Java frameworks – Spring Boot, Hibernate, JPA
- Proven track record of leading development teams or tech initiatives
- MQ integration (IBM MQ, ActiveMQ, RabbitMQ, etc.)
- AWS deployment experience – EC2, S3, Lambda, CloudFormation, etc.
- MongoDB – indexing, aggregation, performance tuning
- Hands-on experience with CI/CD, Git, Docker, and cloud-native DevOps workflows
- Excellent problem-solving, debugging, and communication skills

Bonus Points If You Have:
- Certifications in Java, AWS, MongoDB, or related tech
- Exposure to DevOps practices, automation pipelines, and microservices
- Experience working in Agile/Scrum environments

If you're someone who can write clean code, solve complex problems, and lead by example — we'd love to talk.
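The Core Java skills listed above (collections, multithreading) revolve around patterns such as the producer–consumer queue. A minimal sketch of that pattern, shown in Python rather than Java for brevity; the doubling step stands in for whatever real processing a worker would do:

```python
import queue
import threading

def producer(q: queue.Queue, items):
    # Put each work item on the shared queue, then signal completion.
    for item in items:
        q.put(item)
    q.put(None)  # sentinel: no more work

def consumer(q: queue.Queue, results: list):
    # Drain the queue until the sentinel arrives.
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item * 2)  # stand-in for real processing

q = queue.Queue()
results = []
t1 = threading.Thread(target=producer, args=(q, [1, 2, 3]))
t2 = threading.Thread(target=consumer, args=(q, results))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # [2, 4, 6]
```

The blocking queue does the synchronization, so neither thread needs explicit locks; Java's `java.util.concurrent.BlockingQueue` plays the same role there.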

Posted 1 day ago

Apply

0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Role: SEO Executive / Sr Executive
Experience: 3-5 yrs
Location: Jaipur
Working Days: 5 days, work from office

Company Description:
Founded in 2007, Angara Ecommerce Pvt. Ltd. is a subsidiary of Angara, Inc., a luxury jewellery e-commerce company headquartered in LA, California. We provide essential IT, marketing and accounting services to Angara, Inc., helping it become a global leader in handcrafted fine jewellery. Angara jewellery is shipped to over 65 countries, with offices in the US, Thailand, Australia, Canada, UK, and Ireland. Our offices in India are located in Jaipur. Angara is committed to bringing more colour to the world through our coloured gemstone jewellery and has been recognized with multiple industry awards for innovation and excellence.

Role & Responsibilities:
- Resolve technical SEO issues (mobile usability, crawling, structured data).
- Perform SEO audits using Screaming Frog, SEMrush, Ahrefs, GSC.
- Work with tech teams on code, URL structure, and structured data.
- Analyse traffic, internal links, and indexing to recommend optimisations.
- Manage the sitemap and robots.txt for proper crawling and indexing.
- Support link-building to boost search visibility.
- Knowledge of YouTube SEO is a plus.
- Basic understanding of GA4 and GTM preferred.
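Managing robots.txt for crawling and indexing, as listed above, amounts to maintaining crawl directives plus a sitemap reference. A minimal illustrative sketch (the URL and paths are made up):

```python
def build_robots_txt(sitemap_url: str, disallowed: list) -> str:
    # Assemble a minimal robots.txt: a wildcard user-agent rule,
    # one Disallow line per blocked path, and a Sitemap reference.
    lines = ["User-agent: *"]
    lines += ["Disallow: " + path for path in disallowed]
    lines.append("Sitemap: " + sitemap_url)
    return "\n".join(lines) + "\n"

print(build_robots_txt("https://example.com/sitemap.xml", ["/cart/", "/admin/"]))
```

In practice the file is audited rather than generated, but the same three directive types are what tools like Screaming Frog and GSC check.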

Posted 1 day ago

Apply

6.0 years

0 Lacs

India

Remote

About Us:
MyRemoteTeam, Inc is a fast-growing distributed workforce enabler, helping companies scale with top global talent. We empower businesses by providing world-class software engineers, operations support, and infrastructure to help them grow faster and better.

Job Title: Senior Java Spring Boot Developer
Experience: 6+ Years
Location: Pune - Hybrid work mode - 2 days WFO/week

Job Description:
We are seeking an experienced Senior Java Spring Boot Developer with 6+ years of hands-on experience in building scalable, high-performance microservices using Java, Spring Boot, and Spring JPA. The ideal candidate will have strong expertise in designing and developing RESTful APIs, microservices architecture, and cloud-native applications. As part of our team, you will work on enterprise-grade applications, collaborate with cross-functional teams, and contribute to the full software development lifecycle.

Mandatory Skills:
- 6+ years of Java development (Java 8/11/17)
- Strong Spring Boot & Spring JPA experience
- Microservices architecture (design, development, deployment)
- RESTful API development & integration
- Database expertise (SQL/NoSQL – PostgreSQL, MySQL, MongoDB)
- Testing frameworks (JUnit, Mockito)
- Agile methodologies & CI/CD pipelines

Key Responsibilities:
Design & Development:
- Develop high-performance, scalable microservices using Spring Boot.
- Design and implement RESTful APIs following best practices.
- Use Spring JPA/Hibernate for database interactions (SQL/NoSQL).
- Implement caching mechanisms (Redis, Ehcache) for performance optimization.
Microservices Architecture:
- Build and maintain cloud-native microservices (Docker, Kubernetes).
- Integrate with message brokers (Kafka, RabbitMQ) for event-driven systems.
- Ensure fault tolerance, resilience, and scalability (circuit breakers, retry mechanisms).
Database & Performance:
- Optimize database queries (PostgreSQL, MySQL, MongoDB).
- Implement connection pooling, indexing, and caching strategies.
- Monitor and improve application performance (JVM tuning, profiling).
Testing & Quality Assurance:
- Write unit & integration tests (JUnit, Mockito, Testcontainers).
- Follow TDD/BDD practices for robust code quality.
- Perform code reviews and ensure adherence to best practices.
DevOps & CI/CD:
- Work with Docker, Kubernetes, and cloud platforms (AWS/Azure).
- Set up and maintain CI/CD pipelines (Jenkins, GitHub Actions).
- Automate deployments and monitoring (Prometheus, Grafana).
Collaboration & Agile:
- Work in Agile/Scrum teams with daily standups, sprint planning, and retrospectives.
- Collaborate with frontend, QA, and DevOps teams for seamless delivery.
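The resilience patterns named in the posting (circuit breakers, retry mechanisms) can be reduced to a simple idea: re-invoke a failing call a bounded number of times before giving up. A minimal retry decorator, sketched in Python rather than with Spring's actual libraries (the `flaky` function simulates a transient network error):

```python
import time

def retry(times: int, delay: float = 0.0):
    # Re-invoke the wrapped function up to `times` attempts,
    # re-raising the last error if every attempt fails.
    def decorator(fn):
        def wrapper(*args, **kwargs):
            last_err = None
            for _ in range(times):
                try:
                    return fn(*args, **kwargs)
                except Exception as err:
                    last_err = err
                    time.sleep(delay)
            raise last_err
        return wrapper
    return decorator

calls = {"n": 0}

@retry(times=3)
def flaky():
    # Fails twice, then succeeds, like a transient network error.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(flaky())  # ok
```

A circuit breaker adds one more step: after repeated failures it stops calling the dependency entirely for a cool-down period, which retry alone does not do.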

Posted 1 day ago

Apply

6.0 years

0 Lacs

Mysore, Karnataka, India

Remote

About Us:
MyRemoteTeam, Inc is a fast-growing distributed workforce enabler, helping companies scale with top global talent. We empower businesses by providing world-class software engineers, operations support, and infrastructure to help them grow faster and better.

Job Title: Senior Java Spring Boot Developer
Experience: 6+ Years
Location: Mysore - Hybrid work model - 2 days WFO

Job Description:
We are seeking an experienced Senior Java Spring Boot Developer with 6+ years of hands-on experience in building scalable, high-performance microservices using Java, Spring Boot, and Spring JPA. The ideal candidate will have strong expertise in designing and developing RESTful APIs, microservices architecture, and cloud-native applications. As part of our team, you will work on enterprise-grade applications, collaborate with cross-functional teams, and contribute to the full software development lifecycle.

Mandatory Skills:
- 6+ years of Java development (Java 8/11/17)
- Strong Spring Boot & Spring JPA experience
- Microservices architecture (design, development, deployment)
- RESTful API development & integration
- Database expertise (SQL/NoSQL – PostgreSQL, MySQL, MongoDB)
- Testing frameworks (JUnit, Mockito)
- Agile methodologies & CI/CD pipelines

Key Responsibilities:
Design & Development:
- Develop high-performance, scalable microservices using Spring Boot.
- Design and implement RESTful APIs following best practices.
- Use Spring JPA/Hibernate for database interactions (SQL/NoSQL).
- Implement caching mechanisms (Redis, Ehcache) for performance optimization.
Microservices Architecture:
- Build and maintain cloud-native microservices (Docker, Kubernetes).
- Integrate with message brokers (Kafka, RabbitMQ) for event-driven systems.
- Ensure fault tolerance, resilience, and scalability (circuit breakers, retry mechanisms).
Database & Performance:
- Optimize database queries (PostgreSQL, MySQL, MongoDB).
- Implement connection pooling, indexing, and caching strategies.
- Monitor and improve application performance (JVM tuning, profiling).
Testing & Quality Assurance:
- Write unit & integration tests (JUnit, Mockito, Testcontainers).
- Follow TDD/BDD practices for robust code quality.
- Perform code reviews and ensure adherence to best practices.
DevOps & CI/CD:
- Work with Docker, Kubernetes, and cloud platforms (AWS/Azure).
- Set up and maintain CI/CD pipelines (Jenkins, GitHub Actions).
- Automate deployments and monitoring (Prometheus, Grafana).
Collaboration & Agile:
- Work in Agile/Scrum teams with daily standups, sprint planning, and retrospectives.
- Collaborate with frontend, QA, and DevOps teams for seamless delivery.
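The caching mechanisms mentioned above (Redis, Ehcache) come down to key-value lookups with expiry. A tiny in-process time-to-live cache sketch in Python, illustrative only and not a Redis client:

```python
import time

class TTLCache:
    # Minimal time-to-live cache: entries expire `ttl` seconds after insertion.
    def __init__(self, ttl: float):
        self.ttl = ttl
        self._store = {}

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazily evict the stale entry
            return default
        return value

cache = TTLCache(ttl=0.05)
cache.set("user:42", {"name": "Asha"})
print(cache.get("user:42"))          # {'name': 'Asha'}
time.sleep(0.06)
print(cache.get("user:42", "MISS"))  # MISS
```

Production caches add eviction policies (LRU, max size) and shared storage across processes, which is what Redis provides.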

Posted 1 day ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

Remote

Job Title: AI/ML Engineer (Python, AWS, AI) || 5 to 8 Years || Remote

Role Overview:
Lead the design and delivery of LLM-powered, agentic AI solutions with robust RAG pipelines and prompt-engineering best practices.

Key Responsibilities:
- Integrate and optimize LLM APIs (OpenAI, FILxGPT) for scalable services
- Craft, test and refine high-impact prompts and chains
- Architect agentic workflows combining retrieval, LLM generation and external tools
- Build end-to-end RAG systems: data ingestion, embedding/indexing, hybrid retrieval, grounded generation
- Ensure model governance: versioning, monitoring, bias/security audits

Qualifications:
- 5+ years in AI/ML engineering with 2+ years on LLM/agentic systems
- Deep expertise in prompt engineering, RAG architectures and vector databases
- Proficient in Python, cloud (AWS preferred), Docker/Kubernetes and CI/CD
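The retrieval step of the RAG pipeline described above ranks indexed chunks by embedding similarity to the query. A self-contained sketch using cosine similarity over made-up toy vectors (a real system would use a vector database and learned embeddings):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "index": document chunks paired with made-up embedding vectors.
index = [
    ("refund policy", [0.9, 0.1, 0.0]),
    ("shipping times", [0.1, 0.9, 0.2]),
    ("warranty terms", [0.8, 0.2, 0.1]),
]

def retrieve(query_vec, k=2):
    # Rank chunks by similarity to the query embedding; keep the top k.
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

print(retrieve([1.0, 0.0, 0.0]))  # ['refund policy', 'warranty terms']
```

The retrieved chunks are then placed in the LLM prompt so the generation step is grounded in them; "hybrid retrieval" mixes this dense ranking with keyword search.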

Posted 1 day ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

Remote

SystemsPlus is hiring for SEO.
Experience: 8+ yrs
Work mode: Remote
Work time: 3 pm to 1 am
Immediate joiners only.

Role: Web Content and SEO Specialist
We’re looking for a highly motivated Web Content and SEO Specialist who combines strong technical SEO skills with onsite content expertise. This role plays a critical part in improving organic search visibility, ensuring web content is engaging and optimized, and supporting broader digital marketing goals. You’ll work closely with developers, designers, and marketers to ensure our website is both search-engine and user-friendly.

Location: Remote

Responsibilities:
On-Page & Technical SEO:
- Conduct comprehensive keyword research and apply findings to content and metadata.
- Optimize title tags, meta descriptions, headers, internal linking structures, and image alt text.
- Perform ongoing SEO audits and implement technical improvements, including:
  - Site speed and performance optimization
  - Mobile-first design checks and responsive UX
  - Structured data/schema markup implementation
  - Crawlability and indexing enhancements
  - XML sitemap and robots.txt management
  - Canonicalization and duplicate content fixes
  - Redirects (301/302) and broken link fixes
- Monitor site health using tools like Google Search Console, Screaming Frog, Ahrefs, SEMrush, and Sitebulb.
Website Management:
- Manage and update website content using CMS platforms (WordPress, Drupal, etc.).
- Maintain web accessibility standards and adhere to SEO and UX best practices.
- Coordinate with developers on SEO-related technical changes and implementations.
Analytics & Reporting:
- Track performance of organic search traffic, keyword rankings, and content engagement using Google Analytics and SEO platforms.
- Deliver regular insights and reports to stakeholders with actionable recommendations.
- Stay up to date with search engine algorithm changes, SEO trends, and web technologies.

Qualifications:
- Bachelor’s degree in Marketing, Communications, Computer Science, or a related field
- 6-8 years of experience in content-driven SEO and technical SEO roles
- Strong knowledge of HTML, CSS, and website architecture as it relates to SEO
- Experience with SEO tools such as Google Analytics, Search Console, SEMrush, Ahrefs, Screaming Frog, JIRA, etc.
- Proficiency with CMS platforms (e.g., WordPress, Drupal)
- Proficiency in eCommerce platforms (SFCC, homegrown platforms)
- Solid understanding of Google’s algorithm updates, Core Web Vitals, and mobile optimization
- Excellent communication, writing, and project management skills

Nice to Have:
- Familiarity with JavaScript frameworks (React, Angular) and how they affect SEO
- Experience with A/B testing tools and CRO strategies
- Knowledge of international SEO and multilingual site best practices
- Previous experience in e-commerce or SaaS environments

Interested candidates, please share your profiles at khyati.sagar@systems-plus.com
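The schema markup work mentioned in the responsibilities usually means emitting JSON-LD that search engines read from a `<script type="application/ld+json">` tag. A minimal sketch with hypothetical page values:

```python
import json

def article_jsonld(headline: str, author: str, date_published: str) -> str:
    # Build a minimal schema.org Article block as JSON-LD.
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
    }
    return json.dumps(data, indent=2)

print(article_jsonld("Indexing Basics", "A. Writer", "2024-01-15"))
```

Real pages typically use richer types (FAQPage, BreadcrumbList) and validate the output with Google's Rich Results Test.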

Posted 1 day ago

Apply

1.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description:
We are seeking curious and detail-oriented Content Analysts (Legal Editors) who are passionate about content creation, research, and diving deep into subject matter to understand how things work. This role is ideal for individuals who enjoy exploring legal topics, creating market-relevant content, and continuously expanding their knowledge. If you’re someone who embraces challenges and takes pride in delivering accurate, high-quality work, we’d love to have you on our team!

Experience: Minimum 1 year
Job Location: Noida Sector-1
Education: Law degree from a reputed college & university

Key Responsibilities:
- Content Understanding & Market Relevance: Develop a thorough understanding of legal content and identify what is relevant for publication on our platform. Stay updated on legal and industry trends to ensure content aligns with current market demands.
- Judgment Analysis & Categorization: Analyze High Court and Tribunal judgments to extract key legal aspects, issues, and sections. Categorize judgments by subject, legal issues, industry marking, and disposition.
- Bare Acts & Legislative Analysis: Analyze statutes, amendments, and parliamentary bills to extract key provisions, objectives, and affected laws. Track and summarize legislative developments from Parliament sessions. Classify acts and bills by subject, impact area, sectoral relevance, and legislative status.
- Content Research & Updates: Conduct in-depth research to enhance and update content collections across various legal domains. Ensure content is accurate, insightful, and error-free for publishing.
- Editorial Excellence: Perform detailed editorial tasks, including indexing, sequencing, and quality assurance of acts, rules, and judgments. Work collaboratively within structured workflows for creating, editing, and publishing content.

Note: Responsibilities are not limited to the above and may include additional tasks as needed; the role requires a multitasker who can adapt to the ever-evolving and dynamic nature of the work.

Desired Skills & Experience:
- A natural curiosity to delve deep into legal subjects and learn how they work.
- Strong legal research and analytical skills.
- Awareness of market trends and a keen interest in aligning content with audience needs.
- Proficiency in Microsoft Word, Adobe Acrobat, and effective internet research.
- Excellent command of English with impeccable attention to detail.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

Remote

Role: Python Developer
Skills: AWS, AI
Experience: 5-10 yrs || Remote

JD: Data Engineer – Python + AWS + AI
Experience level: 5 to 8 years
Location: Gurgaon & Remote

Key Responsibilities:
- Integrate and optimize LLM APIs (OpenAI, FILxGPT) for scalable services
- Craft, test and refine high-impact prompts and chains
- Architect agentic workflows combining retrieval, LLM generation and external tools
- Build end-to-end RAG systems: data ingestion, embedding/indexing, hybrid retrieval, grounded generation
- Ensure model governance: versioning, monitoring, bias/security audits

Qualifications:
- 5+ years in AI/ML engineering with 2+ years on LLM/agentic systems
- Deep expertise in prompt engineering, RAG architectures and vector databases
- Proficient in Python, cloud (AWS preferred), Docker/Kubernetes and CI/CD

Send your resume to soumya.srivastava@innovaesi.com
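The data-ingestion step of the RAG responsibilities above typically splits documents into overlapping chunks before embedding, so context at chunk boundaries is not lost. A minimal illustrative sketch (the sizes are arbitrary; real systems chunk by tokens or sentences):

```python
def chunk_text(text: str, size: int = 20, overlap: int = 5):
    # Split `text` into windows of `size` characters, each overlapping
    # the previous window by `overlap` characters.
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step) if text[i:i + size]]

chunks = chunk_text("The quick brown fox jumps over the lazy dog", size=20, overlap=5)
print(len(chunks))  # 3
```

Each chunk is then embedded and written to the vector index; at query time, retrieval operates on these chunks rather than whole documents.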

Posted 1 day ago

Apply

1.5 years

0 Lacs

Mohali district, India

On-site

Job Title: SEO Executive (Night Shift)
Company: Aspire Globus
Location: Quark Atrium, Phase 8B, Mohali, Punjab
Job Type: Full-Time | Onsite | Night Shift (5:30 PM – 2:30 AM) | Mon–Fri

We’re Hiring! Aspire Globus is looking for a skilled SEO Executive with 1.5+ years of experience to manage end-to-end SEO activities, including on-page, off-page, and technical SEO.

Key Responsibilities:
- Conduct on-page audits and implement SEO best practices
- Build quality backlinks through outreach and content strategies
- Fix technical SEO issues (crawl errors, indexing, etc.)
- Track keyword rankings, traffic, and performance metrics
- Generate actionable SEO reports

Requirements:
- 1.5+ years of hands-on SEO experience
- Strong knowledge of tools: GSC, GA4, Ahrefs, SEMrush, etc.
- Familiarity with WordPress & basic HTML/CSS
- Strong communication & analytical skills

Posted 1 day ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

This role has been designed as ‘Hybrid’ with an expectation that you will work on average 2 days per week from an HPE office.

Who We Are:
Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today’s complex world. Our culture thrives on finding new and better ways to accelerate what’s next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career our culture will embrace you. Open up opportunities with HPE.

Job Description:
A highly skilled Database Engineer with 6 years of experience in designing, implementing, and maintaining complex database systems. Proven expertise in performance tuning, data modeling, and ensuring data integrity. Adept at collaborating with cross-functional teams to optimize database functionality and support business objectives.

What You'll Do:
- Design, develop, and maintain database architectures, ensuring optimal performance and security.
- Implement data models and database structures that meet the needs of applications and reporting.
- Perform database performance tuning, indexing, and query optimization.
- Manage database backup and recovery processes to prevent data loss.
- Ensure compliance with data governance policies and security standards.
- Collaborate with software developers, system architects, and data analysts to support application development and data needs.
- Monitor database performance and troubleshoot issues to maintain high availability and reliability.
- Conduct regular database audits and implement improvements based on findings.
- Stay updated with the latest database technologies and best practices.

Skills and Technologies:
- Proficient in SQL and PL/SQL, with experience in relational database management systems (RDBMS) such as Oracle, MySQL, PostgreSQL, or SQL Server.
- Knowledge of NoSQL databases (e.g., MongoDB, Cassandra) and data warehousing solutions.
- Experience with database migration and upgrade processes.
- Familiarity with cloud database services (e.g., AWS RDS, Azure SQL Database).
- Strong understanding of data security practices and regulatory compliance.
- Scripting skills in languages like Python, Perl, Shell, or PowerShell for automation.
- Excellent analytical and problem-solving skills.

What You Need to Bring:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Relevant certifications (e.g., Microsoft Certified: Azure Database Administrator, Oracle Certified Professional) are a plus.

Experience:
- 4-6 years of experience in database engineering or a related field.
- Proven track record of successfully managing large-scale database projects.

Soft Skills:
- Strong communication skills for collaboration with technical and non-technical stakeholders.
- Detail-oriented with a focus on quality and accuracy.
- Ability to work independently and manage multiple priorities effectively.

Additional Skills:
Cloud Architectures, Cross Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Security-First Mindset, Solutions Design, Testing & Automation, User Experience (UX)

What We Can Offer You:
Health & Wellbeing: We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing.
Personal & Professional Development: We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division.
Unconditional Inclusion: We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.

Let's Stay Connected:
Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE. #india #networking

Job: Engineering
Job Level: TCP_02

HPE is an Equal Employment Opportunity / Veterans / Disabled / LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Please click here: Equal Employment Opportunity. Hewlett Packard Enterprise is EEO Protected Veteran/Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
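The indexing and query-optimization duties in the role above can be illustrated with SQLite's query planner, since it ships in Python's standard library; table and index names here are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    # Return the query plan as a single string.
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(str(row[-1]) for row in rows)

query = "SELECT total FROM orders WHERE customer_id = 42"
print(plan(query))  # without an index: a full scan of orders

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(plan(query))  # after indexing: a search using idx_orders_customer
```

The same idea carries over to Oracle, PostgreSQL, or SQL Server via their `EXPLAIN` / execution-plan tooling: tuning usually means turning scans into index searches.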

Posted 1 day ago

Apply

6.0 years

0 Lacs

India

Remote

Job Title: Marketing Technologist (Full Stack)

About the Role:
CurrentWare is scaling rapidly — and we need a technical growth enabler to keep our marketing infrastructure blazing fast, AI-visible, and automation-ready. As our Marketing Technologist (Full Stack), you’ll work across performance engineering, CRM pipelines, analytics, and emerging AI visibility tactics to ensure our site and stack are optimized for growth. You’ll collaborate with a high-performance team of content marketers, designers, SEO experts, and email specialists to accelerate lead generation, experiment velocity, and digital experience quality.

Key Responsibilities:
Website Performance & Core Web Vitals:
- Optimize WordPress for Core Web Vitals (LCP, CLS, TTFB)
- Tune hosting (Rocket.net, WPEngine), caching (object/page), and CDN rules (Cloudflare)
- Set up and maintain monitoring (PageSpeed Insights, Core Web Vitals)
- Proactively fix and reduce load/interaction times across mobile and desktop
Technical SEO & AI/LLM Visibility:
- Implement structured data (FAQ, HowTo, Article, Breadcrumbs)
- Manage sitemap health, canonical URLs, hreflang, and robots.txt
- Enable content indexing in LLMs such as ChatGPT, Bing Copilot, Perplexity, and Claude
- Explore AI-first SEO via page summaries, OpenAI feeds, and schema metadata enrichment
CRM, Attribution & Stack Integration:
- Manage Zoho and HubSpot syncing, custom fields, lead scoring, and lifecycle tagging
- Build automations using Zapier/Make and connect tools like rb2b
- Ensure robust attribution via UTMs, cookies, hidden fields, and form tracking
- Debug CRM sync issues, attribution leaks, and webhook failures
Analytics, Tagging & Reporting:
- Own Google Tag Manager and GA4 implementation
- Create and manage custom dataLayer variables for consistent tracking
- Track and report on CTA usage, form abandonments, funnel progression, and traffic quality
- Build Looker Studio dashboards that tie traffic to pipeline
Campaign Support & UX Collaboration:
- Develop fast, responsive landing pages with A/B testing capability
- Build reusable components and mini-apps for experiments, quizzes, and tools
- QA live pages across devices, breakpoints, and user flows
Security, Maintenance & DevOps:
- Manage backup workflows, staging-to-production deployments, and uptime monitoring
- Enforce SSL, WAF, CSP headers, and plugin-level security best practices
- Identify and resolve conflicts across plugins, scripts, or theme updates
Advanced Responsibilities (Nice-to-Haves):
- Server-side tagging (GTM server container, first-party tracking setups)
- Reverse-IP-based personalization (e.g. via rb2b + CRM merge tags)
- Content feed optimization for retrieval-augmented generation (RAG) in AI tools
- Hybrid/headless CMS migration planning (e.g., blog → Sanity or Next.js)

Required Experience:
- 3–6 years working across WordPress, marketing tech, and integrations
- Expert in HTML, CSS, JS, PHP — with performance debugging skills
- Familiar with CRMs (Zoho, HubSpot), GTM, GA4, Cloudflare, Looker Studio
- Experience with APIs, webhooks, and martech automation platforms

Bonus Skills:
- Experience with AI/LLM SEO visibility strategies
- Familiarity with rb2b or similar tools
- Built or maintained large-scale content-driven sites or headless frontends
- Exposure to SEO automation, content injection, or programmatic publishing

Remote or hybrid (India-based candidates preferred)

Collaboration:
You’ll work closely with:
- Content Marketer – for SEO-optimized publishing and LLM readiness
- Email Marketer – for CRM syncing, lead tracking, and campaign flows
- Technical SEO + Junior Dev – to implement and scale ranking strategies
- UI Designer – to execute high-conversion landing pages and experiments

📩 To Apply:
Send your resume, portfolio, and any past site or SEO performance metrics you’ve influenced. We’re looking for sharp executors who enjoy solving messy growth problems with clean code, fast load speeds, and clever automation.
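The UTM-based attribution mentioned in the responsibilities means reading `utm_*` query parameters off landing-page URLs and carrying them into the CRM. A minimal sketch using Python's standard library (the URL is hypothetical):

```python
from urllib.parse import urlparse, parse_qs

def extract_utm(url: str) -> dict:
    # Pull the standard utm_* parameters out of a landing-page URL,
    # ignoring any non-UTM query parameters.
    params = parse_qs(urlparse(url).query)
    return {k: v[0] for k, v in params.items() if k.startswith("utm_")}

url = "https://example.com/pricing?utm_source=newsletter&utm_medium=email&ref=x"
print(extract_utm(url))  # {'utm_source': 'newsletter', 'utm_medium': 'email'}
```

In a real stack these values are typically written to a first-party cookie or hidden form fields so the attribution survives navigation until form submission.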

Posted 1 day ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Junior .NET Developer
Location: Chennai, IN (5 days onsite)
Shift Timing: 2 PM – 11 PM (UK shift)
Work Type: Onsite | Full-Time | Permanent
Minimum Experience: 3 years

Roles and Responsibilities:
- 3+ years of experience with .NET Core, C#, and SQL Server
- Experience building Web APIs and MVC applications using ASP.NET Core
- Strong knowledge of Windows desktop applications using WinForms or WPF
- Familiarity with Entity Framework Core and writing LINQ queries
- Good understanding of SQL Server – stored procedures, indexing, and query tuning
- Experience with Git and Bitbucket version control tools
- Experience with JIRA and Confluence
- Basic knowledge of HTML, CSS, JavaScript, or Razor Pages
- Understanding of RESTful APIs, WCF SOAP services, and integration with external systems
- Familiarity with unit testing frameworks like xUnit or NUnit
- Good grasp of OOP, SOLID principles, and common design patterns
- Optional: exposure to Angular, Azure, and CI/CD tools

Dexian is a leading provider of staffing, IT, and workforce solutions with over 12,000 employees and 70 locations worldwide. As one of the largest IT staffing companies and the 2nd largest minority-owned staffing company in the U.S., Dexian was formed in 2023 through the merger of DISYS and Signature Consultants. Combining the best elements of its core companies, Dexian's platform connects talent, technology, and organizations to produce game-changing results that help everyone achieve their ambitions and goals. Dexian's brands include Dexian DISYS, Dexian Signature Consultants, Dexian Government Solutions, Dexian Talent Development and Dexian IT Solutions. Visit https://dexian.com/ to learn more. Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.

Posted 1 day ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote

Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: mSQL, SQL Writing, PLSQL
Education: Graduate

Note: This is a requirement for one of the Workassist hiring partners. This is a remote position.

Responsibilities:
- Write, optimize, and maintain SQL queries, stored procedures, and functions.
- Assist in designing and managing relational databases.
- Perform data extraction, transformation, and loading (ETL) tasks.
- Ensure database integrity, security, and performance.
- Work with developers to integrate databases into applications.
- Support data analysis and reporting by writing complex queries.
- Document database structures, processes, and best practices.

Requirements:
- Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
- Strong understanding of SQL and relational database concepts.
- Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle.
- Ability to write efficient and optimized SQL queries.
- Basic knowledge of indexing, stored procedures, and triggers.
- Understanding of database normalization and design principles.
- Good analytical and problem-solving skills.
- Ability to work independently and in a team in a remote setting.

Preferred Skills (Nice to Have):
- Experience with ETL processes and data warehousing.
- Knowledge of cloud-based databases (AWS RDS, Google BigQuery, Azure SQL).
- Familiarity with database performance tuning and indexing strategies.
- Exposure to Python or other scripting languages for database automation.
- Experience with business intelligence (BI) tools like Power BI or Tableau.

Company Description:
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
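The extract-transform-load (ETL) responsibility listed above can be sketched end to end with Python's standard library; the CSV data and table schema here are made up for illustration:

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (an in-memory example here).
raw = io.StringIO("name,salary\nasha,50000\nravi,62000\n")
rows = list(csv.DictReader(raw))

# Transform: normalize names and cast salary strings to integers.
cleaned = [(r["name"].title(), int(r["salary"])) for r in rows]

# Load: insert the transformed rows into a relational table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
conn.executemany("INSERT INTO employees VALUES (?, ?)", cleaned)

total = conn.execute("SELECT SUM(salary) FROM employees").fetchone()[0]
print(total)  # 112000
```

Production ETL swaps the in-memory pieces for real sources and targets and adds error handling, but the three-stage shape is the same.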

Posted 1 day ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description
GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)

Leads projects for the design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities
- Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users.
- Designs and provides guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
- Designs and implements physical data models to define the database structure; optimizes database performance through efficient indexing and table relationships.
- Participates in optimizing, testing and troubleshooting of data pipelines.
- Designs, develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB and others).
- Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks, in order to minimize manual and error-prone processes and improve productivity.
- Assists with renovating the data management infrastructure to drive automation in data integration and management.
- Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum and Kanban.
- Coaches and develops less experienced team members.

Responsibilities

Competencies:
- System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule and resources; creates and maintains information linkages to related artifacts.
- Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.
- Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus - Building strong customer relationships and delivering customer-centric solutions.
- Decision quality - Making good and timely decisions that keep the organization moving forward.
- Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users, using appropriate tools and technologies.
- Programming - Creates, writes and tests computer code, test scripts and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
- Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
- Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not part of the initial learning.
- Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
- Data Quality - Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
- Problem Solving - Solves problems, and may mentor others on effective problem solving, by using a systematic analysis process and leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem recurrence are implemented.
- Values differences - Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience
Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred, including:
- 5-8 years of experience
- Familiarity analyzing complex business systems, industry requirements and/or data regulations
- Background in processing and managing large data sets
- Design and development for a Big Data platform using open-source and third-party tools
- Spark, Scala/Java, MapReduce, Hive, HBase and Kafka, or equivalent college coursework
- SQL query language
- Clustered compute cloud-based implementation experience
- Experience developing applications requiring large file movement for a cloud-based environment, and other data extraction tools and methods from a variety of sources
- Experience in building analytical solutions

Intermediate experience in the following is preferred:
- Experience with IoT technology
- Experience in Agile software development

Qualifications
- Work closely with the business Product Owner to understand the product vision.
- Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake).
- Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
- Independently design, develop, test and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake.
- Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs).
- Take part in evaluation of new data tools and POCs, and provide suggestions.
- Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization.
- Proactively address and resolve issues that compromise data accuracy and usability.

Preferred Skills
- Programming Languages: Proficiency in languages such as Python, Java and/or Scala.
- Database Management: Expertise in SQL and NoSQL databases.
- Big Data Technologies: Experience with Hadoop, Spark, Kafka and other big data frameworks.
- Cloud Services: Experience with Azure, Databricks and AWS cloud platforms.
- ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
- Data Replication: Working knowledge of replication technologies such as Qlik Replicate is a plus.
- API: Working knowledge of APIs to consume data from ERP and CRM systems.
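The ETL/ELT and data-quality responsibilities this listing describes can be pictured with a minimal, self-contained sketch. All names and sample data below are hypothetical, and the stdlib `sqlite3` stands in for the warehouse platforms the listing names; a real pipeline would target those systems with proper tooling.

```python
import csv
import io
import sqlite3

# Extract: read raw order rows from a CSV source (an in-memory sample
# standing in for a relational or event-based source).
RAW = "order_id,amount,region\n1,100.5,EMEA\n2,,EMEA\n3,250.0,APAC\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

# Transform: reject rows failing a basic data-quality check (missing amount)
# and cast types -- a stand-in for the quality-monitoring step in the role.
def transform(rows):
    clean = []
    for r in rows:
        if r["amount"]:
            clean.append((int(r["order_id"]), float(r["amount"]), r["region"]))
    return clean

# Load: write into a target table, with an index supporting later lookups.
def load(rows, conn):
    conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
    conn.execute("CREATE INDEX idx_orders_region ON orders(region)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
rows = transform(extract(RAW))
loaded = load(rows, conn)
print(loaded)  # prints 2: the row with a missing amount was rejected
```

In production the same shape recurs at scale: the extract step becomes a connector, the transform step a Spark job or SQL model, and the quality check a monitored, alerting rule rather than an inline `if`.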

Posted 1 day ago

Apply

4.0 - 7.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Senior Full Stack Developer (.NET + SQL + Angular + AWS)

Experience: 4-7 Years
Location: Mumbai (Kalina, Santacruz)
Employment Type: Full-Time
Notice Period: Immediate to 7 Days
Industry Preference: Candidates with finance or banking domain experience will be preferred.

Overview:
We are seeking a Senior Full Stack Developer with strong backend expertise in .NET Core (C#) and SQL Server, along with working knowledge of Angular for frontend development. The ideal candidate should be capable of delivering high-performance, scalable backend services while also contributing to frontend components as needed. This role is best suited to someone who thrives in a fast-paced environment, has experience working with large datasets, and can handle full-stack responsibilities with a backend-first mindset.

Key Responsibilities:
- Design, develop, and maintain scalable backend services using .NET Core / C#.
- Create, manage, and optimize complex SQL queries, stored procedures, triggers, and indexing strategies.
- Develop RESTful APIs and integrate with internal and external systems.
- Collaborate with frontend developers to build and maintain UI components using Angular (v10+).
- Ensure application performance, reliability, and security.
- Contribute to cloud deployment and architecture on AWS.
- Work closely with Product Owners and Business Analysts to translate business requirements into technical solutions.
- Conduct unit testing, participate in peer code reviews, and assist in system deployments.
- Maintain comprehensive technical documentation of systems and processes.

Skills & Qualifications:
- 4-7 years of experience in .NET Core / C# backend development.
- Strong hands-on experience with SQL Server (query optimization, indexing, performance tuning).
- Experience building and consuming RESTful APIs.
- Good understanding of Angular (v10+) and TypeScript.
- Working knowledge of AWS services such as EC2, S3, Lambda, and RDS (or equivalent experience on Azure/GCP).
- Familiarity with CI/CD pipelines, Git, and code versioning practices.
- Solid grasp of OOP principles, design patterns, and clean coding standards.
- Exposure to Agile/Scrum development methodologies.
- Excellent problem-solving and debugging skills.
- Strong verbal and written communication skills.

Nice To Have:
- Experience in the finance or banking domain.
- Exposure to microservices architecture.
- Knowledge of containerization (Docker, Kubernetes).
- Experience with reporting tools such as SSRS or Power BI.
- Understanding of automated testing frameworks and test-driven development.

(ref:hirist.tech)
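The "indexing strategies, query optimization" skill this listing asks for boils down to checking whether a filtered query scans the whole table or uses an index. A minimal sketch follows; it uses Python's stdlib `sqlite3` purely so it runs anywhere (the role itself targets SQL Server, where the equivalent tool is the execution plan), and the table and column names are made up for illustration.

```python
import sqlite3

# A toy table with many rows and a low-cardinality filter column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades (account, amount) VALUES (?, ?)",
    [(f"ACC{i % 100:03d}", i * 1.5) for i in range(10_000)],
)

def query_plan(sql):
    # EXPLAIN QUERY PLAN reports whether the engine scans the table
    # or searches via an index; the detail text is in column 3.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = query_plan("SELECT SUM(amount) FROM trades WHERE account = 'ACC042'")

# Add an index on the filtered column -- the core indexing-strategy move.
conn.execute("CREATE INDEX idx_trades_account ON trades(account)")
after = query_plan("SELECT SUM(amount) FROM trades WHERE account = 'ACC042'")

print(before)  # a full-table scan, e.g. "SCAN trades"
print(after)   # an index search via idx_trades_account
```

The same before/after comparison in SQL Server would use `SET SHOWPLAN_TEXT ON` or the graphical execution plan, and the trade-off (faster reads versus slower writes and extra storage) drives which columns earn an index.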

Posted 2 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Introduction
A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.

Your Role And Responsibilities
As a Deputy Manager – Procure to Pay (P2P), you are responsible for invoice processing, vendor master management, query resolution, indexing, and invoice reconciliation. You should be flexible to work in shifts.

Your Primary Responsibilities Include:
- Vendor master creation, changes, verification, and cleansing.
- Identifying duplicate records for Vendor Master Maintenance.
- Accurate invoice receipt, verification, and processing.
- Prioritizing processing of urgent/ageing invoices.
- Recording of invoices, both Purchase Order based and Non-Purchase Order based (unsupported invoices).
- Coordinating with various stakeholders, obtaining coding and approval, and resolving issues around blocked invoices.
- Ensuring payment and expense entries are posted in accounting software on a timely basis; handling manual and automatic payment requests.
- Processing travel and expense claims and payments, duplicate payment resolution and recovery, and verifying and running payment proposals.
- Handling queries for vendor statement reconciliation through calls and emails.
- Adhering to client SLAs (Service Level Agreements) and timelines.

Preferred Education
Master's Degree

Required Technical And Professional Expertise
- Commerce graduate with a minimum of 8+ years of experience in Accounts Payable.
- Experience in invoice and vendor management, along with resolving queries and invoice reconciliation.
- Proven working knowledge of payment reporting and reconciliation activities.

Preferred Technical And Professional Experience
- Proficient in MS Office applications.
- Ambitious individual who can work under their own direction towards agreed targets/goals.
- Ability to work under tight timelines and be part of change management initiatives.
- Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.
- Enhances technical skills by attending educational workshops, reviewing publications, etc.
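One responsibility above, identifying duplicate vendor-master records, is essentially a name-normalization problem. The sketch below is a hypothetical illustration: the sample vendors and the normalization rules (case folding, punctuation stripping, removal of common legal suffixes) are this editor's assumptions, not any client's actual cleansing logic, which would typically live in the ERP or a dedicated MDM tool.

```python
import re
from collections import defaultdict

# Hypothetical vendor-master records: (vendor_id, registered name).
VENDORS = [
    (101, "Acme Industries Pvt. Ltd."),
    (102, "ACME INDUSTRIES PVT LTD"),
    (103, "Bharat Tooling Co."),
    (104, "Acme Industries Private Limited"),
]

# Common legal-form suffixes to ignore when comparing names (illustrative list).
SUFFIXES = r"\b(pvt|private|ltd|limited|co|company|inc)\b"

def normalize(name):
    name = re.sub(r"[^\w\s]", " ", name.lower())  # drop case and punctuation
    name = re.sub(SUFFIXES, " ", name)            # drop legal-form suffixes
    return " ".join(name.split())                 # collapse whitespace

def duplicate_groups(vendors):
    groups = defaultdict(list)
    for vendor_id, name in vendors:
        groups[normalize(name)].append(vendor_id)
    return [ids for ids in groups.values() if len(ids) > 1]

print(duplicate_groups(VENDORS))  # prints [[101, 102, 104]]
```

Exact-match grouping after normalization catches the common re-keyed-vendor case; fuzzier duplicates (typos, transliterations) would need similarity scoring on top.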

Posted 2 days ago

Apply

5.0 - 31.0 years

3 - 3 Lacs

Work From Home

Remote

Responsibilities:
- Develop custom WordPress themes/plugins with emphasis on Advanced Custom Fields (ACF) implementations
- Build and manage complex custom post types, taxonomies, and field groups
- Optimize MySQL queries and WordPress database performance
- Implement front-end functionality using vanilla JavaScript/jQuery
- Migrate/refactor legacy WordPress implementations
- (Bonus) Assist with Shopify theme customization and Liquid templating

Core Requirements:
- 3+ years professional WordPress/PHP development
- Expert-level ACF implementation (repeaters, flexible content, options pages)
- Strong understanding of WordPress hooks (actions/filters) and security best practices
- MySQL optimization (indexing, query analysis)
- JavaScript fundamentals (ES6+, DOM manipulation, AJAX)

Preferred Qualifications:
- Experience with headless WordPress (REST API/WPGraphQL)
- Shopify/Liquid template development experience
- Familiarity with modern build tools (Webpack, Vite)
- Knowledge of React/Vue for custom blocks or admin interfaces

Posted 2 days ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Qualifications
 5+ years of software development experience
 Strong development skills in Java JDK 1.8 or above
 Java fundamentals such as exception handling, serialization/deserialization, and immutability concepts
 Good fundamental knowledge of Enums, Collections, Annotations, Generics, autoboxing, and data structures
 Databases: RDBMS/NoSQL (SQL, joins, indexing)
 Multithreading (ReentrantLock, Fork/Join, synchronization, Executor Framework)
 Spring Core & Spring Boot, security, transactions
 Hands-on experience with JMS (ActiveMQ, RabbitMQ, Kafka, etc.)
 Memory management (JVM configuration, profiling, GC), performance tuning, and testing (JMeter or similar tools)
 DevOps (CI/CD: Maven/Gradle, Jenkins, quality plugins, Docker and containerization)
 Logical/analytical skills; thorough understanding of OOPS concepts, design principles, and implementation of different types of design patterns
 Hands-on experience with any of the logging frameworks (SLF4J/Logback/Log4j)
 Experience writing JUnit test cases using Mockito/PowerMock frameworks; practical experience with Maven/Gradle and knowledge of version control systems such as Git/SVN
 Good communication skills and ability to work with global teams to define and deliver on projects; sound understanding of and experience in the software development process and test-driven development
 Cloud: AWS / Azure
 Experience in microservices
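The Executor Framework item above refers to submitting tasks to a managed thread pool and collecting results as they complete. The listing is Java-specific; purely as a compact, runnable illustration of the same submit/await pattern, here is the equivalent using Python's stdlib executor (the task and its symbols are hypothetical stand-ins for real I/O-bound work such as an MQ round trip).

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def price_lookup(symbol):
    # Stand-in for an I/O-bound task (e.g. an MQ request/reply or DB call).
    return symbol, len(symbol) * 10.0

def fetch_all(symbols, workers=4):
    results = {}
    # The executor owns the worker threads, mirroring Java's
    # ExecutorService.submit(...) plus awaiting Futures.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(price_lookup, s) for s in symbols]
        for fut in as_completed(futures):  # gather results as tasks finish
            symbol, price = fut.result()
            results[symbol] = price
    return results

print(fetch_all(["IBM", "INFY", "TCS"]))
```

In Java the same shape is `Executors.newFixedThreadPool(4)`, `submit(Callable)`, and `Future.get()`; the interview-relevant point is identical in both languages: size the pool for the workload, and keep shared state out of the tasks or guard it with a lock.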

Posted 2 days ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Description
GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)

Job Summary
Leads projects for the design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities
- Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users.
- Designs and provides guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
- Designs and implements physical data models to define the database structure; optimizes database performance through efficient indexing and table relationships.
- Participates in optimizing, testing and troubleshooting of data pipelines.
- Designs, develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB and others).
- Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks, in order to minimize manual and error-prone processes and improve productivity.
- Assists with renovating the data management infrastructure to drive automation in data integration and management.
- Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum and Kanban.
- Coaches and develops less experienced team members.

Responsibilities

Competencies:
- System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule and resources; creates and maintains information linkages to related artifacts.
- Collaborates - Building partnerships and working collaboratively with others to meet shared objectives.
- Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus - Building strong customer relationships and delivering customer-centric solutions.
- Decision quality - Making good and timely decisions that keep the organization moving forward.
- Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users, using appropriate tools and technologies.
- Programming - Creates, writes and tests computer code, test scripts and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
- Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
- Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not part of the initial learning.
- Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
- Data Quality - Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
- Problem Solving - Solves problems, and may mentor others on effective problem solving, by using a systematic analysis process and leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem recurrence are implemented.
- Values differences - Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience
Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred, including:
- 5-8 years of experience
- Familiarity analyzing complex business systems, industry requirements and/or data regulations
- Background in processing and managing large data sets
- Design and development for a Big Data platform using open-source and third-party tools
- Spark, Scala/Java, MapReduce, Hive, HBase and Kafka, or equivalent college coursework
- SQL query language
- Clustered compute cloud-based implementation experience
- Experience developing applications requiring large file movement for a cloud-based environment, and other data extraction tools and methods from a variety of sources
- Experience in building analytical solutions

Intermediate experience in the following is preferred:
- Experience with IoT technology
- Experience in Agile software development

Qualifications
1) Work closely with the business Product Owner to understand the product vision.
2) Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake).
3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
4) Independently design, develop, test and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake.
5) Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs).
6) Take part in evaluation of new data tools and POCs, and provide suggestions.
7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization.
8) Proactively address and resolve issues that compromise data accuracy and usability.

Preferred Skills
- Programming Languages: Proficiency in languages such as Python, Java and/or Scala.
- Database Management: Expertise in SQL and NoSQL databases.
- Big Data Technologies: Experience with Hadoop, Spark, Kafka and other big data frameworks.
- Cloud Services: Experience with Azure, Databricks and AWS cloud platforms.
- ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
- Data Replication: Working knowledge of replication technologies such as Qlik Replicate is a plus.
- API: Working knowledge of APIs to consume data from ERP and CRM systems.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Remote
Job Type: Exempt - Experienced
ReqID: 2417810
Relocation Package: Yes

Posted 2 days ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently process, store and make data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, retention to data for internal and external users. Designs and provide guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure. Optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large scale data storage and processing solutions using different distributed and cloud based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most-common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. 
Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, and Kanban. Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process, leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem recurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, is required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required.
Knowledge of the latest technologies and trends in data engineering is highly preferred and includes: 5-8 years of experience Familiarity analyzing complex business systems, industry requirements, and/or data regulations Background in processing and managing large data sets Design and development for a Big Data platform using open source and third-party tools Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Experience developing applications requiring large file movement for a cloud-based environment and other data extraction tools and methods from a variety of sources Experience in building analytical solutions Intermediate Experiences In The Following Are Preferred Experience with IoT technology Experience in Agile software development Qualifications 1) Work closely with the business Product Owner to understand product vision. 2) Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. 4) Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and data lakes. 5) Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOP). 6) Take part in evaluation of new data tools and POCs and provide suggestions. 7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. 8) Proactively address and resolve issues that compromise data accuracy and usability. Preferred Skills Programming Languages: Proficiency in languages such as Python, Java, and/or Scala.
Database Management: Expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus. API: Working knowledge of APIs to consume data from ERP and CRM systems. Job Systems/Information Technology Organization Cummins Inc. Role Category Remote Job Type Exempt - Experienced ReqID 2417809 Relocation Package Yes
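The extract-transform-load pattern at the heart of this role can be sketched in plain Python. This is a minimal illustration under invented assumptions: the order-line fields and the data-quality rule below are made up, and a real pipeline at this scale would use the Spark/Azure/Snowflake tooling the posting lists rather than SQLite:

```python
import csv
import io
import sqlite3

# Hypothetical extract from a transactional system; the order-line
# fields below are invented for illustration.
raw = io.StringIO(
    "order_id,amount,currency\n"
    "1001,250.00,USD\n"
    "1002,,USD\n"      # missing amount: a data-quality flaw to catch
    "1003,99.50,usd\n"
)

# Extract: read the raw rows.
rows = list(csv.DictReader(raw))

# Transform: normalize currency codes and quarantine rows that fail a
# quality check, so flaws are monitored rather than silently loaded.
clean, rejects = [], []
for r in rows:
    if not r["amount"].strip():
        rejects.append(r)
        continue
    clean.append((int(r["order_id"]), float(r["amount"]), r["currency"].strip().upper()))

# Load: write validated rows to the target store (SQLite standing in
# for the warehouse).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
db.commit()

print(len(clean), "loaded,", len(rejects), "quarantined")  # 2 loaded, 1 quarantined
```

Keeping a quarantine of rejected rows, rather than dropping them, is what makes the "monitor and troubleshoot data quality" responsibility above actionable.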

Posted 2 days ago


0 years

0 Lacs

India

Remote

Job Overview: We are seeking a skilled Backend Developer who specializes in Node.js and MySQL to join our dynamic development team. You will be responsible for building and maintaining efficient server-side application logic, designing robust database schemas, and ensuring high performance and availability of backend services. Key Responsibilities: Develop and maintain server-side applications using Node.js. Design, implement, and optimize database schemas and complex SQL queries in MySQL for data storage and retrieval. Build and manage secure, scalable, and high-performance RESTful APIs to connect with the frontend and third-party services. Collaborate closely with frontend developers, product managers, and other stakeholders to ensure seamless integration of user-facing elements. Implement data protection, authentication, and authorization protocols to ensure application security. Optimize applications for maximum speed and scalability. Troubleshoot, debug, and resolve server-side issues and bottlenecks. Conduct code reviews, unit tests, and maintain comprehensive documentation of backend services. Monitor backend performance and troubleshoot production issues as necessary. Required Skills and Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience). Proven hands-on experience with Node.js, particularly frameworks like Express.js. Deep understanding of MySQL, including query optimization, joins, indexing, and database normalization. Advanced proficiency in JavaScript (ES6+) and asynchronous programming fundamentals. Experience developing, integrating, and maintaining RESTful APIs. Nice-to-Have Skills: Familiarity with ORMs (e.g., Sequelize, Knex.js) and raw SQL when required. Knowledge of containerization (Docker), caching (Redis), or cloud platforms (AWS, GCP) is a plus. Why Join Us? Work on global projects with a highly collaborative team.
Opportunities for professional growth, learning new technologies, and taking on leadership responsibilities. Flexible working hours and remote work options are available. Competitive compensation and a supportive work environment.
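The database skills this posting emphasizes (normalization, joins, indexing) fit in a short sketch. SQLite is used here only to keep the example self-contained, and the `users`/`orders` tables are invented; the SQL fragment itself has essentially the same shape in MySQL:

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Normalized schema: each user is stored once, and orders reference the
# user by key instead of duplicating user data in every order row.
db.executescript("""
CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE orders (id INTEGER PRIMARY KEY,
                     user_id INTEGER NOT NULL REFERENCES users(id),
                     total REAL);
CREATE INDEX idx_orders_user ON orders (user_id);  -- speeds up the join below
""")
db.executemany("INSERT INTO users VALUES (?, ?)", [(1, "Asha"), (2, "Ravi")])
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(10, 1, 120.0), (11, 1, 80.0), (12, 2, 45.5)])

# JOIN + aggregate: total order value per user.
totals = db.execute("""
    SELECT u.name, SUM(o.total)
    FROM users u JOIN orders o ON o.user_id = u.id
    GROUP BY u.id ORDER BY u.id
""").fetchall()
print(totals)  # [('Asha', 200.0), ('Ravi', 45.5)]
```

The index on the foreign-key column is the usual first step in the "query optimization" the posting asks about, since join and filter predicates on `user_id` can then avoid scanning the whole orders table.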

Posted 2 days ago


3.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. In SAP technology at PwC, you will specialise in utilising and managing SAP software and solutions within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of SAP products and technologies. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. 
SAP Native HANA Developer Technical Skills Bachelor's or Master's degree in a relevant field (e.g., computer science, information systems, engineering). Minimum of 3 years of experience in HANA native development and configuration, including at least 1 year with SAP BTP Cloud Foundry and HANA Cloud. Demonstrated experience in working with various data sources, both SAP (SAP ECC, SAP CRM, SAP S/4HANA) and non-SAP (Oracle, Salesforce, AWS). Demonstrated expertise in designing and implementing solutions utilizing the SAP BTP platform. Solid understanding of BTP HANA Cloud and its service offerings. Strong focus on building expertise in constructing calculation views within the HANA Cloud environment (BAS) and other supporting data artifacts. Experience with HANA XS Advanced and HANA 2.0 versions. Ability to optimize queries and data models for performance in the SAP HANA development environment, with a sound understanding of indexing, partitioning and other performance optimization techniques. Proven experience in applying SAP HANA Cloud development tools and technologies, including HDI containers, HANA OData services, HANA XSA, strong SQL scripting, SDI/SLT replication, Smart Data Access (SDA) and Cloud Foundry UPS services. Experience with ETL processes and tools (SAP Data Services preferred). Ability to debug and optimize existing queries and data models for performance. Hands-on experience in utilizing Git within Business Application Studio and familiarity with GitHub features and repository management. Familiarity with reporting tools and security-based concepts within the HANA development environment. Understanding of the HANA Transport Management System, HANA Transport Container and CI/CD practices for object deployment. Knowledge of monitoring and troubleshooting techniques for SAP HANA BW environments. Familiarity with reporting tools like SAC/Power BI for building dashboards and consuming data models is a plus.
HANA CDS views (added advantage): Understanding of associations, aggregations, and annotations in CDS views. Ability to design and implement data models using CDS. Certification in SAP HANA or related areas is a plus. Functional knowledge of SAP business processes (FI/CO, MM, SD, HR).

Posted 2 days ago


3.0 years

0 Lacs

Gurgaon, Haryana, India

Remote

About This Role Team Overview BlackRock SMA Solutions helps clients customize portfolios for unique tax, values-alignment, or investment exposures across direct indexing, fixed income, active equity, and multi-asset. We deliver world-class service to all of our clients, from wirehouses and wealth advisors to family offices to endowments and foundations. About The Role We’re growing our team and seeking a Portfolio Support Associate to partner with portfolio managers and specialists in driving operational excellence. This non-client-facing role is ideal for someone who thrives in a fast-paced, detail-oriented environment and is eager to contribute to the efficiency and performance of customized investment portfolios. You’ll play a key role in transforming data, automating workflows, and supporting the evolution of our analytical infrastructure. The ideal candidate is intellectually curious, organized, and collaborative—with a keen sense for problem-solving and a sense of humor that fits with our dynamic team culture. Key Responsibilities Clean and structure raw client data into actionable formats to support investment analysis and implementation. Manage third-party investment portals, ensuring accurate and timely submission of portfolio data and completion of requests. Maintain Salesforce-based dashboards and workflows to track portfolio requests and operational tasks. Build and maintain ad hoc automation tools using Python, SQL, and/or VBA to streamline reporting and analysis. Support the development of deeper portfolio analytics as systems and data infrastructure evolve. Contribute to ongoing process improvements and perform other duties within scope as needed. Qualifications Bachelor’s degree required. Minimum 3 years of relevant experience in finance, operations, and/or data analytics. Exceptional attention to detail and accuracy. Strong written and verbal communication skills. Proven problem-solving and interpersonal abilities. 
Highly organized with the ability to multitask and adapt quickly. Comfortable working independently and within small teams. High integrity and discretion when handling sensitive data. Team-oriented mindset with a proactive approach. Familiarity with Salesforce is a plus. Experience in trade operations is a plus. Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. 
It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
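The "clean and structure raw client data" responsibility in this role might look like the following in Python. The holdings file, its column names, and the aggregation rule are invented for illustration; they are not BlackRock's actual formats:

```python
import csv
import io
from collections import defaultdict

# Hypothetical raw client holdings export; field names and values are
# invented for illustration. Note the messy header spacing and the
# inconsistent ticker casing, both common in raw client files.
raw = io.StringIO(
    "Ticker, Shares ,Price\n"
    "AAPL,10,190.5\n"
    "aapl,5,190.5\n"
    "MSFT,8,410.0\n"
)

# Clean: strip stray whitespace from headers, normalize tickers to upper
# case, and aggregate duplicate lines into one market value per ticker.
reader = csv.DictReader(raw)
reader.fieldnames = [f.strip() for f in reader.fieldnames]
positions = defaultdict(float)
for row in reader:
    ticker = row["Ticker"].strip().upper()
    positions[ticker] += float(row["Shares"]) * float(row["Price"])

report = dict(sorted(positions.items()))
print(report)  # {'AAPL': 2857.5, 'MSFT': 3280.0}
```

Turning messy per-line data into one structured position per ticker is the kind of "actionable format" the responsibilities above describe, and the same shape of script extends naturally to the Salesforce and portal workflows mentioned.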

Posted 2 days ago


0 years

1 - 2 Lacs

Hyderābād

On-site

Job Description: You should be good with your fingers and your basic typing speed must be 36-40 WPM, with 90% accuracy. You will be maintaining the database of articles being received by the review department. You are responsible for detecting and correcting errors in written documents. One must be proficient in English reading and comprehension skills. The job involves checking written text for misspellings and inaccuracies before publishing. You need to develop and maintain a good reputation with the journal stakeholders (editors/authors/readers) by addressing the reviewer comments in the evaluation form submitted by the reviewers on the article assigned for peer review by the concerned department. You also need to make copies of proofs for editors, authors, and others to revise. Job aspirants must possess knowledge of the publication process and system, such that they can provide proper resolution to queries raised by prospective authors. They should be mindful of the indexing, archiving, and search engine sites related to scientific publications, and need to collect the potential author's bio, which should contain contact information, present/past research work, research interests, and previous publishing history. You should be able to handle queries from different nationalities and from various fields. One should be experienced in client/customer responses and possess basic presentation, publicizing, and promotion skills and techniques.
Job aspirants should be in continuous touch with the tele-caller department and always try to secure the article from the negative/positive queries we receive. They also should possess the necessary skill set involving the management of information systems. On article submission, you will be the first individual to screen/scrutinize the submissions and forward them to the review department. You will be responsible for generating the manuscript number for the submitted article from the proprietary panel that will be assigned to you. You need to collect a database of scholars from all over the world and contact them through e-mail for the article review process. You will have to process the article for publication within a given period and you should always run ahead of time. You need to develop and maintain a good reputation with the journal stakeholders (editors/authors/readers) via email communication and sometimes through verbal communication in association with our tele-caller department. You are also responsible for intimating the author about the evaluation form sent by the reviewers and requesting the revised article. You will be in continuous contact with the web development team to get the revised article published online on our websites in all forms of e-printing media (PDF, full text, HTML, etc.). You need to provide guidance and timely status information to all stakeholders (editors/reviewers/authors) for all articles from submission to publication stages. You need to develop contacts and assist in collaborating/associating the company with different universities/institutions around the world. Qualifications: The candidate should be a professional post-graduate in any one of the following streams: Physics, Chemistry, Mathematics, Life Sciences, Biochemistry, Biotechnology, Pharmacy, or other allied streams. Key skills: Excellent command over English writing and reading skills. Ability to recognize inconsistencies.
Capability of identifying poorly written articles. Please send resumes to hr@ppploa.com with CTC and notice period. Please note: only Hyderabad or nearby located candidates should apply, as this is a work-from-office role. Thanks, HR Dept Job Type: Full-time Pay: ₹15,000.00 - ₹18,000.00 per month Benefits: Health insurance Paid sick time Provident Fund Schedule: Day shift

Posted 2 days ago
