901 Schema Jobs - Page 14

Set up a Job Alert
JobPe aggregates job listings so they are easy to find and compare, but applications are submitted directly on the original job portal.

6.0 - 10.0 years

8 - 12 Lacs

Chennai

Work from Office

Sr. Application Java Developer with strong technical knowledge and 6 to 10 years of experience designing, developing, and supporting web-based applications using Java technologies. The candidate must have strong experience in Java, J2EE, JavaScript, APIs, microservices, API development, and SQL, along with excellent verbal and written communication skills, and should be experienced in building high-performance APIs that return the expected output structure.
Responsibilities:
- Implement APIs based on enterprise-level architecture frameworks and guidelines.
- Write well-designed, testable, efficient backend and middleware code, and build APIs using Java (e.g. Hibernate, Spring).
- Design and develop high-volume, low-latency REST APIs, especially on relational databases such as SQL Server.
- Build APIs from scratch on a traditional database and provide JSON output in the requested structure.
- Develop technical designs for application development / Web APIs.
- Conduct software analysis, programming, testing, and debugging.
- Design and implement relational schemas in Microsoft SQL Server and Oracle.
- Debug application/system errors on development, QA, and production systems.

Posted 1 month ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Chennai

Work from Office

Job Type: Full Time
Key Responsibilities:
- Develop reusable, typed frontend components using hooks and modern state management patterns.
- Ensure responsive UI/UX and cross-browser compatibility.
- Design RESTful or GraphQL APIs using Express and TypeScript.
- Model relational schemas and write optimized SQL queries and stored procedures.
- Optimize database performance using indexes, partitions, and EXPLAIN plans.
- Write unit and integration tests using Jest and React Testing Library.
- Participate actively in code reviews and maintain coding standards.
Required Skills:
- React.js with TypeScript (React 16+ with functional components and hooks)
- Node.js with TypeScript and Express
- MySQL (schema design, normalization, indexing, query optimization, stored procedures)
- HTML5, CSS3/Sass, ECMAScript 6+
- Git, npm/yarn, Webpack/Vite, ESLint/Prettier, Swagger/OpenAPI
- Jest, React Testing Library

Posted 1 month ago

Apply

0.0 - 1.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Job Title: Management Trainee - Digital Marketing
Experience: 0-1 years
Location: Hyderabad
This is a learning-intensive role, perfect for someone with foundational SEO training looking to gain real-world experience under expert guidance. It involves hands-on learning and support across key SEO tasks while assisting with social media, email marketing, and research. The role is ideal for someone who has foundational knowledge of SEO principles, preferably from a digital marketing certification course, and a keen interest in developing technical SEO skills.
Key Responsibilities:
SEO (Primary Focus - Technical & Strategic Support)
- Assist in maintaining and updating SEO Standard Operating Procedures (SOPs) and performance reports. (Training will be provided, but prior exposure is a plus.)
- Support the execution of on-page and off-page SEO strategies, including backlinking, keyword optimization, and content improvements. (Practical guidance will be given.)
- Learn and apply technical SEO fundamentals (crawling, indexing, site audits, schema markup, etc.). (Prior course knowledge is required.)
- Utilize basic HTML for SEO optimizations (meta tags, headers, etc.). (Hands-on experience will be developed.)
- Work with tools like Google Analytics, Search Console, and SEO platforms under supervision.
Social Media (Secondary Support)
- Assist in publishing organic social media content and engage with audience interactions.
Email Marketing (Support Role)
- Help generate and organize lead databases for cold email outreach campaigns.
Research & Outreach
- Conduct vendor/partner research and assist in scheduling introductory calls.
Skills Required:
- Completion of an SEO or digital marketing certification from a recognized digital marketing institute.
- Basic understanding of SEO best practices (on-page, off-page, and technical SEO).
- Familiarity with HTML is a strong advantage.
- Strong research skills and attention to detail.
- Eagerness to learn and grow in digital marketing.

Posted 1 month ago

Apply

7.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

- Proficiency in SQL
- Data modeling in both SQL and Power BI (star and snowflake schemas)
- Experience integrating solutions from one BI tool to other BI tools
- Solution design and architecture experience
- Knowledge or experience in data engineering (extraction, transformation, load)
- Familiarity with data connectors (cloud and on-premises databases, gateways, SharePoint)
- Expertise in Power BI transformations using Power Query Editor and the M language
- Strong visualization skills
- Advanced DAX
- Understanding of Row-Level Security (RLS) and other security measures
- Knowledge of Power BI Service architecture (workspaces, apps, scheduled refresh)
- Capability in data loading and incremental refresh techniques
- Experience with web embedding; UI/UX knowledge
- Experience embedding Power BI reports in Power Apps
- Experience building paginated reports
Minimum 7 years of relevant Power BI experience is required.
Good to have: Power Platform (Power Apps, Power Automate) experience, Microsoft Fabric knowledge, MS BI (SSRS).

Posted 1 month ago

Apply

5.0 - 7.0 years

2 - 6 Lacs

Hyderabad

Work from Office

Experience: 5-7 years of hands-on experience in link building or off-page SEO
Key Responsibilities:
- Conduct keyword research and develop optimization strategies aligned with business goals.
- Plan and execute ethical (white-hat) link building campaigns.
- Identify link opportunities through competitor backlink analysis and content gap research.
- Perform outreach campaigns via email and LinkedIn to build relationships with webmasters, bloggers, and publishers.
- Maintain a diverse and balanced backlink profile by tracking domain authority, relevance, and anchor text distribution.
- Collaborate with the SEO and content teams to ensure link acquisition supports target pages, keywords, and business goals.
- Monitor new and lost backlinks, analyze impact, and report performance using Google Search Console and backlink monitoring tools.
- Execute local and geo-targeted link building strategies to strengthen regional SEO visibility.
- Contribute to AEO efforts by promoting FAQ, schema-structured content, and snippet-worthy assets.
Required Skills & Qualifications:
- Strong understanding of Google Search ranking signals and link quality metrics.
- Proficiency with SEO tools: Ahrefs, Semrush, BuzzStream, Hunter.io, Google Search Console.
- Excellent communication and email outreach skills.
- Ability to analyze backlink profiles, identify toxic links, and manage disavow files.
- Experience with local SEO and geo-specific link strategies is a plus.
- Familiarity with Answer Engine Optimization (AEO) and SERP feature optimization is preferred.
- Ability to prioritize tasks, meet deadlines, and manage SEO initiatives independently.
- Exposure to schema markup is a plus.

Posted 1 month ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP HCM Personnel Administration
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business needs and technical specifications. Your role will require effective communication and coordination to facilitate smooth project execution and delivery.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge sharing and mentoring within the team to enhance overall performance.
- Monitor project progress and ensure adherence to timelines and quality standards.
Professional & Technical Skills:
- Must-have skills: proficiency in SAP HCM Personnel Administration.
- Strong understanding of application design and development processes.
- Experience with configuration and customization of SAP HCM modules.
- Ability to analyze business requirements and translate them into technical specifications.
- Familiarity with project management methodologies and tools.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP HCM Personnel Administration.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Qualification: 15 years full time education

Posted 1 month ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP HCM Personnel Administration
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications align with business needs and technical specifications, while fostering a collaborative environment that encourages innovation and efficiency.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.
Professional & Technical Skills:
- Must-have skills: proficiency in SAP HCM Personnel Administration.
- Strong understanding of application design and development methodologies.
- Experience with system integration and data migration processes.
- Ability to analyze business requirements and translate them into technical specifications.
- Familiarity with project management tools and methodologies.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP HCM Personnel Administration.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years full time education

Posted 1 month ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP HCM Personnel Administration
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business needs and technical specifications. Your role will require you to facilitate communication between stakeholders and the development team, ensuring that all parties are informed and engaged throughout the project lifecycle.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.
Professional & Technical Skills:
- Must-have skills: proficiency in SAP HCM Personnel Administration.
- Strong understanding of application design and development methodologies.
- Experience with system integration and data migration processes.
- Ability to analyze business requirements and translate them into technical specifications.
- Familiarity with project management tools and methodologies.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP HCM Personnel Administration.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years full time education

Posted 1 month ago

Apply

6.0 - 10.0 years

6 - 7 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Experience: 6+ years
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote, New Delhi, Bengaluru, Mumbai
Technical Lead
What you'll own
- Leading the re-architecture of Zoom's database foundation with a focus on scalability, query performance, and vector-based search support
- Replacing or refactoring our current in-house object store and metadata database with a modern, high-performance elastic solution
- Collaborating closely with our core platform engineers and AI/search teams to ensure seamless integration and zero disruption to existing media workflows
- Designing an extensible system that supports object-style relationships across millions of assets, including LLM-generated digital asset summaries, time-coded video metadata, AI-generated tags, and semantic vectors
- Driving end-to-end implementation: schema design, migration tooling, performance benchmarking, and production rollout, all with aggressive timelines
Skills & Experience We Expect
We're looking for candidates with 7-10 years of hands-on engineering experience, including 3+ years in a technical leadership role. Your experience should span the following core areas:
System Design & Architecture (3-4 yrs)
- Strong hands-on experience with the Java/JVM stack (GC tuning) and Python in production environments
- Led system-level design for scalable, modular AWS microservices architectures
- Designed high-throughput, low-latency media pipelines capable of scaling to billions of media records
- Familiar with multitenant SaaS patterns, service decomposition, and elastic scale-out/in models
- Deep understanding of infrastructure observability, failure handling, and graceful degradation
Database & Metadata Layer Design (3-5 yrs)
- Experience redesigning or implementing object-style metadata stores used in MAM/DAM systems
- Strong grasp of schema-less models for asset relationships, time-coded metadata, and versioned updates
- Practical experience with DynamoDB, Aurora, PostgreSQL, or similar high-scale databases
- Comfortable evaluating trade-offs between memory, query latency, and write throughput
Semantic Search & Vectors (1-3 yrs)
- Implemented vector search using systems like Weaviate, Pinecone, Qdrant, or Faiss
- Able to design hybrid (structured + semantic) search pipelines for similarity and natural language use cases
- Experience tuning vector indexes for performance, memory footprint, and recall
- Familiar with the basics of embedding generation pipelines and how they are used for semantic search and similarity-based retrieval
- Worked with MLOps teams to deploy ML inference services (e.g., FastAPI/Docker + GPU-based EC2 or SageMaker endpoints)
- Understands the limitations of recognition models (e.g., OCR, face/object detection, logo recognition), even if not directly building them
Media Asset Workflow (2-4 yrs)
- Deep familiarity with broadcast and OTT formats: MXF, IMF, DNxHD, ProRes, H.264, HEVC
- Understanding of proxy workflows in video post-production
- Experience with the digital asset lifecycle: ingest, AI metadata enrichment, media transformation, S3 cloud archiving
- Hands-on experience with time-coded metadata (e.g., subtitles, AI tags, shot changes) management in media archives
Cloud-Native Architecture (AWS) (3-5 yrs)
- Strong hands-on experience with ECS, Fargate, Lambda, S3, DynamoDB, Aurora, SQS, EventBridge
- Experience building serverless or service-based compute models for elastic scaling
- Familiarity with managing multi-region deployments, failover, and IAM configuration
- Built cloud-native CI/CD deployment pipelines with event-driven microservices and queue-based workflows
Frontend Collaboration & React App Integration (2-3 yrs)
- Worked closely with React-based frontend teams, especially on desktop-style web applications
- Familiar with component-based design systems, REST/GraphQL API integration, and optimizing media-heavy UI workflows
- Able to guide frontend teams on data modeling, caching, and efficient rendering of large asset libraries
- Experience with Electron for desktop apps
Skills: MAM, app integration
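The vector-search work described above is tool-agnostic in the posting; purely as an illustration (not material from the listing), here is a minimal Python sketch of similarity search with Faiss, one of the libraries it names. The embedding dimension, data, and variable names are assumptions.

```python
# Minimal sketch: indexing asset-summary embeddings with Faiss and running a
# similarity query. Dimension, data, and names are invented for illustration.
import numpy as np
import faiss

dim = 384                                   # e.g. a small sentence-embedding model
rng = np.random.default_rng(0)

# Pretend these are embeddings of LLM-generated asset summaries.
asset_vectors = rng.random((10_000, dim), dtype=np.float32)
faiss.normalize_L2(asset_vectors)           # cosine similarity via inner product

index = faiss.IndexFlatIP(dim)              # exact search; swap for IVF/HNSW at scale
index.add(asset_vectors)

query = rng.random((1, dim), dtype=np.float32)
faiss.normalize_L2(query)
scores, ids = index.search(query, 5)        # top-5 most similar assets
print(ids[0], scores[0])
```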

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Kolhapur, Pune

Work from Office

Bluebenz Digitizations Private Limited is looking for a Database Developer to join our dynamic team and embark on a rewarding career journey. Responsibilities include developing database solutions to store and manage large amounts of data, creating database schemas that represent and support business processes, optimizing database performance by identifying and resolving issues with indexing, query design, and other performance-related factors, and developing and maintaining database applications and interfaces that allow users to access and manipulate data. A successful Database Developer should have strong technical skills, including proficiency in database design, programming languages, and SQL; experience working with large and complex data sets; knowledge of relational databases; and experience with database management systems such as Oracle, MySQL, or SQL Server.
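As a rough illustration of the schema design and indexing work this posting describes (not something from the listing itself), the sketch below uses SQLite so it runs anywhere; the tables, columns, and query are invented.

```python
# Minimal sketch: a small relational schema, an index chosen for the common
# query pattern, and a check that the index is actually used.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    email       TEXT UNIQUE
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    ordered_at  TEXT NOT NULL,
    total       REAL NOT NULL
);
-- Index supporting the most common lookup: a customer's orders by date.
CREATE INDEX idx_orders_customer_date ON orders(customer_id, ordered_at);
""")

# EXPLAIN QUERY PLAN is SQLite's way of showing whether the index is used.
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM orders WHERE customer_id = ? ORDER BY ordered_at DESC",
    (42,),
).fetchall()
print(plan)
```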

Posted 1 month ago

Apply

5.0 - 9.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Career Category: Information Systems
Job Description
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.
ABOUT THE ROLE
Role Description: We are seeking a Reference Data Sr Associate Engineer who, as a member of the Reference Data Product team in the Enterprise Data Management organization, will be responsible for managing and promoting the use of reference data, partnering with business Subject Matter Experts on the creation of vocabularies, taxonomies and ontologies, and developing analytic solutions using semantic technologies.
Roles & Responsibilities:
- Work with the Reference Data Product Owner, external resources and other engineers as part of the product team
- Develop and maintain semantically appropriate concepts
- Identify and address conceptual gaps in both content and taxonomy
- Maintain ontology source vocabularies for new or edited codes
- Support product teams to help them leverage taxonomic solutions
- Analyze data from public/internal datasets
- Develop a data model/schema for the taxonomy
- Create taxonomies in the Semaphore Ontology Editor
- Perform bulk imports of data templates into Semaphore to add/update terms in taxonomies
- Prepare SPARQL queries to generate ad hoc reports
- Perform gap analysis on current and updated data
- Maintain taxonomies in Semaphore through the change management process
- Develop and optimize automated data ingestion pipelines through Python/PySpark when APIs are available
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Identify and resolve complex data-related challenges
- Participate in sprint planning meetings and provide estimations on technical implementation
Basic Qualifications and Experience: Any degree with 5-9 years of experience in Business, Engineering, IT or a related field
Functional Skills:
Must-Have Skills:
- Knowledge of controlled vocabularies, classification, ontology and taxonomy
- Experience in ontology development using Progress Semaphore or a similar tool such as PoolParty
- Hands-on experience writing SPARQL queries on graph data
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Strong understanding of data modeling, data warehousing, and data integration concepts
Good-to-Have Skills:
- Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.)
- Experience using cloud services such as AWS, Azure or GCP
- Experience working in a product team environment
- Knowledge of Python/R, Databricks, and cloud data platforms
- Knowledge of NLP (Natural Language Processing) and AI (Artificial Intelligence) for extracting and standardizing controlled vocabularies
- Strong understanding of data governance frameworks, tools, and best practices
Professional Certifications:
- Databricks certification preferred
- Progress Semaphore and SAFe Practitioner certificates preferred
- Any data analysis certification (SQL, Python)
- Any cloud certification (AWS or Azure)
Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders
- Effective problem-solving skills to address data-related issues and implement scalable solutions
- Ability to work effectively with global, virtual teams
EQUAL OPPORTUNITY STATEMENT
We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
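Purely for illustration of the SPARQL reporting task mentioned above (the posting itself works in Progress Semaphore), here is a minimal Python sketch using rdflib with an invented SKOS-style mini-taxonomy; none of the vocabulary content comes from the listing.

```python
# Minimal sketch: load a tiny SKOS taxonomy into rdflib and run a SPARQL query,
# the kind of ad hoc report described above. All concepts are invented.
from rdflib import Graph

turtle = """
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
@prefix ex:   <http://example.org/vocab/> .

ex:Oncology  a skos:Concept ; skos:prefLabel "Oncology" .
ex:Melanoma  a skos:Concept ; skos:prefLabel "Melanoma" ;
             skos:broader ex:Oncology .
"""

g = Graph()
g.parse(data=turtle, format="turtle")

query = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?narrow ?broad WHERE {
    ?n skos:broader ?b .
    ?n skos:prefLabel ?narrow .
    ?b skos:prefLabel ?broad .
}
"""
for row in g.query(query):
    print(f"{row.narrow} is narrower than {row.broad}")
```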

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 15 Lacs

Mumbai, Bengaluru, Delhi

Work from Office

Experience: 3.00+ years
Expected Notice Period: 7 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Must-have skills: Ecommerce, Healthcare Industry, EC2, Next.js, HTML/CSS, JavaScript, MongoDB, Node.js, React.js, Redux
UK's Top Healthcare Solutions Provider is Looking for:
Role Description
The Full Stack Engineer will be responsible for both front-end and back-end web development, participating in the full software development lifecycle. Day-to-day tasks will include creating and maintaining web applications, implementing new features, and ensuring the performance, quality, and responsiveness of applications. Collaboration with cross-functional teams to define, design, and ship new features will be essential.
Tech Stack: ReactJs, NextJs, NodeJs, MongoDB
Experience: 3-5 years
- Proficient in HTML/CSS, JavaScript, MongoDB and AWS services.
- Experience with analyzing and solving complex technical issues.
- Knowledge of design patterns, data structures and algorithmic solutions.
- Knowledge and expertise in front-end frameworks and libraries such as React and Redux.
- Competent in branch management, code merging, and conflict resolution.
- Ability to write cross-browser-compatible code and troubleshoot any arising issues.
- Excellent knowledge of NodeJS and its supporting libraries and frameworks such as ExpressJs.
- Experience building and maintaining secure and scalable RESTful APIs.
- Ability to design, implement, and maintain scalable MongoDB database architectures.
- Experience with MongoDB collections, indexes, and schemas to support application requirements.
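As a rough, illustrative sketch of the MongoDB collection, index, and schema work mentioned above (the role's own stack is Node.js; Python with pymongo is used here only for brevity), the connection string, database, and field names below are assumptions.

```python
# Minimal sketch: a compound index matching the common query pattern, plus a
# sample insert and query. Requires a locally running MongoDB.
from pymongo import ASCENDING, DESCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["healthcare_demo"]
orders = db["orders"]

# Compound index supporting "a patient's most recent orders".
orders.create_index([("patient_id", ASCENDING), ("created_at", DESCENDING)])

orders.insert_one({"patient_id": "p-001", "created_at": "2024-01-01", "items": 3})
doc = orders.find_one({"patient_id": "p-001"}, sort=[("created_at", DESCENDING)])
print(doc)
```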

Posted 1 month ago

Apply

4.0 - 8.0 years

0 - 0 Lacs

Chennai

Work from Office

Who we are looking for: The data engineering team's mission is to provide high availability and high resiliency as a core service to our ACV applications. The team is responsible for ETLs using different ingestion and transformation techniques. We are responsible for a range of critical tasks aimed at ensuring smooth and efficient functioning and high availability of ACV's data platforms. We are a crucial bridge between Infrastructure Operations, Data Infrastructure, Analytics, and Development teams, providing valuable feedback and insights to continuously improve platform reliability, functionality, and overall performance. We are seeking a talented data professional as a Data Engineer III to join our Data Engineering team. This role requires a strong focus and experience in software development, multi-cloud based technologies, and in-memory data stores, plus a strong desire to learn complex systems and new technologies. It requires a sound foundation in database and infrastructure architecture, deep technical knowledge, software development, excellent communication skills, and an action-based philosophy to solve hard software engineering problems.
What you will do: As a Data Engineer at ACV Auctions you HAVE FUN!! You will design, develop, write, and modify code. You will be responsible for development of ETLs, application architecture, and optimizing databases & SQL queries. You will work alongside other data engineers and data scientists in the design and development of solutions to ACV's most complex software problems. It is expected that you will be able to operate in a high-performing team, that you can balance high-quality delivery with customer focus, and that you will have a record of delivering and guiding team members in a fast-paced environment.
- Design, develop, and maintain scalable ETL pipelines using Python and SQL to ingest, process, and transform data from diverse sources.
- Write clean, efficient, and well-documented code in Python and SQL.
- Utilize Git for version control and collaborate effectively with other engineers.
- Implement and manage data orchestration workflows using industry-standard orchestration tools (e.g., Apache Airflow, Prefect).
- Apply a strong understanding of major data structures (arrays, dictionaries, strings, trees, nodes, graphs, linked lists) to optimize data processing and storage.
- Support multi-cloud application development.
- Contribute, influence, and set standards for all technical aspects of a product or service including, but not limited to, testing, debugging, performance, and languages.
- Support development stages for application development and data science teams, with an emphasis on MySQL and Postgres database development.
- Influence company-wide engineering standards for tooling, languages, and build systems.
- Leverage monitoring tools to ensure high performance and availability; work with operations and engineering to improve as required.
- Ensure that data development meets company standards for readability, reliability, and performance.
- Collaborate with internal teams on transactional and analytical schema design.
- Conduct code reviews, develop high-quality documentation, and build robust test suites.
- Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively.
- Participate in engineering innovations, including discovery of new technologies, implementation strategies, and architectural improvements.
- Participate in the on-call rotation.
What you will need:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience)
- Ability to read, write, speak, and understand English
- 3+ years of experience programming in Python
- 3+ years of experience with ETL workflow implementation (Airflow, Python)
- 3+ years of work with continuous integration and build tools
- 2+ years of experience with cloud platforms, preferably AWS or GCP
- Knowledge of database architecture, infrastructure, performance tuning, and optimization techniques
- Knowledge of day-to-day tools and how they work, including deployments, k8s, monitoring systems, and testing tools
- Proficient in version control systems, including trunk-based development, multiple release planning, cherry picking, and rebase
- Proficient in databases (RDB) and SQL, and able to contribute to table definitions
- Self-sufficient debugger who can identify and solve complex problems in code
- Deep understanding of major data structures (arrays, dictionaries, strings)
- Experience with Domain Driven Design
- Experience with containers and Kubernetes
- Experience with database monitoring and diagnostic tools, preferably Datadog
- Hands-on skills and the ability to drill deep into complex system design and implementation
- Proficiency in SQL query writing and optimization
- Familiarity with database security principles and best practices
- Familiarity with in-memory data processing
- Knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks
- Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment
Experience working with:
- SQL data-layer development; OLTP schema design
- Using and integrating with cloud services, specifically AWS RDS, Aurora, S3, GCP
- GitHub, Jenkins, Python
Nice to Have Qualifications:
- Experience with Airflow, Docker, Visual Studio, PyCharm, Redis, Kubernetes, Fivetran, Spark, Dataflow, Dataproc, EMR
- Experience with database monitoring and diagnostic tools, preferably Datadog
- Hands-on experience with Kafka or other event streaming technologies
- Hands-on experience with micro-service architecture
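Purely as an illustration of the orchestrated ETL work this posting describes (not code from the listing), here is a minimal Airflow TaskFlow sketch; the DAG name, schedule, and data are invented, and in practice the extract and load steps would use real connections.

```python
# Minimal sketch of an extract/transform/load DAG using Airflow's TaskFlow API.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_etl():
    @task
    def extract() -> list[dict]:
        # In practice: pull rows from a source system (API, S3, MySQL, ...).
        return [{"order_id": 1, "amount": "120.50"}, {"order_id": 2, "amount": "75.00"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Normalize types before loading.
        return [{"order_id": r["order_id"], "amount": float(r["amount"])} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # In practice: write to Postgres/MySQL via a hook; printed here to stay self-contained.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))

orders_etl()
```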

Posted 1 month ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Mumbai

Work from Office

Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward, always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
The Role
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.
As a Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights that drive strategic decisions and innovation.
An ELK (Elasticsearch, Logstash & Kibana) Data Engineer is responsible for developing, implementing, and maintaining ELK stack-based solutions within an organization. The engineer plays a crucial role in developing efficient and effective log processing, indexing, and visualization for monitoring, troubleshooting, and analysis purposes.
In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation.
Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset, a true data alchemist. Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made, and your lifecycle management expertise will ensure our data remains fresh and impactful.
So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important, you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused, someone who prioritizes customer success in their work. And finally, you're open and borderless, naturally inclusive in how you work with others.
Required Skills and Experience
- BS or MS degree in Computer Science or a related technical field
- 10+ years of overall IT industry experience
- 5+ years of Python or Java development experience
- 5+ years of SQL experience (NoSQL experience is a plus)
- 4+ years of experience with schema design and dimensional data modelling
- 3+ years of experience with Elasticsearch, Logstash and Kibana
- Ability to manage and communicate data warehouse plans to internal clients
- Experience designing, building, and maintaining data processing systems
Preferred Skills and Experience
- Experience working with Machine Learning models is a plus
- Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and containerization technologies (e.g., Docker, Kubernetes) is a plus
- Elastic certification is preferable
Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you, and everyone next to you, the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter, wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.
Get Referred!
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Description: We are seeking an experienced Data Architect to design and implement our enterprise data architecture, with a focus on data warehouse, data lake, and data mesh patterns. This position offers an opportunity to shape the data foundation of our organization, working with cutting-edge technologies and solving complex data challenges in a collaborative environment.
Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field; Master's degree preferred
- 8+ years of experience in data architecture, database design, and data modelling
- 5+ years of experience with cloud data platforms, particularly AWS data services
- 3+ years of experience architecting MPP database solutions (Redshift, Snowflake, etc.)
- Expert knowledge of data warehouse architecture and dimensional modelling
- Strong understanding of the AWS data services ecosystem (Redshift, S3, Glue, DMS, Lambda)
- Experience with SQL Server and migration to cloud data platforms
- Proficiency in data modelling, entity relationship diagrams, and schema design
- Working knowledge of data integration patterns and technologies (ETL/ELT, CDC)
- Experience with one or more programming/scripting languages (Python, SQL, Shell)
- Familiarity with data lake architectures and technologies (Parquet, Delta Lake, Athena)
- Strong stakeholder management and influencing skills
- Experience implementing data warehouse, data lake and data mesh architectures
- Good to have: knowledge of machine learning workflows and feature engineering
- Understanding of regulatory requirements related to data (FedRAMP, GDPR, CCPA, etc.)
- Experience with big data technologies (Spark, Hadoop)
About Aurigo
Aurigo is revolutionizing how the world plans, builds, and manages infrastructure projects with Masterworks, our industry-leading enterprise SaaS platform. Trusted by over 300 customers managing $300 billion in capital programs, Masterworks is setting new standards for project delivery and asset management. Recognized as one of the Top 25 AI Companies of 2024 and a Great Place to Work for three consecutive years, we are leveraging artificial intelligence to create a smarter, more connected future for customers in transportation, water and utilities, healthcare, higher education, and government, with over 40,000 projects across North America. At Aurigo, we don't just develop software, we shape the future. If you're excited to join a fast-growing company and collaborate with some of the brightest minds in the industry to solve real-world challenges, let's connect.
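As a rough illustration of the dimensional modelling this posting asks for (not material from the listing), the sketch below builds a tiny star schema with one fact table and two dimensions; SQLite is used only to keep it self-contained, and all names are invented.

```python
# Minimal star-schema sketch: dimensions plus a fact table, and a typical
# analytical query that joins facts to dimensions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240101
    full_date TEXT NOT NULL,
    year      INTEGER NOT NULL,
    month     INTEGER NOT NULL
);
CREATE TABLE dim_project (
    project_key  INTEGER PRIMARY KEY,
    project_name TEXT NOT NULL,
    region       TEXT
);
CREATE TABLE fact_spend (
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    project_key INTEGER NOT NULL REFERENCES dim_project(project_key),
    amount      REAL NOT NULL
);
""")

rows = conn.execute("""
    SELECT p.region, d.year, SUM(f.amount) AS total_spend
    FROM fact_spend f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_project p ON p.project_key = f.project_key
    GROUP BY p.region, d.year
""").fetchall()
print(rows)
```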

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Pune

Work from Office

We are looking for a Senior Data Engineer with deep hands-on expertise in PySpark, Databricks, and distributed data architecture. This individual will play a lead role in designing, developing, and optimizing data pipelines critical to our Ratings Modernization, Corrections, and Regulatory implementation programs under PDB 2.0. The ideal candidate will thrive in fast-paced, ambiguous environments and collaborate closely with engineering, product, and governance teams.
Your Key Responsibilities
- Design, develop, and maintain robust ETL/ELT pipelines using PySpark and Databricks.
- Own pipeline architecture and drive performance improvements through partitioning, indexing, and Spark optimization.
- Collaborate with product owners, analysts, and other engineers to gather requirements and resolve complex data issues.
- Perform deep analysis and optimization of SQL queries, functions, and procedures for performance and scalability.
- Ensure high standards of data quality and reliability via robust validation and cleansing processes.
- Lead efforts in Delta Lake and cloud data warehouse architecture, including best practices for data lineage and schema management.
- Troubleshoot and resolve production incidents and pipeline failures quickly and thoroughly.
- Mentor junior team members and guide best practices across the team.
Your skills and experience that will help you excel
- Bachelor's degree in Computer Science, Engineering, or a related technical field.
- 6+ years of experience in data engineering or related roles.
- Advanced proficiency in Python, PySpark, and SQL.
- Strong experience with Databricks, BigQuery, and modern data lakehouse design.
- Hands-on knowledge of Azure or GCP data services.
- Proven experience in performance tuning and large-scale data processing.
- Strong communication skills and the ability to work independently in uncertain or evolving contexts.
About MSCI
What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
- An environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.
At MSCI we are passionate about what we do, and we are inspired by our purpose to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.
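Purely for illustration of the PySpark and Delta Lake pipeline work described above (not code from the listing), here is a minimal sketch; the paths, schema, and partition column are assumptions, and on Databricks the SparkSession and Delta support are already provided.

```python
# Minimal sketch: read raw JSON, apply simple cleansing, and write a
# partitioned Delta table. All paths and column names are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ratings_etl_sketch").getOrCreate()

raw = spark.read.json("/mnt/raw/ratings/")            # hypothetical landing path

cleaned = (
    raw.dropDuplicates(["rating_id"])
       .withColumn("rating_date", F.to_date("rating_date"))
       .filter(F.col("rating_value").isNotNull())
)

(
    cleaned.write.format("delta")
           .mode("overwrite")
           .partitionBy("rating_date")                # partitioning enables pruning
           .save("/mnt/curated/ratings/")
)
```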

Posted 1 month ago

Apply

6.0 - 11.0 years

5 - 8 Lacs

Pune

Work from Office

We are looking for a Senior Data Engineer with deep hands-on expertise in PySpark, Databricks, and distributed data architecture. This individual will play a lead role in designing, developing, and optimizing data pipelines critical to our Ratings Modernization, Corrections, and Regulatory implementation programs under PDB 2.0. The ideal candidate will thrive in fast-paced, ambiguous environments and collaborate closely with engineering, product, and governance teams.
Your Key Responsibilities
- Design, develop, and maintain robust ETL/ELT pipelines using PySpark and Databricks.
- Own pipeline architecture and drive performance improvements through partitioning, indexing, and Spark optimization.
- Collaborate with product owners, analysts, and other engineers to gather requirements and resolve complex data issues.
- Perform deep analysis and optimization of SQL queries, functions, and procedures for performance and scalability.
- Ensure high standards of data quality and reliability via robust validation and cleansing processes.
- Lead efforts in Delta Lake and cloud data warehouse architecture, including best practices for data lineage and schema management.
- Troubleshoot and resolve production incidents and pipeline failures quickly and thoroughly.
- Mentor junior team members and guide best practices across the team.
Your skills and experience that will help you excel
- Bachelor's degree in Computer Science, Engineering, or a related technical field.
- 6+ years of experience in data engineering or related roles.
- Advanced proficiency in Python, PySpark, and SQL.
- Strong experience with Databricks, BigQuery, and modern data lakehouse design.
- Hands-on knowledge of Azure or GCP data services.
- Proven experience in performance tuning and large-scale data processing.
- Strong communication skills and the ability to work independently in uncertain or evolving contexts.
About MSCI
What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients, plus tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
- An environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.
MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation, please note that this contact channel is intended only for individuals requesting a reasonable workplace accommodation; it is not intended for other inquiries.

Posted 1 month ago

Apply

3.0 - 5.0 years

7 - 11 Lacs

Ahmedabad

Work from Office

Hey There
At Saleshandy, we're building the Cold Email Outreach platform of the future. We're building a product toward eliminating manual processes and helping companies generate more replies, book more meetings, and generate leads faster. Since our founding in 2016, we've grown to become a profitable, 100% geographically dispersed team of 65+ high-performing, happy people who are dedicated to building a product that our customers love.
What's the Role About?
Ever wondered how Saleshandy schedules millions of emails and still feels lightning-fast? Behind that magic is performance engineering. We're hiring a Performance Engineer who thrives on making systems faster, leaner, and more reliable across backend, frontend, and infrastructure. Your mission: eliminate latency, fix CPU/memory bottlenecks, optimize queries, tame queues, and guide teams to build with performance in mind. This isn't just about fire-fighting; it's about owning speed as a product feature. You'll work across the stack and use deep diagnostics, smart tooling, and system intuition to make things fly.
Why Join Us
1. Purpose: Your work will directly impact page speeds, email throughput, and scale. At Saleshandy, performance isn't a luxury; it's part of our premium promise.
2. Growth: You'll operate across multiple teams and tech layers (Node.js, MySQL, Redis, React, Kafka, ClickHouse, AWS) with the freedom to shape how we build fast systems.
3. Motivation: If you've ever celebrated shaving 500ms off a page load, or chased a memory leak across 3 services just for fun, this is your home. We celebrate engineers who care about P99s, flamegraphs, and cache hits.
Your Main Goals
1. Identify and Eliminate Backend Bottlenecks (within 90 days): Run deep diagnostics using Clinic.js, heap snapshots, GC logs, and flamegraphs. Tackle high CPU/memory usage, event loop stalls, and async call inefficiencies in Node.js. Goal: cut backend P95 response times by 30-40% for key APIs.
2. Optimize MySQL Query Performance & Configuration (within 60 days): Use slow query logs, EXPLAIN, Percona Toolkit, and indexing strategies to tune queries and schema. Tune server-level configs like innodb_buffer_pool_size. Target: eliminate the top 10 slow queries and reduce DB CPU usage by 25%.
3. Improve Frontend Performance & Load Time (within 90 days): Audit key frontend flows using Lighthouse, Core Web Vitals, and asset audits. Drive improvements via lazy loading, tree-shaking, and code splitting. Goal: get homepage and dashboard load times under 1.5s for 95% of users.
4. Make Infra & Monitoring Observability-First (within 120 days): Set up meaningful alerts and dashboards using Grafana, Loki, Tempo, and Prometheus. Lead infra-level debugging of thread stalls, IO throttling, and network latency. Goal: reduce time-to-detect and time-to-resolve for performance issues by 50%.
Important Tasks
1. First 30 Days - System Performance Audit: Do a full audit of backend, DB, infra, and frontend performance. Identify critical pain points and quick wins.
2. Debug a Live Performance Incident: Catch and resolve a real-world performance regression. It could be a Node.js memory leak, a slow MySQL join, or Redis job congestion. Share a full RCA and fix.
3. Create and Share Performance Playbooks (by Day 45): Build SOPs for slow query debugging, frontend perf checks, Redis TTL fixes, or Node.js memory leaks. Turn performance tuning into a team sport.
4. Guide Teams on Performance-Aware Development (within 90 days): Create internal micro-trainings or async reviews to help devs write faster APIs, reduce DB load, and spot regressions earlier.
5. Use AI or Smart Tooling in Diagnostics: Try out tools like Copilot for test coverage, or use AI-powered observability tools (e.g. Datadog AI, Loki queries, etc.) to accelerate diagnostics.
6. Build Flamegraph/Profiling Baselines: Set up and maintain performance profiling baselines (using Clinic.js, 0x, etc.) so regressions can be caught before they ship.
7. Review Queues and Caching Layer: Identify performance issues in Redis queues (retries, TTL delays, locking) and tune caching strategies across the app and DB.
8. Contribute to Performance Culture: Encourage tracking of real metrics: TTI, DB query time, API P95s. Collaborate with product and engineering to define what "fast enough" means.
Experience Level: 3-5 years
Tech Stack: Node.js, MySQL, Redis, Grafana, Prometheus, Clinic.js, Percona Toolkit
Culture Fit - Are You One of Us?
We're a fast-moving, globally distributed SaaS team where speed matters, not just in the product but in how we work. We believe in ownership, system thinking, and real accountability. If you like solving hard problems, value simplicity, and hate regressions, you'll thrive here.
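As a rough illustration of the slow-query tuning workflow described above (the team's own stack is Node.js; Python with mysql-connector-python is used here only for brevity), this sketch runs EXPLAIN on a suspect query and inspects the chosen index; the credentials, table, and query are invented.

```python
# Minimal sketch: run EXPLAIN on a suspect query and check which index is used
# and roughly how many rows MySQL expects to scan.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="saleshandy_demo"
)
cur = conn.cursor(dictionary=True)

suspect_query = (
    "SELECT * FROM emails WHERE campaign_id = %s ORDER BY scheduled_at DESC LIMIT 50"
)
cur.execute("EXPLAIN " + suspect_query, (123,))
for row in cur.fetchall():
    # 'key' is the index chosen by the optimizer; 'rows' is the estimated scan size.
    print(row["table"], row["key"], row["rows"])

cur.close()
conn.close()
```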

Posted 1 month ago

Apply

2.0 - 5.0 years

3 - 7 Lacs

Ahmedabad

Work from Office

We are seeking a talented and creative UI Designer to join our growing design team. The ideal candidate will have a strong passion for user-centered design and will be responsible for crafting seamless, visually appealing, and user-friendly interfaces for our products. As a UI Designer at Datanova, you will collaborate closely with product managers, developers, and other stakeholders to create exceptional user experiences.
Key Responsibilities:
- Design and develop visually appealing, easy-to-use user interfaces for web and mobile applications.
- Work collaboratively with product managers, UX designers, and developers to create consistent and user-centered designs.
- Create wireframes, mockups, prototypes, and high-fidelity designs to communicate design concepts clearly and effectively.
- Ensure that designs align with the company's branding guidelines and design standards.
- Perform UI design tasks, including button and icon design, typography, color schemes, layout design, and visual assets.
- Continuously improve the user interface based on user feedback and analytics.
- Stay up to date with the latest design trends, technologies, and best practices.
- Maintain a consistent design language and visual style across platforms and devices.
- Conduct usability testing and participate in user research to validate design decisions.
- Collaborate in creating and maintaining a design system that can scale with the product.
- Optimize designs for performance, accessibility, and responsiveness.
Requirements
- Bachelor's degree in Graphic Design, Interaction Design, or a related field.
- Proven experience as a UI Designer with a strong portfolio demonstrating design expertise in web and mobile applications.
- Proficiency in design tools such as Adobe XD, Sketch, Figma, and other relevant software.
- Understanding of design principles, layout, typography, and color theory.
- Ability to translate complex requirements into clean, user-friendly designs.
- Strong communication and collaboration skills with the ability to work in a cross-functional team.
- Experience with front-end development (HTML/CSS) is a plus.
- Familiarity with UI patterns, design systems, and user-centered design principles.
Benefits
Hybrid Working Model

Posted 1 month ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Noida

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
- .NET developer with SQL skills; the OGA Fin IT-Opx team develops applications for the Finance and Quality teams. Each developer works with end users to gather requirements and develop scalable applications.
- Analyze business requirements and build tools using .NET Core and SQL Server.
- Design and develop new automations based on requirements.
- Development, support, maintenance, and implementation of complex project modules.
- Design, code, test, debug, and document programs, working at the top technical level across all areas of applications programming.
- Evaluate, design, code and test improvements to complex modules as required.
- Design and develop applications to technical and functional programming standards.
- Write and test scripts, debug programs and integrate applications with third-party web services. To be successful in this role, you should have experience using server-side logic and work well in a team.
- Views, stored procedures, functions, triggers.
- Fix jobs that fail due to logic/code issues and modify jobs based on requirements.
- Analyze code, understand its purpose, and modify it to run successfully.
- Apply data modeling techniques to ensure development and implementation support efforts meet integration and performance expectations.
- Independently analyze, solve, and correct issues in real time, providing end-to-end problem resolution. Refine and automate regular processes, track issues, and document changes.
- Assist developers with complex query tuning and schema refinement.
- Provide support for critical production systems.
- Share domain and technical expertise, providing technical mentorship and cross-training to other peers and team members.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications:
- Graduate
- 3+ years of experience in full-stack application development on .NET Core and SQL Server
- Working experience with MVC design patterns
- Experience in .NET Core, MVC, ASP.NET, CSS, JavaScript, jQuery, Web API, SQL (Microsoft SQL Server), Azure
- Proven solid programming skills in .NET Core, MVC, SQL Server and .NET
- Experience in .NET maintenance and migration projects
- Proven debugging capabilities, resolving issues while executing code and automating processes
- Proven ability to handle multiple priorities and deadlines effectively and efficiently
Preferred Qualification:
- Experience in React
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 1 month ago

Apply

5.0 - 8.0 years

6 - 11 Lacs

Mumbai

Work from Office

Develop SAP HR/HCM applications with ABAP. Support PA/OM data handling, payroll schema, and HR interfaces. Experience in ECC/HANA environments, LSMW, IDocs, and SmartForms is essential. Must handle sensitive data securely and collaborate with HR teams.

Posted 1 month ago

Apply

5.0 - 8.0 years

6 - 11 Lacs

Surat

Work from Office

Develop SAP HR/HCM applications with ABAP. Support PA/OM data handling, payroll schema, and HR interfaces. Experience in ECC/HANA environments, LSMW, IDocs, and SmartForms is essential. Must handle sensitive data securely and collaborate with HR teams.

Posted 1 month ago

Apply

5.0 - 8.0 years

6 - 11 Lacs

Jaipur

Work from Office

Develop SAP HR/HCM applications with ABAP. Support PA/OM data handling, payroll schema, and HR interfaces. Experience in ECC/HANA environments, LSMW, IDocs, and SmartForms is essential. Must handle sensitive data securely and collaborate with HR teams.

Posted 1 month ago

Apply

5.0 - 8.0 years

6 - 11 Lacs

Ludhiana

Work from Office

Develop SAP HR/HCM applications with ABAP. Support PA/OM data handling, payroll schema, and HR interfaces. Experience in ECC/HANA environments, LSMW, IDocs, and SmartForms is essential. Must handle sensitive data securely and collaborate with HR teams.

Posted 1 month ago

Apply

5.0 - 8.0 years

6 - 11 Lacs

Pune

Work from Office

Develop SAP HR/HCM applications with ABAP. Support PA/OM data handling, payroll schema, and HR interfaces. Experience in ECC/HANA environments, LSMW, IDocs, and SmartForms is essential. Must handle sensitive data securely and collaborate with HR teams.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies