
901 Schema Jobs - Page 19

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

9.0 - 13.0 years

6 - 10 Lacs

Gurugram, Bengaluru

Work from Office

Responsibilities:
- Design, implement, and maintain Elasticsearch clusters to support large-scale search applications.
- Develop, optimize, and maintain custom search queries, aggregations, and indexing strategies.
- Work with data pipelines, including ingestion, transformation, and storage of structured and unstructured data.
- Integrate Elasticsearch with web applications, APIs, and other data storage systems.
- Implement scalability, performance tuning, and security best practices for Elasticsearch clusters.
- Troubleshoot search performance issues and enhance the relevance and efficiency of search results.
- Work with Kibana, Logstash, and Beats for visualization and data analysis.
- Collaborate with developers, data engineers, and DevOps teams to deploy and maintain search infrastructure.
- Stay updated on the latest Elasticsearch features, plugins, and best practices.

Required Skills & Qualifications:
- Strong experience with Elasticsearch (versions 7.x/8.x) and related tools (Kibana, Logstash, Beats).
- Proficiency in writing complex Elasticsearch queries, aggregations, and analyzers.
- Experience with full-text search, relevance tuning, and ranking algorithms.
- Knowledge of indexing, mapping, and schema design for optimal search performance.
- Proficiency in Python, Java, or Node.js for developing search applications.
- Experience with RESTful APIs and integrating Elasticsearch with various platforms.
- Familiarity with distributed systems, clustering, and high-availability configurations.
- Hands-on experience with Docker, Kubernetes, and cloud platforms (AWS, Azure, GCP) is a plus.
- Strong problem-solving skills and the ability to troubleshoot performance bottlenecks.
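As a rough illustration of the query and aggregation work this role describes, here is the shape of a request body Elasticsearch 7.x/8.x accepts on its `_search` endpoint — full-text matching combined with a filter and a terms aggregation. The index and field names are hypothetical.

```python
import json

# Hypothetical search request body: a full-text match combined with a
# range filter and a terms aggregation, in the JSON shape the
# Elasticsearch _search endpoint expects.
query_body = {
    "query": {
        "bool": {
            "must": [{"match": {"title": "wireless headphones"}}],
            "filter": [{"range": {"price": {"lte": 5000}}}],
        }
    },
    "aggs": {
        "by_brand": {"terms": {"field": "brand.keyword", "size": 10}}
    },
    "size": 20,
}

# In practice this body would be sent with an HTTP client or the
# official Python client, e.g. es.search(index="products", body=query_body).
print(json.dumps(query_body, indent=2))
```

The `bool` query separates scored clauses (`must`) from non-scoring filters, which is a common relevance-tuning lever mentioned in the listing.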

Posted 1 month ago

Apply

2.0 - 4.0 years

14 - 18 Lacs

Bengaluru

Work from Office

At HackerRank, we are on a mission to change the world to value skills over pedigree. We are a high-performing, mission-driven team that truly, madly, deeply cares about what we do. We don't see velocity and quality as tradeoffs; both matter. If you take pride in high-impact work and thrive in a driven team, HackerRank is where you belong.

About the team: Our BI & analytics team is driven by a clear mission: to provide actionable insights for business growth.

Recent Achievements: The team powered the 2025H2 planning cycle, centered on well-researched and relevant product adoption and revenue attribution metrics. Further, the GTM analytics team has made significant improvements in revenue reporting, customer journey analysis, and other root-cause analyses for the self-serve channel, which was previously underserved analytically.

Collaboration Style: Collaboration is at the heart of how we work. We balance synchronous and asynchronous methods, enabling us to work cohesively as a team while respecting individual workflows. This approach fosters efficiency and inclusivity in tackling tasks together.

About the role: As an Analytics Engineer, you will play a pivotal role in transforming raw data into actionable insights by building scalable data models and ensuring robust governance practices. You will collaborate with cross-functional teams to deliver high-quality datasets while supporting data governance initiatives such as maintaining data dictionaries, tracking lineage, and managing changes effectively.

What you'll do:
Data Engineering & Modeling
- Build and maintain scalable data models to transform raw data into analytics-ready datasets.
- Develop reusable SQL queries and modular pipelines using tools like dbt (Data Build Tool).
- Optimize database schema designs for performance and maintainability.
Data Governance
- Create and maintain comprehensive data dictionaries with consistent naming conventions.
- Track data lineage to document how data flows through systems and assess the impact of changes.
- Implement change-management protocols for updates to data models or pipelines, ensuring proper testing and communication with stakeholders.
Collaboration & Stakeholder Management
- Partner with the Analytics team to understand requirements and translate them into technical solutions.
- Collaborate with Engineering teams on instrumentation and tracking improvements.
- Provide documentation and training on new datasets or processes to enable self-service analytics capabilities.

You will thrive in this role if:
Technical Skills
- Strong proficiency in SQL for writing scalable queries and advanced transformations (e.g., window functions).
- Hands-on experience with dbt for managing data transformations and testing frameworks.
- Proficiency in Python for automating workflows and managing dependencies.
- Familiarity with modern data warehouse platforms (e.g., Snowflake, BigQuery).
Governance Expertise
- Experience maintaining data dictionaries and establishing consistent documentation practices.
- Knowledge of tools/processes for tracking data lineage across systems.
- Proven ability to implement and manage change-management protocols for datasets or pipelines.
Soft Skills
- Strong communication skills to translate technical concepts into business-friendly language.
- Ability to work collaboratively in cross-functional teams while managing competing priorities.
- Problem-solving mindset with a focus on delivering business-relevant insights.

What you bring:
- 2-4 years of experience in analytics engineering or a related field (data engineering or BI development).
- Solid understanding of data modeling principles and best practices for analytics use cases.
- Experience working in collaborative coding environments using Git-based workflows (e.g., code reviews, CI/CD pipelines).
- Experience with Spark and Spark Structured Streaming (Scala Spark).
- Experience with database technologies like Redshift or Trino.
- Experience with ETL design and orchestration using platforms like Apache Airflow or MageAI is a big plus.
- Experience querying massive datasets using engines/languages like SQL, Hive, Spark, and Trino.
- Experience with performance tuning of complex data warehouses and queries.
- Able to solve problems of scale, performance, security, and reliability.
- Self-driven, an initiative taker with good communication skills, able to work with cross-functional teams.

Want to learn more about HackerRank? Check out HackerRank.com to explore our products, solutions, and resources, and dive into our story and mission.

HackerRank is a proud equal employment opportunity and affirmative action employer. We provide equal opportunity to everyone for employment based on individual performance and qualification. We never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, marital, veteran, or disability status. All your information will be kept confidential according to EEO guidelines.

Notice to prospective HackerRank job applicants: Our recruiters use @hackerrank.com email addresses. We never ask for payment or credit-check information to apply, interview, or work here.
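The "advanced transformations (e.g., window functions)" skill above can be sketched with a throwaway example. This runs a ranking window function against an in-memory SQLite table (SQLite supports window functions since 3.25); the table and column names are invented for illustration, not from any real HackerRank dataset.

```python
import sqlite3

# Illustrative only: rank users by revenue within each plan using a
# window function -- the kind of analytics-ready transformation a dbt
# model might materialize. Data and schema are made up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE signups (user_id INT, plan TEXT, mrr INT);
    INSERT INTO signups VALUES (1,'pro',99),(2,'pro',79),(3,'free',0),(4,'free',0);
""")

rows = conn.execute("""
    SELECT user_id, plan,
           RANK() OVER (PARTITION BY plan ORDER BY mrr DESC) AS rnk
    FROM signups
    ORDER BY plan, rnk
""").fetchall()
print(rows)
```

`PARTITION BY plan` restarts the ranking per plan, so the two `free` users tie at rank 1 while the `pro` users rank 1 and 2 by revenue.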

Posted 1 month ago

Apply

8.0 - 13.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Relevant and total years of experience: 8+ years relevant, 10+ years total.

Detailed job description - Skill Set:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10+ years of overall experience, with 8+ years of relevant experience in Databricks, DLT, PySpark, and data modelling concepts: dimensional data modelling (star schema, snowflake schema).
- Proficiency in programming languages such as Python, PySpark, Scala, and SQL.
- Strong understanding of distributed computing principles and experience with big data technologies such as Apache Spark.
- Experience with cloud platforms such as AWS, Azure, or GCP, and their associated data services.
- Proven track record of delivering scalable and reliable data solutions in a fast-paced environment.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams.
- Good to have: experience with containerization technologies such as Docker and Kubernetes.
- Good to have: knowledge of DevOps practices for automated deployment and monitoring of data pipelines.

Mandatory Skills: Databricks, DLT, PySpark, data modelling concepts [dimensional data modelling (star schema, snowflake schema)], and SQL.
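The dimensional modelling concept this listing keeps naming — a star schema — can be shown in miniature: one fact table keyed to dimension tables, queried by joining and aggregating along dimension attributes. SQLite stands in for the warehouse here, and every table and value is illustrative.

```python
import sqlite3

# A minimal star schema: fact_sales references two dimension tables.
# All names and figures are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INT PRIMARY KEY, month TEXT);
    CREATE TABLE dim_product (product_key INT PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (date_key INT, product_key INT, amount REAL);

    INSERT INTO dim_date    VALUES (20250101,'Jan'),(20250201,'Feb');
    INSERT INTO dim_product VALUES (1,'Books'),(2,'Toys');
    INSERT INTO fact_sales  VALUES (20250101,1,100),(20250101,2,50),(20250201,1,70);
""")

# The classic star-schema query: join the fact table to its dimensions,
# then aggregate along dimension attributes.
rows = conn.execute("""
    SELECT d.month, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()
print(rows)
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables.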

Posted 1 month ago

Apply

2.0 - 5.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Who are we and what do we do?
ShareChat (https://sharechat.com/about) is India's largest homegrown social media company, with 325+ million monthly active users across all its platforms, including Moj, a leading short-video app that was launched in a record 30 hours. Founded in October 2015 with a vision of building an inclusive community that encourages and empowers each individual to share their unique journey and valuable experiences with confidence, we are spearheading India's internet revolution by building products through world-class AI and tech to evangelize the content ecosystem for India in regional languages. We believe in complete ownership of problem-solving while committing to speed and integrity in everything we do. We place the utmost importance on user empathy and strive to create a world-class experience for them every day. Join us to drive how the next billion users will interact on the internet!

What You'll Do:
* Develop and own the SEO strategy for ShareChat's website to drive traffic, engagement, and visibility.
* Stay updated with the latest Google algorithm changes and adapt strategies accordingly.
* Conduct on-page, off-page, and technical SEO audits regularly and execute optimization initiatives.
* Perform keyword research and content gap analysis, and optimize content based on search trends and user intent.
* Collaborate with content, design, and product teams to align SEO initiatives with user experience.
* Build and manage SEO dashboards using Google Analytics, Search Console, and other relevant tools.
* Track performance, analyze metrics, and present actionable insights and reports to internal stakeholders.
* Identify and work with freelancers or in-house writers/designers when needed to implement SEO plans.
* Take hands-on, end-to-end ownership from strategy to execution, including light team/project management.

Who are you?
* 2-5 years of hands-on SEO experience, preferably in consumer internet or product-based companies.
* Deep understanding of Google's evolving SEO landscape, including E-E-A-T, Core Web Vitals, indexing APIs, AI-driven search, and helpful content updates.
* Proficient with SEO tools: Google Search Console, GA4, SEMrush, Ahrefs, Screaming Frog, etc.
* Strong analytical mindset with the ability to derive insights from data.
* Comfortable with basic HTML, schema markup, and technical SEO fundamentals.
* A self-starter who enjoys working independently and owns outcomes.
* Bonus if you've previously managed SEO for a high-traffic website or regional content platforms.
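"Schema markup" in the requirements above refers to structured data that pages embed (typically as JSON-LD in a `<script type="application/ld+json">` tag) so search engines can render rich results. A sketch of such a payload, with every value invented for illustration:

```python
import json

# Hypothetical JSON-LD structured-data payload using schema.org
# vocabulary. Headline, date, and author are placeholder values.
article_ld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "datePublished": "2025-01-01",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# Serialized form, as it would appear inside the script tag.
snippet = json.dumps(article_ld, indent=2)
print(snippet)
```

Tools like Google's Rich Results Test validate exactly this serialized form.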

Posted 1 month ago

Apply

2.0 - 6.0 years

9 - 14 Lacs

Gurugram

Work from Office

Is that you?
- At least 1 year in a customer-facing technical role (e.g., solutions engineer, implementation specialist, pre-sales/solution consultant): you've worked directly with prospects or customers to understand their requirements, propose integration approaches, lead technical discussions, and validate solutions in real-world conditions.
- At least 1 year as a developer: you've personally built and shipped production-grade API integrations or backend features that went live.
- Solid grasp of integration fundamentals: hands-on experience with REST, JSON, OAuth 2.0, and webhooks; familiarity with API schema formats like OpenAPI and WSDL is expected.
- Proficient in Postman and one scripting language (Python or JavaScript/TypeScript): able to create test harnesses, mock APIs, or automate validation flows.
- Clarity in communication: write PRDs that engineers can implement without back-and-forth, draft clear and concise emails to customers and their IT teams, and document decisions rigorously.
- Structured thinking: break down complex problems into logical steps, clearly map workflows across systems, and prioritize effectively.
- Ownership mindset: you chase blockers, drive timelines, and push features to the finish line without waiting for hand-holding.
- Cross-functional collaboration: proactively align with engineering, sales, and customer teams to gather requirements, assign technical action items, and maintain accountability through project milestones.

What happens after you apply?
Step 1: Within 15 days of your application - which is wholesome, original & expressive - our People Team will reach out to you for a quick chat.
Step 2: Within 4-6 days of chatting with the People Team, you will get a call from someone from your future team to discuss the job role.
Step 3: If all goes well, we'll schedule a call with your future manager to deep-dive into the role with you and for you to show off your skills through a small task.
Step 4: After a quick interaction with the People Team, if our vibes match, a tête-à-tête with the inFeedo leadership team follows.
If we mutually enjoy the 4 steps, we onboard you with a big smile :)
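One of the integration fundamentals named above — webhooks — usually comes with a verification step: the sender signs the raw payload with a shared secret and the receiver recomputes the signature before trusting the event. A minimal sketch of that pattern, with the secret and payload invented:

```python
import hashlib
import hmac

# Common webhook-verification pattern: HMAC-SHA256 over the raw body
# with a shared secret, compared in constant time on the receiving end.
# The secret and event payload here are placeholders.
SECRET = b"shared-secret"

def sign(body: bytes) -> str:
    """Compute the hex HMAC-SHA256 signature the sender would attach."""
    return hmac.new(SECRET, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature: str) -> bool:
    """Recompute and compare without leaking timing information."""
    return hmac.compare_digest(sign(body), signature)

body = b'{"event":"invoice.paid","id":"evt_123"}'
sig = sign(body)
print(verify(body, sig), verify(b"tampered", sig))
```

Real providers vary in where the signature travels (a header, often with a timestamp to block replays), but the recompute-and-compare core is the same.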

Posted 1 month ago

Apply

7.0 - 14.0 years

20 - 25 Lacs

Bengaluru

Work from Office

We enable #HumanFirstDigital
- Strong experience with GraphQL schema design and query optimization
- Proficient in Apollo Server/Client or similar GraphQL frameworks
- Backend experience with Node.js, Python (Django/Flask/FastAPI), or Go
- Strong REST and GraphQL API knowledge and security best practices
- Experience in caching strategies, query batching, and pagination
- Exposure to performance profiling and monitoring of GraphQL endpoints
- Familiarity with CI/CD tools and containerized deployments
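The pagination experience this listing asks for is, in GraphQL, most often Relay-style cursor pagination: opaque cursors encode a position, and each page reports whether more data follows. A language-agnostic sketch in plain Python, with the dataset and cursor encoding invented for illustration:

```python
import base64

# Relay-style cursor pagination in miniature: opaque cursors encode an
# offset; a page of "edges" carries a cursor per node plus hasNextPage.
# The items and the cursor scheme are illustrative.
items = [f"item-{i}" for i in range(10)]

def encode_cursor(offset):
    return base64.b64encode(f"cursor:{offset}".encode()).decode()

def decode_cursor(cursor):
    return int(base64.b64decode(cursor).decode().split(":")[1])

def page(first, after=None):
    start = decode_cursor(after) + 1 if after else 0
    chunk = items[start:start + first]
    return {
        "edges": [
            {"node": n, "cursor": encode_cursor(start + i)}
            for i, n in enumerate(chunk)
        ],
        "hasNextPage": start + first < len(items),
    }

p1 = page(3)                                   # first page of 3
p2 = page(3, after=p1["edges"][-1]["cursor"])  # resume after page 1
print([e["node"] for e in p2["edges"]])
```

Unlike raw offset pagination, cursors stay stable when rows are inserted before the client's position, which is why GraphQL connection specs favor them.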

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Backend Engineer: MagellanCx

Expectations
We are looking for a backend engineer who can fully own the backend architecture of our platform, from the API layer to database modeling and cloud deployment. You will be responsible for maintaining and evolving all backend services and cloud infrastructure, including replacing the current limited .NET Core implementation with a modern, maintainable stack. Currently we expose a few REST endpoints, but over time we plan to introduce a GraphQL API layer, which you will help design and implement. This is a hands-on, solo contributor role to start. We're looking to build a flexible, modern backend stack, ideally using Node.js (TypeScript) or Golang. You will work closely with a dedicated frontend team and report directly to technical leadership.

Responsibilities
- Migrate existing REST API endpoints from .NET Core to Node.js (NestJS) or Golang backend microservices
- Architect modular, scalable backend services using clean architecture or domain-driven design principles
- Implement backend APIs that power a multi-tenant SaaS product, including:
  - Multi-tenant logic for data isolation, scoped permissions, and tenant-aware behaviors
  - Authentication and authorization middleware
  - Role-based access control for 3 system roles
- Design and implement a GraphQL API layer (schema-first, data loaders, modular resolvers)
- Implement observability patterns (structured logging, tracing, error monitoring) using tools like OpenTelemetry, Sentry, etc.
- Manage the KrakenD API Gateway, including:
  - Creating and updating route configurations
  - Integrating new services via request transformation
- Build and publish Docker containers to AWS ECR; deploy to ECS (Fargate or EC2)
- Configure and maintain GitHub Actions for CI/CD pipelines
- Design and manage cloud infrastructure networking:
  - Application Load Balancer (ALB) setup
  - Private networking for service-to-service communication
  - Planning for future service mesh adoption
- Design and evolve PostgreSQL database schemas
- Manage schema changes using code-based migration tools (not GUI tools)
- Collaborate with frontend and platform engineers to ensure seamless integration

Qualifications
- 5+ years of backend development experience with strong architectural decision-making
- Proficient in TypeScript/Node.js (NestJS preferred) or Golang
- Strong knowledge of GraphQL, including:
  - Unified schema design
  - Resolver architecture
  - DataLoader patterns and performance tuning
- Deep experience building and maintaining REST APIs
- Hands-on experience with:
  - KrakenD API Gateway
  - AWS services: ECR, ECS, ALB, and VPC networking
  - Docker and container orchestration
  - GitHub Actions CI/CD automation
- Fluent in PostgreSQL: schema design, indexing, migrations, and tuning
- Familiar with secure service-to-service networking and early-stage service mesh planning
- Comfortable working independently and proactively in a distributed team environment
- Clear communicator, highly accountable, and detail-oriented

Nice to Have
- Experience replacing legacy backends or leading replatforming efforts
- Familiarity with GraphQL federation patterns
- Exposure to Infrastructure-as-Code tools (e.g., Terraform, AWS CDK)
- Prior experience in a startup or high-ownership engineering environment
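The DataLoader pattern this listing names solves GraphQL's N+1 problem: many resolver calls collect keys, which are resolved in one batched backend call and cached. Real loaders (in Node.js or Python GraphQL stacks) are asynchronous; this synchronous pure-Python sketch only shows the batching and caching mechanics, with an invented in-memory "database".

```python
# Synchronous sketch of the DataLoader pattern: queue requested keys,
# resolve them with ONE batch call, cache the results. The backing
# data and batch function are made up for illustration.
FAKE_DB = {1: "alice", 2: "bob", 3: "carol"}
CALLS = []  # records each backend round-trip

def batch_load_users(keys):
    CALLS.append(list(keys))
    return [FAKE_DB.get(k) for k in keys]

class DataLoader:
    def __init__(self, batch_fn):
        self.batch_fn = batch_fn
        self.cache = {}
        self.queue = []

    def load(self, key):
        """Queue a key unless it is already cached or pending."""
        if key not in self.cache and key not in self.queue:
            self.queue.append(key)

    def dispatch(self):
        """Resolve all pending keys in a single batch call."""
        if self.queue:
            for k, v in zip(self.queue, self.batch_fn(self.queue)):
                self.cache[k] = v
            self.queue = []

loader = DataLoader(batch_load_users)
for k in (1, 2, 1, 3):   # four resolver-style lookups, one duplicate...
    loader.load(k)
loader.dispatch()         # ...collapse into a single backend round-trip
print(loader.cache, len(CALLS))
```

In an async implementation, `load` returns a future and dispatch happens automatically at the end of the event-loop tick; the batching idea is identical.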

Posted 1 month ago

Apply

10.0 - 15.0 years

20 - 25 Lacs

Hyderabad

Work from Office

About the Lead Data Engineer Role
As a Data Engineer with an ETL/ELT background, the candidate will design and develop reusable data ingestion processes from a variety of sources and build data pipelines for the Azure Synapse cloud data warehouse platform and reporting processes.

Responsibilities
- Design, develop, and implement ETL processes on Azure Cloud using Databricks and Azure Synapse.
- Advanced SQL knowledge; capable of writing optimized queries for faster data workflows.
- Must be extremely well versed in handling large data volumes and in using different tools to derive the required solution.
- Work with the offshore team, business analysts, and other data engineering teams to ensure alignment of requirements, methodologies, and best practices.

Requirements
- Bachelor's or Master's degree in Computer Science.
- 10+ years of hands-on software engineering experience.
- Proven work experience in Spark, Python, SQL, and any RDBMS.
- Experience designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architecture as well as high-scale or distributed RDBMS.
- Strong database fundamentals, including SQL, performance, and schema design.
- Understanding of CI/CD frameworks.
- Knowledge of DBT, Snowflake, and AWS is an added advantage.
- Ability to interpret/write custom shell scripts; Python scripting is a plus.
- Experience with the Azure platform and Synapse.
- Experience with Git / Azure DevOps.
- Ability to work in a fast-paced agile development environment.

Posted 1 month ago

Apply

4.0 - 5.0 years

9 - 10 Lacs

Bengaluru

Work from Office

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring Snowflake professionals in the following areas:

JD for Senior Snowflake Developer
Skills: Snowflake, SnowSQL, PL/SQL, any ETL tool

Job Description:
- 4-5 years of IT experience in analysis, design, development, and unit testing of data warehousing applications using industry-accepted methodologies and procedures.
- Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for Business Intelligence reporting.
- Strong experience with Snowpipe execution and the Snowflake data warehouse; deep understanding of Snowflake architecture and processing.
- Creating and managing automated data pipelines for both batch and streaming data using DBT.
- Designing and implementing data models and schemas to support data warehousing and analytics within Snowflake.
- Writing and optimizing SQL queries for efficient data retrieval and analysis.
- Deliver robust solutions through query optimization, ensuring data quality.
- Experience writing functions and stored procedures.
- Strong understanding of data warehouse principles using fact tables, dimension tables, and star and snowflake schema modelling.
- Analyse and translate functional specifications/user stories into technical specifications.
- Good to have: design/development experience in any ETL tool.
- Good interpersonal skills; experience handling communication and interactions between different teams.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded in four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture

Posted 1 month ago

Apply

2.0 - 5.0 years

5 - 8 Lacs

Chennai

Work from Office

The opportunity: Deliver product documentation in line with organizational needs (R&D, Product Support, Service, Customers, Legislative bodies).

How you'll make an impact:
- Experience in the Mechanical, Heavy Engineering, Rail, Industrial Engineering, Automobile, or Aerospace domains.
- Develop comprehensive technical manuals for mechanical and electrical products that comply with international standards.
- Maintain existing technical documentation based on technical inputs.
- Ability to explain complex technical issues in an easily understandable way for any target audience.
- Analyze input from SMEs and engineering data to create and update maintenance documents using technical writing standards.
- Ability to collaborate with SMEs in a global work environment.
- Utilize 2D drawings, 3D CAD data, and graphic options to visually represent step-by-step installation and service procedures for specific parts and assemblies.
- Create state-of-the-art illustrations, including exploded views in isometric angles, using illustration tools.
- Standardize content and document types using a content management system.
- Ensure compliance with applicable procedures and guidelines, as well as external and internal regulations.

Your background:
- Minimum 2-5 years of experience in creating/editing technical manuals (Technical Writer/Illustrator).
- Able to collaborate with SMEs to prioritize and manage multiple projects in a fast-paced, challenging environment.
- Able to read and interpret engineering drawings.
- Strong knowledge of the end-to-end documentation creation process.
- Working knowledge of Product Lifecycle Management systems (for example, Windchill/PDM Link) and Document Management Systems (for example, Teamcenter, xECM).
- Proficient in illustration tools like Adobe Illustrator, Arbortext IsoDraw, Creo Illustrate, or similar.
- Working knowledge of Schema ST4 or another content management system is preferred.
- Experience in web design, HTML, and CSS is preferred.
- Knowledge of virtual and augmented reality-based documentation is preferred.

Qualified individuals with a disability may request a reasonable accommodation if you are unable or limited in your ability to use or access the Hitachi Energy career site as a result of your disability. You may request reasonable accommodations by completing a general inquiry form on our website. Please include your contact information and specific details about your required accommodation to support you during the job application process.

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Mohali

Work from Office

We are looking for a Snowflake Developer with 5+ years of experience in Snowflake Data Warehouse and related tools. You will build, manage, and optimize data pipelines, assist in data integration, and contribute to data architecture. The ideal candidate should understand data modeling and ETL processes and have experience with cloud-based data platforms.

Key Responsibilities
- Design, develop, and maintain Snowflake data warehouses.
- Create and manage Snowflake schemas, tables, views, and materialized views.
- Implement ETL processes to integrate data from various sources into Snowflake.
- Collaborate with Data Engineers, Data Scientists, and Analysts to build efficient data pipelines.
- Ensure data integrity, security, and compliance with data governance policies.

Requirements
- Proficient in SQL, SnowSQL, and ETL processes.
- Strong experience in data modeling and schema design in Snowflake.
- Experience with cloud platforms (AWS, Azure, or GCP).
- Familiarity with data pipelines, data lakes, and data integration tools.
- Experience with tools like dbt, Airflow, or similar orchestration tools is a plus.

Work with us
SourceMash Technologies has been a leading solution provider for internet-based applications and product development since 2008. Be part of a company facilitated by highly skilled professionals dedicated to providing total IT solutions under one roof. We offer remarkable services in the areas of Software Development, Quality Assurance, and Support.

An employee welcome kit, including a custom notepad, T-shirt, water bottle, etc., is part of the onboarding package. SourceMash Technologies offers employee health insurance that also covers employees' family members under the same policy. Annual leave is paid at the rate applicable in the working period before the leave, and untaken leave cannot be counted toward mandatory notice periods.

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Coimbatore

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data pipelines.
- Ensure data quality throughout the data lifecycle.
- Implement ETL processes for data migration and deployment.
- Collaborate with cross-functional teams to understand data requirements.
- Optimize data storage and retrieval processes.

Professional & Technical Skills:
- Must have: proficiency in Google BigQuery.
- Strong understanding of data engineering principles.
- Experience with cloud-based data services.
- Knowledge of SQL and database management systems.
- Hands-on experience with data modeling and schema design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP HCM Personnel Administration
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business needs and technical specifications. Your role will require you to facilitate communication between stakeholders and the development team, ensuring that all parties are informed and engaged throughout the project lifecycle. Additionally, you will be responsible for monitoring project progress and making necessary adjustments to meet deadlines and quality standards.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and knowledge-sharing sessions to enhance team capabilities.
- Monitor project timelines and deliverables to ensure alignment with business objectives.

Professional & Technical Skills:
- Must have: proficiency in SAP HCM Personnel Administration.
- Good to have: experience with SAP SuccessFactors.
- Strong understanding of application design and development methodologies.
- Experience with project management tools and techniques.
- Ability to analyze and troubleshoot application issues effectively.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP HCM Personnel Administration.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP Native HANA SQL Modeling & Development
Good-to-have skills: Talend ETL, SAP BusinessObjects Data Services
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct regular team meetings to discuss progress and challenges.
- Stay updated on industry trends and technologies to enhance team performance.

Professional & Technical Skills:
- Must have: proficiency in SAP Native HANA SQL Modeling & Development.
- Good to have: experience with Talend ETL and SAP BusinessObjects Data Services.
- Strong understanding of database management and optimization.
- Expertise in data modeling and schema design.
- Hands-on experience with SAP HANA Studio and SAP HANA Cloud Platform.
- Knowledge of data integration and data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Native HANA SQL Modeling & Development.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

1.0 - 5.0 years

8 - 12 Lacs

Pune

Work from Office

Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title and Summary Moon#169 - Senior Data Engineer Who is Mastercard? Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all. Overview: Ethoca, a Mastercard company, is seeking a Senior Data Engineer to join our team in Pune, India to drive data enablement and explore big data solutions within our technology landscape. The role is visible and critical as part of a high-performing team - it will appeal to you if you have an effective combination of domain knowledge, relevant experience and the ability to execute on the details. You will bring cutting-edge software and full stack development skills with advanced knowledge of cloud and data lake experience while working with massive data volumes. You will own this - our teams are small, agile and focused on the needs of the high-growth fintech marketplace.
You will be working across functional teams within Ethoca and Mastercard to deliver on cloud strategy. We are committed to making our systems resilient and responsive yet easily maintainable on cloud. Key Responsibilities: Design, develop, and optimize batch and real-time data pipelines using Snowflake, Snowpark, Python, and PySpark. Build data transformation workflows using dbt, with a strong focus on Test-Driven Development (TDD) and modular design. Implement and manage CI/CD pipelines using GitLab and Jenkins, enabling automated testing, deployment, and monitoring of data workflows. Deploy and manage Snowflake objects using Schema Change, ensuring controlled, auditable, and repeatable releases across environments. Administer and optimize the Snowflake platform, handling performance tuning, access management, cost control, and platform scalability. Drive DataOps practices by integrating testing, monitoring, versioning, and collaboration into every phase of the data pipeline lifecycle. Build scalable and reusable data models that support business analytics and dashboarding in Power BI. Develop and support real-time data streaming pipelines (e.g., using Kafka, Spark Structured Streaming) for near-instant data availability. Establish and implement data observability practices, including monitoring data quality, freshness, lineage, and anomaly detection across the platform. Plan and own deployments, migrations, and upgrades across data platforms and pipelines to minimize service impacts, including developing and executing mitigation plans. Collaborate with stakeholders to understand data requirements and deliver reliable, high-impact data solutions. Document pipeline architecture, processes, and standards, promoting consistency and transparency across the team. Apply exceptional problem-solving and analytical skills to troubleshoot complex data and system issues.
Demonstrate excellent written and verbal communication skills when collaborating across technical and non-technical teams. Required Qualifications: Tenured in the fields of Computer Science/Engineering or Software Engineering. Bachelor's degree in computer science, or a related technical field including programming. Deep hands-on experience with Snowflake (including administration), Snowpark, and Python. Strong background in PySpark and distributed data processing. Proven track record using dbt for building robust, testable data transformation workflows following TDD. Familiarity with Schema Change for Snowflake object deployment and version control. Proficient in CI/CD tooling, especially GitLab and Jenkins, with a focus on automation and DataOps. Experience with real-time data processing and streaming pipelines. Strong grasp of cloud-based database infrastructure (AWS, Azure, or GCP). Skilled in developing insightful dashboards and scalable data models using Power BI. Expert in SQL development and performance optimization. Demonstrated success in building and maintaining data observability tools and frameworks. Proven ability to plan and execute deployments, upgrades, and migrations with minimal disruption to operations. Strong communication, collaboration, and analytical thinking across technical and non-technical stakeholders. Ideally you have experience in banking, e-commerce, credit cards or payment processing and exposure to both SaaS and premises-based architectures. In addition, you have a post-secondary degree in computer science, mathematics, or quantitative science.
Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard's security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach; and Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
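The test-first habit this posting asks for (dbt models developed under TDD) can be sketched in plain Python, with a function standing in for a dbt model. The function name and data here are invented for illustration:

```python
# Sketch only: a plain Python function stands in for a dbt model so the
# test-first workflow is runnable without a warehouse.
def dedupe_latest(rows):
    """Keep only the latest record per key - the same 'latest snapshot'
    pattern a dbt incremental model with a ROW_NUMBER() filter expresses."""
    latest = {}
    for row in rows:
        key = row["id"]
        if key not in latest or row["updated_at"] > latest[key]["updated_at"]:
            latest[key] = row
    return sorted(latest.values(), key=lambda r: r["id"])

# Test first: the expectation is written before the model logic and drives it,
# mirroring dbt's schema and data tests.
events = [
    {"id": 1, "updated_at": 2, "status": "settled"},
    {"id": 1, "updated_at": 1, "status": "pending"},
    {"id": 2, "updated_at": 1, "status": "pending"},
]
assert dedupe_latest(events) == [
    {"id": 1, "updated_at": 2, "status": "settled"},
    {"id": 2, "updated_at": 1, "status": "pending"},
]
print("tests pass")
```

In dbt itself the assertion would live in a test file and the transformation in SQL, but the discipline is the same: expectation first, logic second.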

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 9 Lacs

Mumbai

Work from Office

This role is eligible for our hybrid work model: Two days in-office. Why this job's a big deal: Enjoy working with big data? We do too! In fact, data lies at the very core of our business! As a data engineer you will be working on data systems that serve and generate billions of events each day. Being part of the Data Services team and working with a data warehouse you will leverage data to understand, meet, and anticipate the needs of finance business users. In this role you will get to: Participate in cloud migration and modernizing current data systems. Oversee data systems that serve millions of real-time events a day. Maintain and design early warning and analytics systems using cutting-edge big data technologies. Collaborate on building software that collects and queries data, while also composing queries for investigation purposes. Analyze and provide data-supported recommendations to improve product performance and customer acquisition. Diagnose and troubleshoot any issues within data infrastructure. Effectively collaborate and engage in team efforts, mentoring the junior members in the team, speak up for what you think are the best solutions and be able to converse respectfully and compromise when necessary. Who you are: Bachelor's degree in Computer Science or a related field. Minimum of 5 years of experience in software engineering and development, with a strong focus on Big Data technologies and data-driven solutions. Demonstrated expertise in Big Data solutions, including frameworks such as Apache Spark and Kafka. Proficiency in the Google Cloud Platform (GCP) stack is highly desirable. Advanced knowledge in data modeling and schema design for optimized data storage and retrieval. Strong command of databases, including CloudSQL (MySQL, PostgreSQL), BigQuery, BigTable, and Oracle. Extensive experience with GCP tools, specifically Dataflow, Dataproc and Datastream, for building and managing large-scale data processing pipelines.
Proficiency in data orchestration frameworks, particularly Apache Airflow, for scheduling and managing complex workflows. Proven ability to analyze, debug, and resolve data processing issues efficiently. Proficient in Python and/or Java and Unix scripting, with the ability to write efficient, maintainable code in either language. Experience with data visualization tools, such as Tableau or Looker, for representing data insights. Familiarity with advanced data processing strategies, including indexing, machine-learned ranking algorithms, clustering techniques, and distributed computing concepts. Illustrated history of living the values necessary to Priceline: Customer, Innovation, Team, Accountability and Trust. The Right Results, the Right Way is not just a motto at Priceline; it's a way of life. Unquestionable integrity and ethics is essential. #LI-hybrid Who we are WE ARE PRICELINE. Our success as one of the biggest players in online travel is all thanks to our incredible, dedicated team of talented employees. Priceliners are focused on being the best travel deal makers in the world, motivated by our passion to help everyone experience the moments that matter most in their lives. Whether it's a dream vacation, your cousin's graduation, or your best friend's wedding - we make travel affordable and accessible to our customers. Our culture is unique and inspiring (that's what our employees tell us). We're a grown-up startup. We deliver the excitement of a new venture, without the struggles and chaos that can come with a business that hasn't stabilized. We're on the cutting edge of innovative technologies. We keep the customer at the center of all that we do. Our ability to meet their needs relies on the strength of a workforce as diverse as the customers we serve. We bring together employees from all walks of life and we are proud to provide the kind of inclusive environment that stimulates innovation, creativity and collaboration.
Priceline is part of the Booking Holdings, Inc. (Nasdaq: BKNG) family of companies, a highly profitable global online travel company with a market capitalization of over $80 billion. Our sister companies include Booking.com, BookingGo, Agoda, Kayak and OpenTable. If you want to be part of something truly special, check us out! Flexible work at Priceline Priceline is following a hybrid working model, which includes two days onsite as determined by you and your manager (ideally selecting among Tuesday, Wednesday, or Thursday). On the remaining days, you can choose to be remote or in the office. Diversity and Inclusion are a Big Deal! To be the best travel dealmakers in the world, it's important we have a workforce that reflects the diverse customers and communities we serve. We are committed to cultivating a culture where all employees have the freedom to bring their individual perspectives, life experiences, and passion to work. Priceline is a proud equal opportunity employer. We embrace and celebrate the unique lenses through which our employees see the world. We'd love you to join us and add to our rich mix! Applying for this position We're excited that you are interested in a career with us. For all current employees, please use the internal portal to find jobs and apply. External candidates are required to have an account before applying. When you click Apply, returning candidates can log in, or new candidates can quickly create an account to save/view applications.
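The billions-of-events-per-day systems this listing describes are built from primitives like windowed aggregation. The sketch below (not Priceline's actual pipeline; the event names and window size are invented) shows the basic shape of bucketing an event stream into fixed one-minute windows, the core of an early-warning counter:

```python
from collections import defaultdict

# Illustrative sketch: bucket a stream of (timestamp, event_type) tuples into
# fixed one-minute windows, the basic shape of a streaming early-warning count.
def window_counts(events, window_s=60):
    """events: iterable of (epoch_seconds, event_type) tuples."""
    counts = defaultdict(int)
    for ts, kind in events:
        bucket = ts - (ts % window_s)   # floor the timestamp to its window start
        counts[(bucket, kind)] += 1
    return dict(counts)

stream = [(0, "search"), (30, "search"), (61, "book"), (75, "search")]
print(window_counts(stream))
# {(0, 'search'): 2, (60, 'book'): 1, (60, 'search'): 1}
```

In a real deployment the same keyed-window computation would run in a framework like Spark Structured Streaming or Dataflow rather than in-process.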

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Responsibilities

SPF activities:
1. Installation and setup of SPF.
2. Creation and maintenance of the SQL database for SPF.
3. Integrating SPPID, SPEL, SPI and SP3D tools with SPF.
4. Perform publish/retrieve mapping for all integrated tools.
5. Identifying and setting up user roles in SPF as per the tasks handled by individuals.
6. Setting up of PBS in SPF and publishing PBS for other tools to retrieve.
7. Coordinate PBS retrieval in SPPID, SPEL, SPI and SP3D tools.
8. Schema modelling and mapping in SPF (Authoring schema, Published schema, Graph def, View def, Adapter Tool Map schema) and merging of CMF with the Integration schema.
9. Datasheets/lists mapping.
10. Ad hoc report customization as per project requirements.
11. Data validation reports as per project requirements.
12. SPF admin items setup (methods, column sets, column items, etc.).
13. Excel publish schema and templates creation for SPF.
14. Monitor the publish queue and fix any schema and mapping errors.

Web client activities:
1. Create new entry points and Featureset as per project requirement.
2. Web client extensibility as per project requirement.

SDV activities:
1. Configure data exchange from SPF to SDx.
2. SDV mappings to push/exchange data from SPF to SDx.
3. Create jobs to exchange the data.
4. Configuring data validation rules.
5. Monitor the jobs and communicate errors to the respective stakeholders.

Education / Qualifications
Bachelor's degree in Engineering or a related field (master's preferred). SPF admin/S3D admin or competitive products experience.

Posted 1 month ago

Apply

10.0 - 15.0 years

7 - 9 Lacs

Bengaluru

Work from Office

About the Role: The candidate is expected to bring strong experience with transactional databases like MySQL and good exposure to NoSQL databases like MongoDB, Cassandra, etc. He will work as an IC to provide long-term solutions to tricky problems, considering both business and tech demands. He would work both within a group and individually to move projects end to end with a long-term vision. In-depth MySQL knowledge and exposure to newer NoSQL technologies is a must. Duties and Responsibilities: Operate and enhance our large, highly-available database infrastructure, utilising cloud technologies. Able to lead technical architecture discussions and help drive technical decisions within your team. Be a strong contributor to the development of platform services including architecture, provisioning, configuration, deployment, and support. React to production deficiencies by continuously implementing automation, self-healing, and real-time monitoring in production systems. Demonstrated operations experience with the Linux platform (e.g. Ubuntu, RHEL, OEL), including administration, management, and troubleshooting. Familiarity with security practices in web application delivery and general knowledge of network topology. Partner with the distributed team in prototyping new database platform services. Participate in our 24x7 on-call rotation and collaborate with our operations team to triage and resolve production issues. Requirements: You've been working as a DBA/DBE with a DevOps mindset and the skill to develop automation, with 10+ years of industry experience. Extensive experience with database technologies, especially MySQL. Experience with NoSQL datastores like Cassandra/MongoDB will be an added advantage. You possess experience with at least one of the major cloud providers, like Amazon Web Services, Microsoft Azure, or Google Cloud.
Proven expertise in database administration, including a solid understanding of related programming languages, clustering, backup/restore technologies, replication, HA and security. Well versed in MySQL internal memory and disk structures. Ability to solve complex problems at the database and system level to provide an end-to-end solution. Able to handle critical escalations from a technical point of view, with a clear understanding of Linux systems, clouds and databases. Experience in administration and maintenance in distributed environments. Prior experience in understanding and developing automation using Python/Perl/shell. Very strong communication skills: explaining complex technical concepts to designers, support, and other engineers comes naturally to you. You enjoy helping onboard new team members, mentoring, and teaching others. Well versed with complex schema designs and solutioning for huge transactional databases and NoSQL databases. Who are we? Myntra is India's leading fashion and lifestyle platform, where technology meets creativity. As pioneers in fashion e-commerce, we've always believed in disrupting the ordinary. We thrive on a shared passion for fashion, a drive to innovate to lead, and an environment that empowers each one of us to pave our own way. We're bold in our thinking, agile in our execution, and collaborative in spirit. Here, we create MAGIC by inspiring vibrant and joyous self-expression and expanding fashion possibilities for India, while staying true to what we believe in. We believe in taking bold bets and changing the fashion landscape of India. We are a company that is constantly evolving into newer and better forms and we look for people who are ready to evolve with us. From our humble beginnings as a customization company in 2007 to being technology and fashion pioneers today, Myntra is going places and we want you to take part in this journey with us.
Working at Myntra is challenging but fun - we are a young and dynamic team, firm believers in meritocracy and equal opportunity, who encourage intellectual curiosity and empower our teams with the right tools, space, and opportunities.
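One concrete form the self-healing automation mentioned in this listing can take is retrying transient database failures with exponential backoff. This is a hedged sketch: `flaky_ping` stands in for any driver call that may raise transiently (e.g. a lost MySQL connection), and none of these names belong to a specific library:

```python
import time

# Sketch of self-healing automation: retry a flaky operation with
# exponential backoff. `op` stands in for any transiently failing DB call.
def with_backoff(op, retries=4, base_delay=0.01):
    for attempt in range(retries):
        try:
            return op()
        except ConnectionError:
            if attempt == retries - 1:
                raise                      # exhausted: surface the failure
            time.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}
def flaky_ping():
    """Hypothetical health check that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("lost connection to MySQL server")
    return "ok"

result = with_backoff(flaky_ping)
print(result)  # ok (after two transient failures)
```

Real tooling would also cap the total wait, add jitter to avoid thundering herds, and distinguish retryable from fatal errors.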

Posted 1 month ago

Apply

15.0 - 20.0 years

50 - 70 Lacs

Kalyani, Bengaluru

Work from Office

Job Description: About Us Join the Avi Application Load Balancer Analytics team, which plays a critical role in driving insights, performance optimization, and intelligent automation across our platforms. Beyond our daily responsibilities, our team has a proven history of innovation, with an impressive portfolio of multiple patents. We encourage and support each other in exploring new ideas and turning them into deployable solutions, fostering a culture of creativity and intellectual curiosity. We're looking for a seasoned Staff Engineer to lead complex analytics initiatives that intersect big data, AI/ML, distributed systems, and high-performance computing. This is an on-site position in Bangalore, India, where you will be part of a larger cross-border team of smart and motivated engineers located both in Bangalore, India and Palo Alto, USA, while enjoying the autonomy and support to express yourself creatively and deliver at your best. What You'll Do Architect and lead the design of scalable, high-performance data analytics platforms and systems. Develop and optimize services using GoLang, C++, and Python for ingesting, processing, and analyzing massive volumes of telemetry and network data. Implement data pipelines and analytics workflows across SQL/NoSQL databases (e.g., PostgreSQL, TimescaleDB, Redis, etc.). Build and manage search and indexing systems using OpenSearch or Lucene, ensuring low-latency querying and efficient data retrieval. Design and enforce strong data modeling practices across structured and unstructured data sources. Collaborate closely with ML engineers to deploy AI/ML models for predictive analytics, anomaly detection, and intelligent insights.
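The low-latency querying this role calls for rests on one data structure: the inverted index. The toy below is not OpenSearch - it is a minimal sketch of the posting-list structure Lucene-style engines build, with invented document text, to show why term lookup is fast:

```python
from collections import defaultdict

# Toy inverted index: the core structure behind Lucene/OpenSearch term lookup.
class TinyIndex:
    def __init__(self):
        self.postings = defaultdict(set)   # term -> set of doc ids
        self.docs = {}

    def add(self, doc_id, text):
        self.docs[doc_id] = text
        for term in text.lower().split():
            self.postings[term].add(doc_id)

    def search(self, query):
        # AND semantics: intersect the posting lists of all query terms.
        terms = query.lower().split()
        if not terms:
            return []
        hits = set.intersection(*(self.postings[t] for t in terms))
        return sorted(hits)

idx = TinyIndex()
idx.add(1, "packet loss on edge router")
idx.add(2, "high latency on edge switch")
idx.add(3, "router firmware upgrade")
print(idx.search("edge router"))  # [1]
```

Real engines add tokenization/analysis, ranking (e.g. BM25), and compressed on-disk postings, but the lookup-by-term-then-intersect shape is the same.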
Must have - 15+ years of experience in software/data engineering, preferably within large-scale networking, cloud infrastructure, or data platforms. Expertise in GoLang, C++, and Python with production-level experience in performance-critical systems. Deep understanding of SQL and NoSQL databases , their tradeoffs, scaling strategies, and schema design. Strong hands-on experience with search technologies such as OpenSearch, Lucene , or Elasticsearch. Proven experience with data modeling , pipeline optimization, and data architecture. Solid foundation in AI/ML concepts with applied experience in deploying models into production analytics systems. Strong communication and leadership skills with a passion for mentorship and technical excellence. Nice to Have - A strong background in networking; with proven experience in building high performance networking appliances Experience in telemetry, network data analytics, or observability systems. Familiarity with Kubernetes, Kafka, Spark, or similar distributed systems technologies. Broadcom is proud to be an equal opportunity employer. We will consider qualified applicants without regard to race, color, creed, religion, sex, sexual orientation, national origin, citizenship, disability status, medical condition, pregnancy, protected veteran status or any other characteristic protected by federal, state, or local law. We will also consider qualified applicants with arrest and conviction records consistent with local law. If you are located outside USA, please be sure to fill out a home address as this will be used for future correspondence.

Posted 1 month ago

Apply

5.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Role: SEO Specialist with a strong focus on Off-Page Optimization and Link Building Experience: 5-7 years of hands-on experience in link building or off-page SEO Key Responsibilities: Conduct keyword research and develop optimization strategies aligned with business goals. Plan and execute ethical (white-hat) link building campaigns. Identify link opportunities through competitor backlink analysis and content gap research. Perform outreach campaigns via email and LinkedIn to build relationships with webmasters, bloggers, and publishers. Maintain a diverse and balanced backlink profile by tracking domain authority, relevance, and anchor text distribution. Collaborate with the SEO and content teams to ensure link acquisition supports target pages, keywords, and business goals. Monitor new and lost backlinks, analyze impact, and report performance using Google Search Console and backlink monitoring tools. Execute local and geo-targeted link building strategies to strengthen regional SEO visibility. Contribute to AEO efforts by promoting FAQ, schema-structured content, and snippet-worthy assets. Required Skills & Qualifications: Strong understanding of Google Search ranking signals and link quality metrics. Proficiency with SEO tools: Ahrefs, Semrush, BuzzStream, Hunter.io, Google Search Console. Excellent communication and email outreach skills. Ability to analyze backlink profiles, identify toxic links, and manage disavow files. Experience with local SEO and geo-specific link strategies is a plus. Familiarity with Answer Engine Optimization (AEO) and SERP feature optimization preferred. Ability to prioritize tasks, meet deadlines, and manage SEO initiatives independently. Exposure to local SEO, schema markup, and SERP feature optimization is a plus.
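The "FAQ, schema-structured content" this posting mentions is usually delivered as schema.org FAQPage JSON-LD embedded in the page. A minimal sketch of generating that markup from plain question/answer pairs (the helper name and sample Q&A are invented):

```python
import json

# Minimal sketch: build schema.org FAQPage JSON-LD from (question, answer) pairs.
def faq_jsonld(pairs):
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in pairs
        ],
    }

markup = faq_jsonld([
    ("What is off-page SEO?",
     "Signals earned outside your own site, such as backlinks."),
])
# In a page template this would be emitted inside
# <script type="application/ld+json"> ... </script>
print(json.dumps(markup, indent=2))
```

Validating the emitted JSON against Google's Rich Results Test before shipping is the usual workflow.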

Posted 1 month ago

Apply

4.0 - 6.0 years

10 - 14 Lacs

Mumbai

Work from Office

Manager: SEO and Growth Marketing Location: Mumbai or Remote Department: Marketing Experience Level: 4-6 years Employment Type: Full-Time Reports to: Associate Director - Digital Marketing Who We Are & What We Do At Disprz, we are on a mission to redefine workforce learning. As a leading AI-powered Learning and Skilling Suite, we help enterprises unlock their employees' potential through personalized learning paths, deep skill insights, and real-world content. Trusted by 350+ enterprises globally, we empower HR and L&D teams to drive business outcomes through impactful upskilling and performance solutions. About the Role: We're looking for a strategic, hands-on SEO & Growth Marketing Manager to lead Disprz's organic growth engine. This isn't your traditional SEO role. You'll be at the cutting edge of search innovation, applying not just technical SEO and content strategy, but also Answer Engine Optimization (AEO), Generative Engine Optimization (GEO), and AI-optimized content (AISO) to fuel next-level discoverability and lead generation. You'll play a pivotal role in driving qualified traffic, boosting our visibility across both classic and AI-powered search platforms, and optimizing the user journey to convert visitors into high-intent leads. This role is perfect for someone who's excited to experiment, iterate, and win in a fast-evolving search landscape. Key Responsibilities: Lead the full-spectrum SEO strategy, from technical audits and content-led SEO to advanced on-page optimization, AEO, GEO, and AISO. Own organic traffic and lead growth targets: Uncover ranking opportunities for high-intent keywords across key markets (India, SEA, Middle East). Perform ongoing keyword research, content gap analysis, and competitive benchmarking to drive data-backed content decisions. Collaborate with web and design teams to build SEO-optimized, high-converting landing pages and resource hubs.
Drive conversion rate optimization (CRO) through A/B testing, heatmaps, and performance audits across organic entry points. Partner with content and product marketing teams to scale search-optimized content (blogs, guides, product pages). Support organic brand-building initiatives, from influencer marketing and LinkedIn thought leadership to YouTube SEO. Stay ahead of the curve on Google algorithm updates, AI-driven search trends, and evolving user behavior, and translate insights into action. Required Skills & Qualifications 3-5 years of hands-on experience in B2B SEO, growth marketing, or content-led acquisition. Strong foundation in technical SEO, schema markup, AEO, GEO, and AI-optimized content practices. Proficiency in SEO tools such as Ahrefs, Semrush, GA4, Clarity, Hotjar, etc. Experience in social media marketing and influencer marketing across platforms like LinkedIn and YouTube. Bonus: Experience in ABM, email marketing, or integrating SEO into full-funnel demand gen strategies. Why This Role is Exciting This is your chance to not just keep up with SEO evolution but to lead it. At Disprz, you'll experiment with how people discover content in a world of AI-driven search, voice answers, and zero-click journeys. You'll have the freedom to innovate, test bold ideas, and leave your mark on a fast-growing global SaaS brand. Why Join Us Competitive salary and benefits package. Opportunities for career growth and professional development. A collaborative, forward-thinking work environment. Flexible working hours and the option for hybrid/remote work. Be a key part of a creative and dynamic team driving innovation.

Posted 1 month ago

Apply

4.0 - 6.0 years

4 - 8 Lacs

Chennai

Work from Office

We're seeking a results-driven Senior Data Analyst to lead the development of scalable data pipelines, optimize segmentation strategies, and deliver actionable insights that power our marketing and revenue teams. You'll collaborate across Marketing, Sales, and Tech to align data initiatives with business growth. Key Responsibilities: Strategic Data Acquisition: Lead initiatives to source high-quality B2B data from diverse platforms using defined ICP frameworks. Own scraping strategy and compliance standards. Data Automation & Scalability: Design and maintain Python-based automation to support bulk data extraction, transformation, and enrichment at scale. Database Hygiene & Integrity: Oversee data cleaning, deduplication, validation, and schema structuring to ensure quality and consistency across systems. Segmentation for Growth: Build and manage data models that power hyper-targeted outreach and campaign personalization. Enable dynamic audience segmentation. Insight Generation & Reporting: Translate complex datasets into digestible dashboards and reports using SQL, Excel/BI tools. Identify trends, gaps, and opportunities to inform GTM strategy. Cross-Functional Leadership: Collaborate with Marketing and Sales to refine ICPs, inform ABM strategies, and improve lead scoring models. Recommend process improvements based on analytics. Requirements: Experience: 3-5 years in a data-focused role, preferably supporting marketing, sales, or growth teams. Technical Proficiency: Proficiency in Python for data scraping and automation. Basic SQL (MySQL or similar) for data querying and database management. Familiarity with Excel, Power BI, or other visualization tools. Domain Knowledge: Strong understanding of B2B marketing funnels, lead scoring, and ICP-driven data sourcing. Proven track record of enabling targeted marketing initiatives through data-driven insights.
Soft Skills: Excellent communication and stakeholder management. Ability to simplify complex findings and drive data adoption across teams. If you're a strategic thinker who thrives on turning raw data into growth opportunities, we'd love to speak with you.
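The cleaning, deduplication, and validation step this listing describes can be sketched in a few lines of Python. This is illustrative only - the field names and validation rule are invented, and real pipelines would use proper email validation:

```python
# Illustrative hygiene step: normalise and deduplicate scraped contact
# records before they reach the database.
def clean_contacts(records):
    seen, out = set(), []
    for rec in records:
        email = rec.get("email", "").strip().lower()
        if not email or "@" not in email:
            continue                      # drop rows failing a crude validity check
        if email in seen:
            continue                      # dedupe on the normalised email
        seen.add(email)
        out.append({"email": email, "company": rec.get("company", "").strip()})
    return out

raw = [
    {"email": " Jane@Acme.com ", "company": "Acme "},
    {"email": "jane@acme.com", "company": "Acme"},
    {"email": "not-an-email", "company": "??"},
]
print(clean_contacts(raw))
# [{'email': 'jane@acme.com', 'company': 'Acme'}]
```

At scale the same normalise-then-dedupe logic typically moves into SQL (e.g. a `ROW_NUMBER()` over the normalised key), but the contract is identical.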

Posted 1 month ago

Apply

8.0 - 12.0 years

25 - 30 Lacs

Bengaluru

Work from Office

About Tarento: Tarento is a fast-growing technology consulting company headquartered in Stockholm, with a strong presence in India and clients across the globe. We specialize in digital transformation, product engineering, and enterprise solutions, working across diverse industries including retail, manufacturing, and healthcare. Our teams combine Nordic values with Indian expertise to deliver innovative, scalable, and high-impact solutions. We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you'll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose. About the Role: We are looking for a Backend Developer with hands-on experience in building and maintaining backend systems using Python and FastAPI, with MongoDB as the primary database. Our application stack is lean and the backend logic is straightforward, but we move fast and value clean, maintainable code. You will play a key role in a small but agile team that delivers features quickly while maintaining production-grade stability. Key Responsibilities: Design, develop, and maintain RESTful APIs using FastAPI Integrate MongoDB for data storage and retrieval Write clean, well-structured, and testable code Work closely with frontend and product teams to ship features rapidly Participate in code reviews, debugging, and performance tuning Ensure timely delivery in a fast-paced development environment Requirements Strong experience in Python, especially with FastAPI or similar frameworks (Flask, Django, etc.) Experience with MongoDB (schema design, indexing, aggregation, etc.)
Familiarity with API design best practices. Ability to write clean and maintainable code with proper documentation. Comfortable with Git and collaborative development. Ability to adapt quickly and work under tight deadlines. Good communication skills and a team-player attitude.
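The MongoDB aggregation experience this posting asks for boils down to the $match / $group / $sort pipeline shape. The sketch below emulates that shape over plain Python dicts so it runs without a Mongo server; the collection contents and field names are invented:

```python
from collections import defaultdict

# Pure-Python stand-in for a MongoDB aggregation pipeline, run over plain
# dicts so no server is needed. Comments show the equivalent pipeline stage.
orders = [
    {"status": "paid", "city": "Chennai", "total": 120},
    {"status": "paid", "city": "Pune", "total": 80},
    {"status": "open", "city": "Chennai", "total": 30},
    {"status": "paid", "city": "Chennai", "total": 50},
]

# $match: {"status": "paid"}
paid = [o for o in orders if o["status"] == "paid"]

# $group: {"_id": "$city", "revenue": {"$sum": "$total"}}
revenue = defaultdict(int)
for o in paid:
    revenue[o["city"]] += o["total"]

# $sort: {"revenue": -1}
report = sorted(revenue.items(), key=lambda kv: -kv[1])
print(report)  # [('Chennai', 170), ('Pune', 80)]
```

Against a real collection the same result comes from one `collection.aggregate([...])` call with those three stages, ideally backed by an index on the matched field.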

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to design and implement data platform solutions.
- Develop and maintain data pipelines for efficient data processing.
- Optimize data storage and retrieval processes.
- Implement data security measures to protect sensitive information.
- Conduct performance tuning and troubleshooting of data platform components.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of cloud data platforms like AWS or Azure.
- Experience with SQL and database management systems.
- Hands-on experience with ETL tools for data integration.
- Knowledge of data modeling and schema design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Qualification 15 years full time education

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Kolkata

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP HCM Payroll
Good-to-have skills: NA
Minimum experience required: 3 year(s)
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
Perform independently and grow into an SME.
Participate actively and contribute in team discussions.
Contribute solutions to work-related problems.
Develop and implement SAP HCM Payroll solutions.
Collaborate with cross-functional teams to analyze and address business requirements.
Conduct testing and debugging of applications to ensure optimal performance.
Provide technical support and guidance to end users.
Stay updated on industry trends and best practices to enhance application development processes.

Professional & Technical Skills:
Must-have: Proficiency in SAP HCM Payroll.
Strong understanding of SAP HR modules.
Experience in SAP Payroll configuration and customization.
Knowledge of the ABAP programming language.
Hands-on experience in SAP Payroll schema and rules configuration.

Additional Information:
The candidate should have a minimum of 3 years of experience in SAP HCM Payroll.
This position is based at our Kolkata office.
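A payroll schema, as referenced in the skills above, is essentially an ordered chain of processing rules applied to wage types. Purely as a conceptual illustration (this is not SAP or ABAP code; all rule and wage-type names are hypothetical), a minimal Python sketch of that chaining pattern:

```python
def apply_rules(wage_types, rules):
    """Apply an ordered list of rules to a dict of wage-type amounts.

    Each rule takes and returns the wage-type dict, mirroring how a
    payroll schema chains processing steps in sequence.
    """
    for rule in rules:
        wage_types = rule(dict(wage_types))  # copy keeps rules side-effect free
    return wage_types


def overtime_rule(wt):
    # Hypothetical: pay overtime hours at 1.5x the base hourly rate.
    wt["/OT_PAY"] = wt.get("OT_HOURS", 0) * wt["HOURLY_RATE"] * 1.5
    return wt


def gross_rule(wt):
    # Hypothetical: gross pay is base pay plus any overtime pay.
    wt["/GROSS"] = wt["BASE_PAY"] + wt.get("/OT_PAY", 0)
    return wt
```

The point of the sketch is only the ordering: `gross_rule` depends on `overtime_rule` having run first, just as later schema steps consume wage types produced by earlier ones.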

Posted 1 month ago

Apply