Jobs
Interviews

89 CosmosDB Jobs - Page 3

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Web API Developer at Vagaro Technologies Pvt Ltd, you will leverage your 3-6 years of experience to contribute to the development of our all-in-one business management platform and online marketplace for the salon, spa, and fitness industries. You will work with a talented team of professionals to design and implement robust RESTful APIs that serve both businesses and end-users.

Your key responsibilities will include:

- Demonstrating strong expertise in C#, .NET Core, and ASP.NET Core, and building efficient RESTful APIs.
- Collaborating with front-end developers to create cohesive, user-friendly APIs that align with business requirements.
- Implementing secure authentication mechanisms such as OAuth2 and JWT to protect sensitive data.
- Designing API structures with a focus on scalability, performance, and maintainability.
- Integrating APIs with databases such as SQL Server or NoSQL stores like MongoDB and CosmosDB.
- Writing clean, maintainable code that follows coding standards and best practices.
- Performing code reviews and unit testing, and assisting in integration testing of APIs.
- Creating detailed API documentation using tools like Swagger, Readme, Postman, or API Blueprint.
- Collaborating with cross-functional teams to define and execute API specifications.
- Ensuring that APIs are well documented and easy for third-party developers to integrate with.
- Implementing robust security practices throughout the API development lifecycle, including proper use of HTTPS, authentication, and authorization.
- Debugging and troubleshooting issues in both production and non-production environments.
- Providing post-deployment support and monitoring for issue resolution, fixes, and performance optimization.

This role requires familiarity with web API frameworks, middleware, object-oriented programming principles, and SOLID design patterns and best practices. Additionally, you should have experience working with microservices architecture, API versioning, error handling, and logging. Your ability to work with various databases, conduct unit testing, and ensure API security and compliance will be crucial to your success in this position. Join us at Vagaro Technologies Pvt Ltd and be part of a dynamic team that values work-life balance, offers attractive perks, and encourages professional growth.
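The listing above calls for JWT-based authentication. As a rough illustration of what a JWT actually is (independent of any particular framework), the sketch below signs and verifies an HS256 token using only the Python standard library; a real service would use a maintained library such as PyJWT or the ASP.NET Core JwtBearer middleware rather than hand-rolled code like this.

```python
import base64, hashlib, hmac, json

def _b64url(data: bytes) -> str:
    # JWT uses unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = ".".join(
        _b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, payload)
    )
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{_b64url(sig)}"

def verify_jwt(token: str, secret: bytes) -> dict:
    signing_input, _, sig_b64 = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    # Constant-time comparison to avoid timing side channels
    if not hmac.compare_digest(_b64url(expected), sig_b64):
        raise ValueError("bad signature")
    payload_b64 = signing_input.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

token = sign_jwt({"sub": "user-42", "role": "admin"}, b"dev-secret")
print(verify_jwt(token, b"dev-secret"))
```

A production token would also carry registered claims such as `exp` and `iss`, and verification would check them; this sketch shows only the signing scheme.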

Posted 1 month ago

Apply

6.0 - 11.0 years

5 - 10 Lacs

Pune, Chennai, Bengaluru

Work from Office

.NET Core, Angular, Azure PaaS development (Logic Apps, Azure Functions, and CosmosDB)

Posted 1 month ago

Apply

2.0 - 6.0 years

4 - 9 Lacs

Bengaluru, Karnataka, India

On-site

Azure DevOps Engineer with a minimum of 8 years of relevant work experience. DevOps experience building pipelines and infrastructure is mandatory.

Requirements:
- Build pipelines with Azure DevOps and GitHub.
- Knowledge of Bicep and YAML to build pipelines in ADO.
- Basic Azure Fundamentals certification.
- Experience working with Azure services like App Gateway, WAF, Network Security Groups, Storage Accounts, CosmosDB, etc.
- Experience with Azure resources/concepts like vNet, managed identity, Key Vault, App Config, App Insights, monitoring alerts, deployment logs, approval gates, deployment cycles, and branching strategies.
- Good communication skills in English.

Roles and responsibilities:
- Manage Dynamics deployments.
- Manage proxy deployments (running ADO pipelines/GitHub Actions).
- Update the ADO board to keep deployed work items current.
- Coordinate with the QA team to run smoke tests post-deployment.
- Communicate deployments on the relevant Teams channels.

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.
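The role above centers on YAML pipelines in Azure DevOps. As a hedged illustration only (the service-connection name, trigger branch, and Bicep file path are assumptions, not details from the posting), a minimal ADO pipeline that deploys a Bicep template might look like:

```yaml
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

steps:
  - task: AzureCLI@2
    displayName: Deploy Bicep template
    inputs:
      azureSubscription: my-service-connection   # assumed connection name
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        az deployment group create \
          --resource-group my-rg \
          --template-file infra/main.bicep
```

A real pipeline for this role would typically add approval gates and separate stages per environment, as the posting's mention of "Approval Gates, Deployment Cycles, Branching Strategies" suggests.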

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Haryana

On-site

Genpact is a global professional services and solutions firm dedicated to delivering outcomes that shape the future. With a workforce of over 125,000 professionals spanning more than 30 countries, we are fueled by our innate curiosity, entrepreneurial agility, and commitment to creating lasting value for our clients. Our purpose, the relentless pursuit of a world that works better for people, drives us to serve and transform leading enterprises, including the Fortune Global 500, leveraging our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently seeking applications for the position of Principal Consultant - Databricks Lead Developer. As a Databricks Developer in this role, you will solve cutting-edge real-world problems to meet both functional and non-functional requirements.

Responsibilities:
- Keep abreast of new and emerging technologies and assess their potential application for service offerings and products.
- Collaborate with architects and lead engineers to devise solutions that meet functional and non-functional requirements.
- Demonstrate proficiency in understanding relevant industry trends and standards.
- Showcase strong analytical and technical problem-solving skills.
- Possess experience in the Data Engineering domain.

Minimum qualifications:
- Bachelor's Degree or equivalency in CS, CE, CIS, IS, MIS, or an engineering discipline, or equivalent work experience.
- <<>> years of experience in IT.
- Familiarity with new and emerging technologies and their possible applications for service offerings and products.
- Collaboration with architects and lead engineers to develop solutions meeting functional and non-functional requirements.
- Understanding of industry trends and standards.
- Strong analytical and technical problem-solving abilities.
- Proficiency in either Python or Scala, preferably Python.
- Experience in the Data Engineering domain.

Preferred qualifications:
- Knowledge of Unity Catalog and basic governance.
- Understanding of Databricks SQL Endpoint.
- Experience with CI/CD for building Databricks job pipelines.
- Exposure to migration projects for building unified data platforms.
- Familiarity with DBT, Docker, and Kubernetes.

If you are a proactive individual with a passion for innovation and a strong commitment to continuous learning and upskilling, we invite you to apply for this exciting opportunity to join our team at Genpact.

Posted 1 month ago

Apply

6.0 - 10.0 years

14 - 24 Lacs

Gurugram

Work from Office

Strong experience in C#, .NET Core, Web API, SQL Server, Entity Framework, and Azure PaaS (CosmosDB, Storage Accounts, Key Vault, Managed Identity)

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

You will be responsible for developing applications using various Microsoft and web development technologies such as ASP.NET, C#, MVC, Web Forms, Angular, SQL Server, T-SQL, and microservices. Your expertise in big data technologies like Hadoop, Spark, Hive, Python, Databricks, etc. will be crucial for this role.

With a Bachelor's Degree in Computer Science or equivalent experience through higher education, you should have at least 8 years of experience in Data Engineering and/or Software Engineering. Your strong coding skills, along with knowledge of infrastructure as code and automating production data and ML pipelines, will be highly valued. You should be proficient in on-prem to cloud migration, particularly in Azure, and have hands-on experience with Azure PaaS offerings such as Synapse, ADLS, Databricks, Event Hubs, CosmosDB, Azure ML, etc. Experience in building, governing, and scaling data warehouses/lakes/lakehouses is essential for this role.

Your expertise in developing and tuning stored procedures and T-SQL scripting in SQL Server, along with familiarity with various .NET development tools and products, will contribute significantly to the success of the projects. You should be adept with the agile software development lifecycle and DevOps principles to ensure efficient project delivery.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a MySQL DBA Lead Engineer, you will be responsible for managing and optimizing databases in a cloud environment. With over 15 years of experience, you will lead the team in ensuring the smooth operation of databases like MySQL, PostgreSQL, SQL Server, and AWS RDS Aurora. Your expertise in SQL, NoSQL, and various databases will be essential in maintaining high performance and reliability.

Your role will involve advanced database management tasks such as backup, recovery, and tuning. You will have hands-on experience with MySQL, PostgreSQL, and MariaDB, including installation, configuration, and fine-tuning. Additionally, you will be proficient in MySQL replication concepts and performance tuning, ensuring optimal database performance.

In the AWS cloud environment, you will demonstrate your expertise in managing MariaDB in EC2 and RDS, as well as PostgreSQL RDS and Aurora. Your skills in database services like RDS MySQL, Aurora MySQL, and Redshift will be crucial in configuring, installing, and managing databases for efficient operation.

Moreover, your experience in migration projects from on-premise to cloud environments will be valuable. You will be adept at managing SQL Server databases and ensuring their smooth operation across different life cycle environments. Your exposure to project deliverables and timely delivery will play a key role in project success.

Overall, your extensive experience with database technologies, cloud services, and operating systems will be instrumental in ensuring the reliability and performance of databases in a cloud environment.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Are you passionate about bringing systems to life? Are you skilled at problem-solving and finding solutions? Do you have a keen interest in leading people to enhance digital client onboarding processes using technology and driving revenue, profitability, and NPS for the firm?

As a Java Engineer, your primary responsibilities will include designing, developing, implementing, and managing technology platforms in the client onboarding/lifecycle management areas. You will apply a broad range of full-stack development, security, reliability, and integration technologies on the Azure platform to ensure the delivery of a robust and scalable platform. Additionally, you will integrate with various systems such as advisory workstations, client due-diligence systems, case management systems, data/document management, and workflow management to provide a seamless experience for both clients and advisors. Managing the platform's technical roadmap to incorporate modern capabilities for delivering business value will also be a key aspect of your role.

In terms of technology leadership and relationship management, you will be responsible for developing and fostering partnerships with cross-functional teams, including banking and wealth management businesses, risk/regulatory/compliance, records management, cloud infrastructure, security, and the architecture office to ensure the platform meets the firm's requirements.

You will be part of the Client Data and Onboarding Team in India, serving as an engineering leader. This team is responsible for Wealth Management Americas' client-facing technology applications. Collaboration with teams in the US and India will be essential, and you will play a crucial role in ensuring the adoption of scalable development methodologies across multiple teams. Your involvement in strategy discussions with business and technology architects will contribute to the team's success.

The team's culture emphasizes innovation, partnership, transparency, and a shared passion for the future. UBS is committed to fostering diversity, equity, and inclusion, as it believes that diversity strengthens the business and adds value to its clients.

To excel in this role, you should possess:
- Hands-on expertise in designing, developing, and delivering large, scalable, and distributed systems
- Experience in Java/J2EE, Kafka, REST APIs, microservices, and event-driven architecture
- Working knowledge of application frameworks such as Spring Boot and Micronaut
- A good understanding of cloud technologies, especially Docker, Kubernetes, and other cloud-native services in Microsoft Azure
- Knowledge of React JS/Node JS and SQL, data analysis skills, and experience with NoSQL databases like CosmosDB or MongoDB
- Proficiency in UNIX/Linux environments and scripting skills, including scheduling/automation tools like AutoSys or TWS
- Excellent communication skills, teamwork, and the ability to lead a team across geographies
- Familiarity with Agile development processes and tools, analytical problem-solving skills, and debugging capabilities
- Strength in team coaching/management, attuned to the finance industry and service provider culture

UBS is the world's largest and only truly global wealth manager, operating through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management, and the Investment Bank. With a presence in more than 50 countries, UBS stands out for its global reach and expertise. If you are enthusiastic about joining a diverse team where collaboration is key and where your skills and background are valued, UBS offers a supportive environment with opportunities for growth, new challenges, and flexible working options whenever possible. The inclusive culture at UBS encourages employees to bring out their best at every stage of their career journey.

If you require reasonable accommodations throughout the recruitment process due to disability, feel free to reach out to us. UBS is an Equal Opportunity Employer that respects and empowers each individual, supporting diverse cultures, perspectives, skills, and experiences within its workforce.

Posted 1 month ago

Apply

3.0 - 6.0 years

3 - 15 Lacs

Hyderabad, Telangana, India

On-site

6+ years of experience in C# and the .NET Framework. 3+ years of experience in .NET Core. 3+ years of experience with a microservices architecture. 3+ years of Azure cloud experience. Strong experience with Azure Functions, CosmosDB, Service Bus, and Event Hub. Strong experience with DevSecOps in the Azure space using GitHub. Strong experience with the agile delivery model.

Mandatory skills: .NET, C#, Azure PaaS cloud development.
Desired skills: design patterns, SOLID principles, microservices patterns, Kafka.

Posted 2 months ago

Apply

3.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Dot Net Full Stack Developer - Gen AI at our technology services client, you will be part of a dynamic team on a contract basis with strong potential for conversion to full-time employment. Your primary location options include Bangalore, Noida, or Pune, and the ideal candidate will be able to start within an immediate to 15-day notice period.

Your role will involve working with microservice and event-driven architectures, utilizing skills in C#, .NET 7.0+/Core, SQL/NoSQL, Web API, and various GenAI models from OpenAI. While not mandatory, experience in Python, Django, Flask, React, and AngularJS would be beneficial.

To excel in this position, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Your responsibilities will include developing web applications, customer-facing integrations, and cloud-native development using Azure. A minimum of 3 years of experience with various GenAI models is required, along with knowledge of agentic AI frameworks and Azure services such as Azure Data Factory, Logic Apps, and Functions.

Your tasks will encompass technical design, development, and support of web applications and API frameworks, integrating with different systems, AI models, and orchestration layers. Proficiency in ORM tools, the Azure stack, CI/CD, container deployments, and Git code repositories is essential, as is hands-on experience in C# .NET Core, CosmosDB, web application development, and Web APIs, plus knowledge of REST APIs and data structures. Additionally, you should possess good debugging and analytical skills; familiarity with design patterns, code reviews, and caching strategies; and the ability to analyze client requirements and propose solutions. Strong communication skills are necessary for effectively conveying information to peers, leadership, and customers via email, phone, or in-person interactions.

If you meet these requirements and are interested in this opportunity, please share your updated resume with us at sathwik@s3staff.com. We look forward to potentially welcoming you to our team.
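The listing above asks for integration with OpenAI-style GenAI models behind an orchestration layer. As an illustrative sketch only (the model name is an assumption, and a real call additionally needs an API key and an HTTP client), the request body for a typical chat-completions call can be assembled and inspected like this:

```python
import json

def build_chat_request(system_prompt: str, user_message: str,
                       model: str = "gpt-4o-mini") -> dict:
    """Assemble an OpenAI-style chat-completions payload (illustrative only)."""
    return {
        "model": model,  # assumed model name, not from the posting
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        # Low temperature keeps answers more deterministic for integrations
        "temperature": 0.2,
    }

payload = build_chat_request("You are a helpful assistant.",
                             "Summarize this support ticket.")
print(json.dumps(payload, indent=2))
```

Keeping payload construction in a pure function like this makes the integration point easy to unit-test without network access, which matters in the CI/CD setup the role describes.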

Posted 2 months ago

Apply

4.0 - 9.0 years

6 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

- Proficient in Azure cloud application development, modernization, implementation, deployment, and support of highly distributed applications leveraging .NET, C#, and client technologies like jQuery and Angular/React.
- Azure API development and integration experience, and familiarity with RESTful microservices methodologies for integrating these into Azure cloud-based development.
- Experience in CosmosDB.
- Knowledge of developing applications using modern front-end technologies like AngularJS, ReactJS, HTML5, CSS3, etc.
- Experience in customer-facing roles leading technical architecture and application design discussions with clients to drive cloud deployment.
- Knowledge and experience with AGILE development, SCRUM, and Application Lifecycle Management (ALM).

Posted 2 months ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As a Senior Full-Stack Developer at INNOFarms.AI, you will play a crucial role in developing and scaling a cutting-edge AI-powered platform for indoor vertical farming and precision AgTech. Your mission will be to leverage Artificial Intelligence, Robotics, IoT, and Cloud technologies to revolutionize agriculture, addressing key challenges related to food security, sustainability, and climate impact.

Working with us, you will have the opportunity to tackle high-impact, real-world issues in agriculture and sustainability. You will collaborate with a world-class leadership team to build and scale our product-market-fit version, taking direct ownership of the full-stack architecture across AI and Cloud solutions. This role offers a rapid growth trajectory with the potential to lead engineering teams globally.

Key responsibilities:
- Design, develop, and scale the INNOFarms.AI SaaS platform for smart farming and automation
- Lead full-stack development using React.js/Next.js for the frontend and Node.js/Python for the backend
- Integrate AI-driven insights and real-time data from IoT devices into the FarmXOS platform
- Build scalable APIs and microservices using REST, GraphQL, and WebSockets
- Optimize system performance, security, and global deployment across multiple regions
- Collaborate with AI, IoT, and Robotics teams to enable intelligent farm control and automation

Core skills and requirements:
- 4+ years of hands-on experience in full-stack development
- Proficiency in React.js, Next.js, and TypeScript for frontend development
- Strong backend skills in Node.js (Python knowledge is a plus)
- Experience with various databases such as SQL, MongoDB, and CosmosDB
- Familiarity with Cloud and DevOps tools like Azure, Docker, and Kubernetes
- Expertise in API architecture, including REST, GraphQL, and WebSockets
- Prior experience in fast-paced startup environments
- A strong product mindset and business scalability acumen

Preferred (nice to have):
- Experience in AI-driven platforms, AgTech, SaaS, or IoT domains
- Previous work involving AI/ML cloud infrastructure
- Passion for sustainable technology and climate innovation

Location: Gurugram (preferred for in-person collaboration)
Join timeline: Immediate or within 4 weeks preferred

Posted 2 months ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As a Senior Full-Stack Developer at INNOFarms.AI, you will be responsible for designing, developing, and scaling our AI-powered SaaS platform for indoor vertical farming and precision AgTech. You will have the opportunity to work on high-impact, real-world problems in agriculture and sustainability, collaborating with a world-class leadership team to build and scale our product-market-fit version. With direct ownership of the full-stack architecture across AI and Cloud solutions, you will play a key role in integrating AI-driven insights and real-time data from IoT devices into our FarmXOS platform.

Your key responsibilities will include leading full-stack development using React.js/Next.js for the frontend and Node.js/Python for the backend. You will build scalable APIs and microservices, optimize system performance and security, and enable global deployment across APAC, the UAE, and the US. Collaboration with AI, IoT, and Robotics teams will be essential to enable intelligent farm control and automation.

The ideal candidate will have at least 4 years of hands-on experience in full-stack development, with proficiency in frontend technologies such as React.js, Next.js, and TypeScript, and backend technologies including Node.js (Python is a plus). Experience with databases like SQL, MongoDB, and CosmosDB, as well as Cloud and DevOps tools like Azure, Docker, and Kubernetes, is required. Strong API architecture skills using REST, GraphQL, and WebSockets are essential for this role. Experience in fast-paced startup environments is a must, along with a strong product mindset and a focus on business scalability.

Preferred qualifications include prior experience in AI-driven platforms, AgTech/SaaS/IoT domains, or AI/ML cloud infrastructure. A passion for sustainable technology and climate innovation is also desirable.

This is a full-time position based in Gurugram, with in-person collaboration preferred. Immediate joiners, or those able to join within 4 weeks, are preferred for our rapidly growing team at INNOFarms.AI.

Posted 2 months ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Cloud DB Engineer (MySQL DBA) with over 15 years of experience, you will be based at Chennai One IT Park, Thoraipakkam, working on-site five days a week in a WFO setup. We are looking for immediate joiners or candidates who can serve a maximum notice period of 2 weeks.

Your primary responsibilities will revolve around MySQL database administration, and you will lead as a MySQL DBA Engineer, working in a 24x7 shift environment. Your key skills should include MySQL DBA, PostgreSQL, SQL Server, and AWS RDS Aurora.

In this role, you must showcase strong technical expertise in SQL, NoSQL, and various databases such as PostgreSQL, SQL Server, CosmosDB, MongoDB, Cassandra, and cloud databases. With a minimum of 10+ years of experience in production DB management, you should have hands-on experience in relational and non-relational database administration, advanced DB management, backup/recovery, and tuning. Your expertise should extend to MySQL, PostgreSQL, and MariaDB installation, creation, configuration, and fine-tuning.

Specifically, you should be well versed in MySQL versions (5.6.x, 5.7.x, 8.x): installation, configuration, monitoring, backup and recovery, point-in-time recovery, replication concepts, performance tuning, query tuning, and DB version upgrades. Experience with MariaDB versions (10.x, 11.x) is also required, including installation, version upgrades, replication, backup and restore, performance tuning, query tuning, and monitoring. Proficiency with AWS cloud MariaDB in EC2 and RDS is a must.

Additionally, you should have expertise in AWS PostgreSQL RDS, Aurora, and Redshift configuration, installation, backup and restore, DB version upgrades, user management, and database migration. You will be responsible for migrating databases from physical servers to AWS cloud DB services, configuring ETL replication from RDS to Redshift through AWS DMS, and handling migration/upgrade projects from on-premise to cloud. Managing SQL Server databases through different life cycle environments and ensuring timely project delivery are also part of your responsibilities.

Moreover, your role will involve MongoDB installation, backup and restore, and user management, as well as familiarity with AWS DocumentDB. Experience with Linux and Windows Server environments, various database technologies, and cloud services like AWS and Microsoft Azure will be advantageous.

If you meet the requirements and are ready to take on this challenging role, please apply by sending your resume to hr@letzbizz.com.

Posted 2 months ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

Are you passionate about bringing systems to life? Do you excel in problem-solving and finding innovative solutions? Are you eager to take the lead in enhancing digital client onboarding processes through technology to drive revenue, profitability, and NPS for the firm?

We are seeking a Senior Java Engineer to join our team and contribute to the development and delivery of an enterprise digital client due-diligence platform, which includes Initial Due Diligence (KYC/AML) and Periodic Due Diligence (PKR) platforms.

Your primary responsibilities will include:
- Designing, developing, implementing, and managing technology platforms within the client onboarding/lifecycle management domains
- Utilizing a wide range of full-stack development, security, reliability, and integration technologies on the Azure platform to ensure the delivery of a robust and scalable platform
- Integrating with various systems such as advisory workstations, client due-diligence systems, case management systems, data/document management, and workflow management to provide a seamless experience for clients and advisors
- Managing the technical roadmap of the platform to continuously evaluate and onboard modern capabilities that deliver business value
- Establishing and nurturing partnerships with cross-functional teams, including banking and wealth management businesses, risk/regulatory/compliance, records management, cloud infrastructure, security, and the architecture office to align the platform with the firm's requirements

As part of your role, you will be responsible for platforms and projects related to Periodic KYC Reviews, working in Pune, India, and collaborating with team members across the US, Poland, and India.

Key requirements:
- 8+ years of experience in Java/J2EE, React JS, Kafka, REST APIs, microservices, and event-driven architecture
- Strong hands-on expertise in designing, developing, and delivering large, scalable, and distributed systems
- Familiarity with application frameworks like Spring Boot and Micronaut
- A good understanding of cloud technologies, particularly Docker, Kubernetes, and other cloud-native services, preferably in Microsoft Azure
- Knowledge of Azure Data Factory (ADF), Azure Data Lake (ADLS), and Databricks is a plus
- Proficiency in SQL and data analysis, with experience in NoSQL databases like CosmosDB or MongoDB being advantageous
- Solid UNIX/Linux experience and scripting skills, including scheduling/automation tools like AutoSys or TWS
- Excellent communication skills, a team-player attitude, and experience working in Agile development processes
- Strong problem-solving and debugging skills, with the ability to lead a team across geographies and manage a business-critical platform
- Familiarity with the finance industry and service provider culture

Join us at UBS, the world's largest global wealth manager, and be part of a diverse, inclusive, and dynamic team that values collaboration and innovation. If you are ready to make an impact and be part of #teamUBS, apply now and explore opportunities for professional growth and development.
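Several of the roles above ask for Kafka and event-driven architecture. Python rather than Java is used here to keep the sketch self-contained, and the broker is faked with an in-memory list: this is a toy illustration of offset-based, at-least-once consumption, not the real Kafka client API.

```python
from dataclasses import dataclass, field

@dataclass
class ToyPartition:
    """In-memory stand-in for a single Kafka topic partition."""
    log: list = field(default_factory=list)  # append-only event log
    committed_offset: int = 0                # position of the last commit

    def produce(self, event: str) -> None:
        self.log.append(event)

    def consume(self, handler) -> None:
        # At-least-once semantics: advance the offset only after the handler
        # succeeds, so a crash mid-batch re-delivers unprocessed events on
        # the next run (duplicates are possible; handlers must be idempotent).
        while self.committed_offset < len(self.log):
            handler(self.log[self.committed_offset])
            self.committed_offset += 1

seen = []
p = ToyPartition()
p.produce("client.onboarded")
p.produce("kyc.review.due")
p.consume(seen.append)
print(seen)  # events delivered in log order
```

The same shape explains why these postings pair Kafka with "event-driven architecture": producers append facts to a log, and consumers track their own progress through it rather than deleting messages.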

Posted 2 months ago

Apply

3.0 - 8.0 years

10 - 16 Lacs

Gurugram

Hybrid

Opticom Technology develops marketing tools for the automotive industry. We are seeking a talented senior developer experienced in C#, .NET, and Azure services to join our team. This is a full-time remote role. You will be responsible for advancing our content management service and will work collaboratively with our cross-functional team. You must work at least 5-6 hours per day in the CET timezone (3.5 hours behind).

Qualifications:
- 3+ years of experience as a .NET C# developer
- 3+ years of experience with React
- Hands-on experience in designing, developing, and deploying web applications using ASP.NET Core, RESTful APIs, or similar stacks
- Strong experience in database design, usage, and optimization using SQL, Cosmos DB, or other databases
- Knowledge of modern development tools, such as Git, Jira, Slack, etc.
- Bachelor's degree or higher in Computer Science or a related field
- Ability to work collaboratively in a fast-paced, dynamic team environment
- Excellent verbal and written communication skills with strong attention to detail

Posted 2 months ago

Apply

7.0 - 11.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

As a Database Engineer in the Production Support Team, you will be responsible for maintaining, troubleshooting, and optimizing the clients' database systems to ensure maximum availability and performance. Working closely with cross-functional teams, you will address production issues promptly and implement solutions to prevent recurrence. The ideal candidate possesses a strong background in SQL and experience with various database technologies, including CosmosDB, Azure Data Factory, Salesforce Database, etc.

Key responsibilities:
- Monitor and maintain the health and performance of database systems such as CosmosDB, Azure Synapse Analytics, Azure Data Factory, Salesforce Database, etc.
- Respond promptly to production issues, troubleshoot database-related problems, minimize downtime, and ensure service continuity.
- Collaborate with developers, system administrators, and stakeholders to implement database changes and optimizations.
- Develop and maintain scripts for database automation, backups, and routine maintenance tasks.
- Perform database capacity planning and recommend scaling strategies to accommodate growth.
- Implement and enforce security best practices to safeguard sensitive data stored in databases.
- Manage data migration projects, ensuring smooth and accurate transfer of data between systems.
- Utilize JIRA or similar tools to track and prioritize database-related tasks, incidents, and enhancements.
- Document database configurations, procedures, and troubleshooting steps for knowledge sharing and training purposes.
- Participate in the on-call rotation and provide after-hours support as needed.

Skills and qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Database Engineer or in a similar role with a focus on production support.
- Strong proficiency in SQL and experience with database query optimization techniques.
- Hands-on experience with CosmosDB, Azure Synapse Analytics, Azure Data Factory, Salesforce Database, or similar technologies.
- Familiarity with JIRA or other issue-tracking systems for task management.
- Experience with data migration projects, including schema mapping, ETL processes, and validation.
- Knowledge of database monitoring and performance tuning tools.
- Understanding of database security principles and best practices.
- Excellent troubleshooting and problem-solving skills with attention to detail.
- Strong communication skills.

Experience: 7+ years
Job type: Full-time
Schedule: Day shift
Work location: In person
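The posting above stresses SQL query optimization. As a small, self-contained illustration (using SQLite from the Python standard library rather than any of the engines the role names, with invented table and column names), the sketch below shows how adding an index changes a query from a full table scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE incidents (id INTEGER PRIMARY KEY, status TEXT, opened_at TEXT)"
)
conn.executemany(
    "INSERT INTO incidents (status, opened_at) VALUES (?, ?)",
    [("open" if i % 10 == 0 else "closed", f"2024-01-{i % 28 + 1:02d}")
     for i in range(1000)],
)

query = "SELECT COUNT(*) FROM incidents WHERE status = 'open'"

def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN reports how SQLite intends to execute a statement;
    # the fourth column of each row is the human-readable detail.
    return " ".join(row[3] for row in conn.execute(f"EXPLAIN QUERY PLAN {sql}"))

print(plan(query))  # before the index: a scan over the whole table
conn.execute("CREATE INDEX idx_status ON incidents(status)")
print(plan(query))  # after the index: a search using idx_status
```

The same diagnostic habit transfers to the engines the role actually uses (e.g. execution plans in SQL Server or request-charge metrics in CosmosDB): inspect the plan first, then decide whether an index, rewrite, or schema change is warranted.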

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

Genpact is a global professional services and solutions firm focused on delivering outcomes that shape the future. With over 125,000 employees in more than 30 countries, we are driven by curiosity, agility, and the desire to create lasting value for our clients. Our purpose is the relentless pursuit of a world that works better for people, serving and transforming leading enterprises, including Fortune Global 500 companies, through deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently seeking applications for the position of Lead Consultant - Databricks Developer (AWS). As a Databricks Developer in this role, you will be responsible for solving cutting-edge, real-world problems to meet both functional and non-functional requirements.

Responsibilities:
- Stay updated on new and emerging technologies and explore their potential applications for service offerings and products.
- Collaborate with architects and lead engineers to design solutions that meet functional and non-functional requirements.
- Demonstrate knowledge of relevant industry trends and standards.
- Showcase strong analytical and technical problem-solving skills.
- Possess excellent coding skills, particularly in Python or Scala, with a preference for Python.

Qualifications:

Minimum qualifications:
- Bachelor's Degree in CS, CE, CIS, IS, MIS, or an engineering discipline, or equivalent work experience.
- Stay informed about new technologies and their potential applications.
- Collaborate with architects and lead engineers to develop solutions.
- Demonstrate knowledge of industry trends and standards.
- Exhibit strong analytical and technical problem-solving skills.
- Proficiency in Python or Scala coding.
- Experience in the Data Engineering domain.
- Completed at least 2 end-to-end projects in Databricks.

Additional qualifications:
- Familiarity with Delta Lake, dbConnect, db API 2.0, and Databricks workflows orchestration.
- Understanding of the Databricks Lakehouse concept and its implementation in enterprise environments.
- Ability to create complex data pipelines.
- Strong knowledge of data structures & algorithms.
- Proficiency in SQL and Spark-SQL.
- Experience in performance optimization to enhance efficiency and reduce costs.
- Worked on both batch and streaming data pipelines.
- Extensive knowledge of the Spark and Hive data processing frameworks.
- Experience with cloud platforms (Azure, AWS, GCP) and common services such as ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
- Skilled in writing unit and integration test cases.
- Excellent communication skills and experience working in teams of 5 or more.
- Positive attitude towards learning new skills and upskilling.
- Knowledge of Unity Catalog and basic governance.
- Understanding of Databricks SQL Endpoint.
- Experience in CI/CD to build pipelines for Databricks jobs.
- Exposure to migration projects for building unified data platforms.
- Familiarity with DBT, Docker, and Kubernetes.

This is a full-time position based in Gurugram, India. The job was posted on August 5, 2024, with an unposting date of October 4, 2024.
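A core operation behind the Delta Lake and batch-pipeline skills listed above is the MERGE (upsert). As a minimal pure-Python sketch of those semantics (the record shape is a hypothetical example; in Databricks this would be a Delta `MERGE INTO` statement or a Spark job):

```python
def merge_upsert(target, updates, key="id"):
    """Merge `updates` into `target` by key: rows with a matching key are
    updated, unmatched rows are inserted -- the semantics of MERGE INTO."""
    merged = {row[key]: row for row in target}
    for row in updates:
        # Overlay the update on any existing row with the same key.
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
updates = [{"id": 2, "qty": 7}, {"id": 3, "qty": 1}]
print(merge_upsert(target, updates))
# → [{'id': 1, 'qty': 5}, {'id': 2, 'qty': 7}, {'id': 3, 'qty': 1}]
```

Delta Lake adds transactionality and versioning on top of this logic, but the matched/not-matched behavior is the same.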

Posted 2 months ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Bengaluru, Karnataka, India

On-site

Bullet-Pointed JD:
- Design, architect, and develop solutions using cloud big data technology to ingest, process, and analyze large, disparate data sets.
- Develop systems to ingest, cleanse, and normalize datasets, building pipelines from various sources and structuring previously unstructured data.
- Collaborate with internal teams and external professionals to gather requirements and identify data development opportunities.
- Understand and map data flow across applications such as CRM, Broker & Sales tools, Finance, and HR.
- Unify, enrich, and analyze diverse data to generate insights and business opportunities.
- Design and develop data management and persistence solutions using relational and non-relational databases.
- Create POCs to validate solution proposals and support migration initiatives.
- Build data lake solutions to store structured and unstructured data from multiple sources and guide teams in adopting modern tech platforms.
- Follow CI/CD processes and development best practices to strengthen the data engineering discipline.
- Mentor team members and contribute to overall organizational growth.

What we are looking for:
- 6+ years of experience and a bachelor's degree in Information Science, Computer Science, Mathematics, Statistics, or a related quantitative field.
- Hands-on engineer with curiosity for technology and adaptability to evolving tech landscapes.
- Understanding of Cloud Computing (AWS, Azure preferred), Microservices, Streaming Technologies, Networking, and Security.
- 3+ years of development experience using Python-Spark, Spark Streaming, Azure SQL Server, Cosmos DB/MongoDB, Azure Event Hubs, Azure Data Lake Storage, and Azure Search.
- Design and develop data management and persistence solutions with a focus on enhancing data processing capabilities.
- Build, test, and improve data curation pipelines integrating data from DBMS, file systems, APIs, and streaming systems for KPI and metric development.
- Maintain platform health, monitor workloads, and act as SME for assigned applications in collaboration with Infrastructure Engineering teams.
- Team player with a self-motivated, reliable, and disciplined work ethic, capable of managing multiple projects.
- 3+ years of experience with source code control systems and CI/CD tools.
- Independent and capable of managing, prioritizing, and leading workload efficiently.
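The ingest-cleanse-normalize step described above can be sketched as a small pure-Python function. All field names and source systems here are hypothetical examples; a production pipeline would express the same logic in Spark transformations:

```python
def normalize_record(raw):
    """Cleanse one raw record into a unified schema: trim and lowercase
    identifiers, coerce amounts to float, and drop unusable rows.
    Field names ("customer"/"cust_name", "amount"/"amt") are hypothetical."""
    try:
        return {
            "customer": raw.get("customer", raw.get("cust_name", "")).strip().lower(),
            "amount": float(raw.get("amount", raw.get("amt", 0)) or 0),
        }
    except (TypeError, ValueError):
        return None  # unparseable row: excluded from the curated output

raw_rows = [
    {"customer": "  Acme Corp ", "amount": "19.99"},   # e.g. a CRM export
    {"cust_name": "GLOBEX", "amt": 5},                 # e.g. a finance system
    {"customer": "Initech", "amount": "n/a"},          # unparseable amount
]
clean = [r for r in (normalize_record(r) for r in raw_rows) if r is not None]
print(clean)
# → [{'customer': 'acme corp', 'amount': 19.99}, {'customer': 'globex', 'amount': 5.0}]
```

Unifying column names and types like this is what lets downstream KPI queries run over one schema regardless of the originating application.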

Posted 2 months ago

Apply

0.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Databricks Developer! In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems to meet both functional and non-functional requirements.

Responsibilities:
- Maintain close awareness of new and emerging technologies and their potential application for service offerings and products.
- Work with architects and lead engineers on solutions that meet functional and non-functional requirements.
- Demonstrate knowledge of relevant industry trends and standards.
- Demonstrate strong analytical and technical problem-solving skills.
- Must have experience in the Data Engineering domain.

Qualifications we seek in you!

Minimum qualifications:
- Bachelor's Degree (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience.
- Excellent coding skills in either Python or Scala, preferably Python.
- Experience in the Data Engineering domain.
- Implemented at least 2 end-to-end projects in Databricks.
- Experience with Databricks components, including Delta Lake, dbConnect, db API 2.0, and Databricks workflows orchestration.
- Well versed in the Databricks Lakehouse concept and its implementation in enterprise environments.
- Good understanding of how to create complex data pipelines.
- Good knowledge of data structures & algorithms.
- Strong in SQL and Spark-SQL.
- Strong performance optimization skills to improve efficiency and reduce cost.
- Worked on both batch and streaming data pipelines.
- Extensive knowledge of the Spark and Hive data processing frameworks.
- Worked on any cloud (Azure, AWS, GCP) and the most common services, such as ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
- Strong in writing unit test cases and integration tests.
- Strong communication skills; has worked on teams of 5 or more.
- Great attitude towards learning new skills and upskilling existing skills.

Preferred qualifications:
- Unity Catalog and basic governance knowledge.
- Databricks SQL Endpoint understanding.
- CI/CD experience to build pipelines for Databricks jobs.
- Experience on migration projects to build a unified data platform.
- Knowledge of DBT.
- Knowledge of Docker and Kubernetes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit our website. Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 months ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Databricks Developer! In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems to meet both functional and non-functional requirements.

Responsibilities:
- Maintain close awareness of new and emerging technologies and their potential application for service offerings and products.
- Work with architects and lead engineers on solutions that meet functional and non-functional requirements.
- Demonstrate knowledge of relevant industry trends and standards.
- Demonstrate strong analytical and technical problem-solving skills.
- Must have experience in the Data Engineering domain.

Qualifications we seek in you!

Minimum qualifications:
- Bachelor's Degree (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience.
- Excellent coding skills in either Python or Scala, preferably Python.
- Experience in the Data Engineering domain.
- Implemented at least 2 end-to-end projects in Databricks.
- Experience with Databricks components, including Delta Lake, dbConnect, db API 2.0, and Databricks workflows orchestration.
- Well versed in the Databricks Lakehouse concept and its implementation in enterprise environments.
- Good understanding of how to create complex data pipelines.
- Good knowledge of data structures & algorithms.
- Strong in SQL and Spark-SQL.
- Strong performance optimization skills to improve efficiency and reduce cost.
- Worked on both batch and streaming data pipelines.
- Extensive knowledge of the Spark and Hive data processing frameworks.
- Worked on any cloud (Azure, AWS, GCP) and the most common services, such as ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
- Strong in writing unit test cases and integration tests.
- Strong communication skills; has worked on teams of 5 or more.
- Great attitude towards learning new skills and upskilling existing skills.

Preferred qualifications:
- Unity Catalog and basic governance knowledge.
- Databricks SQL Endpoint understanding.
- CI/CD experience to build pipelines for Databricks jobs.
- Experience on migration projects to build a unified data platform.
- Knowledge of DBT.
- Knowledge of Docker and Kubernetes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit our website. Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 months ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Databricks Developer! In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems to meet both functional and non-functional requirements.

Responsibilities:
- Maintain close awareness of new and emerging technologies and their potential application for service offerings and products.
- Work with architects and lead engineers on solutions that meet functional and non-functional requirements.
- Demonstrate knowledge of relevant industry trends and standards.
- Demonstrate strong analytical and technical problem-solving skills.
- Must have experience in the Data Engineering domain.

Qualifications we seek in you!

Minimum qualifications:
- Bachelor's Degree (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience.
- Excellent coding skills in either Python or Scala, preferably Python.
- Experience in the Data Engineering domain.
- Implemented at least 2 end-to-end projects in Databricks.
- Experience with Databricks components, including Delta Lake, dbConnect, db API 2.0, and Databricks workflows orchestration.
- Well versed in the Databricks Lakehouse concept and its implementation in enterprise environments.
- Good understanding of how to create complex data pipelines.
- Good knowledge of data structures & algorithms.
- Strong in SQL and Spark-SQL.
- Strong performance optimization skills to improve efficiency and reduce cost.
- Worked on both batch and streaming data pipelines.
- Extensive knowledge of the Spark and Hive data processing frameworks.
- Worked on any cloud (Azure, AWS, GCP) and the most common services, such as ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
- Strong in writing unit test cases and integration tests.
- Strong communication skills; has worked on teams of 5 or more.
- Great attitude towards learning new skills and upskilling existing skills.

Preferred qualifications:
- Unity Catalog and basic governance knowledge.
- Databricks SQL Endpoint understanding.
- CI/CD experience to build pipelines for Databricks jobs.
- Experience on migration projects to build a unified data platform.
- Knowledge of DBT.
- Knowledge of Docker and Kubernetes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit our website. Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 months ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Databricks Developer! In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems to meet both functional and non-functional requirements.

Responsibilities:
- Maintain close awareness of new and emerging technologies and their potential application for service offerings and products.
- Work with architects and lead engineers on solutions that meet functional and non-functional requirements.
- Demonstrate knowledge of relevant industry trends and standards.
- Demonstrate strong analytical and technical problem-solving skills.
- Must have experience in the Data Engineering domain.

Qualifications we seek in you!

Minimum qualifications:
- Bachelor's Degree (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience.
- Excellent coding skills in either Python or Scala, preferably Python.
- Experience in the Data Engineering domain.
- Implemented at least 2 end-to-end projects in Databricks.
- Experience with Databricks components, including Delta Lake, dbConnect, db API 2.0, and Databricks workflows orchestration.
- Well versed in the Databricks Lakehouse concept and its implementation in enterprise environments.
- Good understanding of how to create complex data pipelines.
- Good knowledge of data structures & algorithms.
- Strong in SQL and Spark-SQL.
- Strong performance optimization skills to improve efficiency and reduce cost.
- Worked on both batch and streaming data pipelines.
- Extensive knowledge of the Spark and Hive data processing frameworks.
- Worked on any cloud (Azure, AWS, GCP) and the most common services, such as ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
- Strong in writing unit test cases and integration tests.
- Strong communication skills; has worked on teams of 5 or more.
- Great attitude towards learning new skills and upskilling existing skills.

Preferred qualifications:
- Unity Catalog and basic governance knowledge.
- Databricks SQL Endpoint understanding.
- CI/CD experience to build pipelines for Databricks jobs.
- Experience on migration projects to build a unified data platform.
- Knowledge of DBT.
- Knowledge of Docker and Kubernetes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit our website. Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities:
- Lead the implementation of end-to-end Generative AI projects utilizing Microsoft Azure AI and OpenAI.
- Design and build Generative AI applications using advanced methodologies such as Retrieval-Augmented Generation (RAG).
- Automate various aspects of the overall RAG pipeline for Generative AI-based applications.
- Design comprehensive solutions leveraging Generative AI, AI, and Machine Learning (ML) components.
- Gain hands-on experience working with Azure OpenAI services or other managed large language models (LLMs).
- Utilize AI/ML, search, and data services within the Azure ecosystem, including Azure OpenAI, AI Search, CosmosDB, and Azure Functions.
- Conduct thorough evaluation, validation, and refinement of results from Generative AI RAG-based applications.
- Set up robust monitoring, observability, and guardrails for Generative AI solutions to ensure responsible and ethical deployment.
- Develop and deploy applications using the Python programming language, along with popular frameworks such as Flask and FastAPI.
- Apply expertise in AI/ML, deep learning, TensorFlow, and Natural Language Processing (NLP).
- Demonstrate excellent understanding of machine learning techniques and algorithms, including GPTs, CNN, RNN, k-NN, Naive Bayes, SVM, Decision Forests, etc.
- Deploy applications to Microsoft Azure or other cloud platforms.
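The retrieval step at the heart of the RAG pipeline described above can be sketched minimally as cosine-similarity ranking over document embeddings. The toy 3-dimensional vectors and document texts below are hypothetical; a production system would obtain embeddings from a model (e.g. via Azure OpenAI) and rank with a vector index such as Azure AI Search:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec, corpus, k=2):
    """Rank documents by similarity to the query embedding and return the
    top-k texts -- the retrieval step that grounds the LLM prompt."""
    ranked = sorted(corpus, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in ranked[:k]]

# Toy "embeddings"; real systems would call an embedding model.
corpus = [
    {"text": "reset your password", "vec": [0.9, 0.1, 0.0]},
    {"text": "refund policy", "vec": [0.0, 0.2, 0.9]},
    {"text": "change account email", "vec": [0.8, 0.3, 0.1]},
]
print(retrieve([1.0, 0.2, 0.0], corpus, k=2))
# → ['reset your password', 'change account email']
```

The retrieved texts would then be concatenated into the prompt sent to the LLM, which is what the evaluation and guardrail work in this role scrutinizes.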

Posted 2 months ago

Apply

3.0 - 7.0 years

18 - 20 Lacs

Pune

Work from Office

- Strong design and architectural experience in building highly scalable and highly available products.
- Strong understanding of SDLC activities, including analysis, design, development, testing, deployment, and post-production support.
- Proficiency in at least one server-side framework, preferably for Go (Golang).
- Experience working on NoSQL & SQL databases such as MySQL, PostgreSQL, MongoDB, Redis, etc.
- Deep dives, problem solving, RCA, and systematic thinking to reach the root cause of issues.
- Able to work independently and multi-task effectively; able to program at a system level and manage service stability.
- Excellent experience maintaining scalable, extensible code; methodical in keeping documentation up to date.
- Metric-driven mindset and obsessive about ensuring clean coding practices.
- Preferred: experience in product development.
- Preferred: working experience on microservices platforms.
- Proficiency in at least one modern web front-end development framework, such as React JS, is a bonus.

Posted 2 months ago

Apply