4.0 - 7.0 years
6 - 9 Lacs
Noida
Work from Office
Job Description: Integration Developer / Backend Developer

Roles and Responsibilities:
- Develop and maintain backend services using Java, Spring Boot, and the Spring Framework.
- Design and implement RESTful APIs and integrate them with frontend applications.
- Work with MongoDB, MSSQL, and other NoSQL/SQL databases.
- Optimize application performance and ensure security best practices.
- Use Postman for API testing and debugging.
- Collaborate with DevOps teams; basic knowledge of AWS services is a plus.

Skills:
- REST APIs, Java, Spring Boot, Spring MVC, Node.js, JavaScript, TypeScript, Postman, Git; proficient in using an IDE such as Eclipse.
- AWS services such as S3, EC2, Lambda, IAM, API Gateway, and ECS.
- Experience with MongoDB and MSSQL (schema design, performance tuning, indexing strategies).
- Familiarity with performance engineering, application monitoring, and optimization.
- Knowledge of common AWS services and DevOps practices is a plus.

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
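A role like this revolves around building and testing small JSON-over-HTTP endpoints. As an illustration only (the route, payload, and port handling below are invented for the example, and Python's standard library stands in for a Spring Boot service and a Postman request), a minimal REST endpoint plus a client call against it might look like:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class OrderHandler(BaseHTTPRequestHandler):
    # Hypothetical in-memory data store for the example.
    ORDERS = {"1": {"id": "1", "status": "SHIPPED"}}

    def do_GET(self):
        # Route: GET /orders/<id> -> JSON body, or 404 if unknown.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "orders" and parts[1] in self.ORDERS:
            body = json.dumps(self.ORDERS[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging for the example.
        pass

# Bind to an ephemeral port and serve in a background thread.
server = ThreadingHTTPServer(("127.0.0.1", 0), OrderHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "Postman" side: issue a GET and decode the JSON response.
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/orders/1") as resp:
    payload = json.loads(resp.read())
server.shutdown()
```

The same request/response shape is what a Postman GET against a Spring Boot `@RestController` would exercise.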
Posted 2 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Strong Experience with AWS: At least 2 years of AWS experience, with a deep understanding of AWS services including, but not limited to, Lambda, S3, DynamoDB, Step Functions, and IAM.
Microservices Expertise: Proven track record of designing and implementing microservices architectures with RESTful APIs.
Workflow Orchestration: Hands-on experience with workflow tools such as Netflix Conductor, AWS Step Functions, or equivalent orchestration frameworks.
Programming Proficiency: Strong back-end programming skills in Java.
Database Management: Familiarity with relational and non-relational databases, including schema design and optimization.
Problem Solving: Ability to troubleshoot complex issues, propose scalable solutions, and optimize workflows.
Qualifications: Java 8-11, Spring Boot, AWS, microservices, REST APIs, workflow tools.
Would be a plus: a minimum of 3 years of hands-on experience with AWS services: Lambda, S3, DynamoDB, Step Functions, SQS, SNS, and IAM.
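Workflow orchestrators like AWS Step Functions or Netflix Conductor chain independent task states into a state machine, threading each state's output into the next state's input. As a rough, dependency-free sketch of that idea (the state names and task functions are invented; a real Step Functions workflow is defined in Amazon States Language JSON and executed by AWS, not by local code):

```python
# Minimal sequential state machine in the spirit of AWS Step Functions.
# Each state names a task callable and its successor; execution passes
# the output of one state as the input of the next.

def validate(order):
    # Reject orders without a positive amount; otherwise pass the order along.
    if order.get("amount", 0) <= 0:
        raise ValueError("invalid amount")
    return order

def charge(order):
    # Pretend to charge the customer and record the result.
    return {**order, "charged": True}

def notify(order):
    # Pretend to send a confirmation and mark the workflow complete.
    return {**order, "status": "CONFIRMED"}

STATE_MACHINE = {
    "StartAt": "Validate",
    "States": {
        "Validate": {"task": validate, "Next": "Charge"},
        "Charge":   {"task": charge,   "Next": "Notify"},
        "Notify":   {"task": notify,   "End": True},
    },
}

def execute(machine, payload):
    """Run states from StartAt until a state marked End, passing payloads through."""
    state_name = machine["StartAt"]
    while True:
        state = machine["States"][state_name]
        payload = state["task"](payload)
        if state.get("End"):
            return payload
        state_name = state["Next"]

result = execute(STATE_MACHINE, {"order_id": "A1", "amount": 42})
```

In the managed services, the per-state error handling, retries, and branching that this sketch omits are what the orchestration framework provides.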
Posted 2 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Noida
Work from Office
Job Description: Snowflake Data Engineer/Architect (Snowflake, DBT, Azure)
Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai

Technical Expertise:
- Strong proficiency in Snowflake architecture, including data sharing, partitioning, clustering, and materialized views.
- Advanced experience with DBT for data transformations and workflow management.
- Expertise in Azure services, including Azure Data Factory, Azure Data Lake, Azure Synapse, and Azure Functions.

Data Engineering:
- Proficiency in SQL, Python, or other relevant programming languages.
- Strong understanding of data modeling concepts, including star schema and normalization.
- Hands-on experience with ETL/ELT pipelines and data integration tools.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management abilities.
- Ability to work in agile teams and handle multiple priorities.

Preferred Qualifications:
- Certifications in Snowflake, DBT, or Azure Data Engineering.
- Familiarity with data visualization tools like Power BI or Tableau.
- Knowledge of CI/CD pipelines and DevOps practices for data workflows.
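The star-schema modeling called out above organizes measures in a central fact table joined to dimension tables by surrogate keys, so analytics queries aggregate the fact while slicing by dimension attributes. A toy illustration using SQLite as a lightweight stand-in for Snowflake (the table and column names are invented for the example):

```python
import sqlite3

# Toy star schema: one fact table (sales) joined to two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
    CREATE TABLE fact_sales  (
        product_key INTEGER REFERENCES dim_product(product_key),
        date_key    INTEGER REFERENCES dim_date(date_key),
        amount      REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gizmo', 'Hardware');
    INSERT INTO dim_date    VALUES (10, '2024-01-01', 2024), (11, '2024-01-02', 2024);
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 75.0);
""")

# A typical star-schema query: aggregate the fact, slicing by dimension attributes.
rows = conn.execute("""
    SELECT p.category, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date    d ON d.date_key    = f.date_key
    GROUP BY p.category, d.year
""").fetchall()
```

The same shape scales up in Snowflake, where clustering keys and materialized views (mentioned above) accelerate exactly this kind of fact-dimension aggregation.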
Posted 2 weeks ago
4.0 - 7.0 years
6 - 9 Lacs
Mumbai
Work from Office
Client is streamlining its operating platform to reduce complexity and scale the business. One major component of that is consolidating the public markets business onto a single front- and middle-office platform using BlackRock Aladdin. The program involves migrating existing front- and middle-office systems to the Aladdin F2B platform (i.e., Aladdin Enterprise (investment platform), Aladdin Accounting, Aladdin Data Cloud, and eFront Insight).

Roles and Responsibilities: Client is consolidating their Order Management Systems onto a single third-party platform. Roughly 9,000 rules are being migrated. Support is requested to assist with coding review, testing, rule corrections, etc., as well as to ensure all compliance requirements continue to be met. Consultants will provide advisory and rule-testing capabilities to support the OMS consolidation project. The consultant should have experience with Aladdin and rule/macro/token testing abilities.

Activities include:

Planning and Data Model Design:
- Document the project plan, coding approach and schedule, and rule test plan and schedule.
- Agree an implementation workflow and workflow management tools.
- Design a three- or four-tier asset classification schema based on standard industry definitions and Client-specific asset class definitions.
- Conduct workshops with compliance and risk (and other relevant parties where required) to agree asset class definitions and data model structure.
- Collate and organize contractual and regulatory rules to be implemented into a master rule library (e.g., a SharePoint list).
- Categorize rules into functionality categories, e.g., credit rating, currency exposure, duration, etc.

Rules and Macro Coding:
- Code compliance macro BQL per the agreed asset type/class definitions and security classification structure.
- Agree rule coding standards for each contractual rule category (e.g., the definition of a country).
- Code compliance rule BQL based on rule-type categories to generate efficiencies and consistency.
- Build and document Client Aladdin macro and token libraries.
- Identify any custom rule types and/or calculations, and liaise with the BRS development team to agree compliance token development for those requirements.
- Set up a large Client portfolio group covering the asset types required by the compliance macros.
- Run assets against the macros to ensure correct asset capture by each macro.
- Document macro test results.

Rule, Macro, and Token Testing:
- Test rules based on rule-type categories to achieve efficiency.
- Test rules according to standardized test procedures.
- Measure and compare test results against real portfolio holdings on a post-trade basis.
- Conduct standardized pre-trade test scenarios to ensure functionality.
- Document testing results in standard test result templates.
- Conduct a standardized review of all rules coded and tested across the defined categories.
- Review all results and outputs from the rule-testing phase.
- Investigate and analyze Aladdin compliance violations; reconcile violations against the equivalent violations in Sentinel.
- Bed down production tasks and processes, and conduct training.
Posted 2 weeks ago
4.0 - 8.0 years
4 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Title: Apache Solr Developer
Location: Chennai, Bangalore, Pune, Hyderabad
Experience: 4-8 Years
Work Mode: Contract
Duration: 6 Months

Job Summary: We are looking for a skilled Apache Solr engineer to design, implement, and maintain scalable, high-performance search solutions. The ideal candidate will have hands-on experience with Solr/SolrCloud, strong analytical skills, and the ability to work in cross-functional teams to deliver efficient search functionality across enterprise or customer-facing applications.

Key Responsibilities:
- Design, develop, and maintain enterprise-grade search solutions using Apache Solr and SolrCloud.
- Develop and optimize search indexes and schemas for use cases like product search, document search, or order/invoice search.
- Integrate Solr with backend systems, databases, and APIs.
- Implement full-text search, faceted search, auto-suggestions, ranking, and relevancy tuning.
- Optimize search performance, indexing throughput, and query response time.
- Ensure data consistency and high availability using SolrCloud and ZooKeeper (cluster coordination and configuration management).
- Monitor search system health and troubleshoot issues in production.
- Collaborate with product teams, data engineers, and DevOps teams for smooth delivery.
- Stay up to date with new features of Apache Lucene/Solr and recommend improvements.

Required Skills & Qualifications:
- Strong experience in Apache Solr and SolrCloud.
- Good understanding of Lucene, inverted indexes, analyzers, tokenizers, and search relevance tuning.
- Proficiency in Java or Python for backend integration and development.
- Experience with RESTful APIs, data pipelines, and real-time indexing.
- Familiarity with ZooKeeper, Docker, and Kubernetes (for SolrCloud deployments).
- Knowledge of JSON, XML, and schema design in Solr.
- Experience with log analysis, performance tuning, and monitoring tools like Prometheus/Grafana is a plus.
- Exposure to e-commerce or document management search use cases is an advantage.
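Solr exposes search over HTTP: a faceted full-text query is a set of request parameters sent to a collection's `/select` handler. A small sketch of assembling such a request with Python's standard library (the host, collection name, and field names are invented for the example; the `q`, `rows`, `facet`, and `facet.field` parameters are standard Solr query parameters):

```python
from urllib.parse import urlencode

def build_solr_select(base_url, collection, text, facet_field, rows=10):
    """Build a faceted full-text /select query URL for a Solr collection."""
    params = [
        ("q", f"title:{text}"),       # full-text query against a hypothetical title field
        ("rows", rows),               # page size
        ("facet", "true"),            # enable faceting
        ("facet.field", facet_field), # facet on one field, e.g. category
    ]
    return f"{base_url}/solr/{collection}/select?{urlencode(params)}"

url = build_solr_select("http://localhost:8983", "products", "laptop", "category")
```

The resulting URL can be fetched with any HTTP client; Solr returns matching documents plus per-facet counts in one JSON response, which is what powers the faceted product-search use cases mentioned above.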
Posted 2 weeks ago
3.0 - 4.0 years
4 - 8 Lacs
Pune
Work from Office
Responsibilities:
- Design, develop, and maintain scalable full-stack applications using Java and React.
- Implement and consume RESTful web services.
- Apply object-oriented design principles, design patterns, and multi-threading best practices.
- Build secure, high-performance backend services using Java and Spring Boot.
- Write clean, maintainable, and testable code, including unit and integration tests (JUnit).
- Design and optimize relational databases (e.g., MySQL, Snowflake) and work with NoSQL databases (e.g., MongoDB).
- Integrate message brokers such as Apache Kafka or RabbitMQ.
- Deploy and scale applications using cloud platforms like AWS or Azure.
- Utilize CI/CD pipelines (e.g., Jenkins) and containerization tools (Docker, Kubernetes).
- Work within Agile/Scrum teams and tools (JIRA, Confluence).
- Collaborate with teams to integrate data visualization tools like Power BI or MicroStrategy when required.

Requirements:
- BE in CS/IT, MSc CS, MCS, or MCA Science.
- 3-4 years of hands-on experience in full-stack development with Java (backend) and React.js (frontend).
- Strong knowledge of Spring Boot, Spring MVC, Spring Security, and Spring Data.
- Proficiency in ReactJS and Redux, with experience building responsive web UIs using HTML5, CSS3, and JavaScript.
- Experience with frontend build tools and bundlers like Webpack.
- Proven experience in developing and integrating RESTful APIs.
- Experience with databases such as MySQL, PostgreSQL, or MongoDB.
- Familiarity with responsive design frameworks and cross-browser compatibility.
- Proficiency in Git, with experience using GitHub, GitLab, or similar repositories.
- Understanding of database schema design and query optimization.
- Excellent communication and collaboration skills.
Posted 2 weeks ago
2.0 - 3.0 years
14 - 16 Lacs
Chennai
Work from Office
The opportunity: As a Data Engineer, you will be part of Operation Center, India (INOPC-PG), aiming to develop a global value chain where key business activities, resources, and expertise are shared across geographic boundaries to optimize value for Hitachi Energy customers across markets. As part of the Transformers BU, we provide high-quality engineering and technology to Hitachi Energy worldwide. This is an important step in Hitachi Energy's global footprint strategy.

How you'll make an impact:
- Display technical expertise in data analytics, focusing on a team of diversified technical competencies.
- Build and maintain accurate and scalable data pipelines and infrastructure (e.g., SQL warehouses, data lakes) using cloud platforms (e.g., MS Azure, Databricks).
- Proactively work with business stakeholders to understand data lineage, definitions, and methods of data extraction.
- Write production-grade SQL and PySpark code to create data architecture.
- Consolidate SQL databases from multiple sources; perform data cleaning and manipulation in preparation for analytics and machine learning.
- Use data visualization tools such as Power BI to create professional-quality dashboards and reports.
- Write good-quality documentation of data processing for different projects to ensure reproducibility.
- Ensure compliance with applicable external and internal regulations, procedures, and guidelines.
- Live Hitachi Energy's core values of safety and integrity, which means taking responsibility for your own actions while caring for your colleagues and the business.

Your background:
- BE / B.Tech in Computer Science, Data Science, or a related discipline, and at least 5 years of related working experience.
- 5 years of data engineering experience, with an understanding of lakehouse architecture, data integration frameworks, ETL/ELT pipelines, orchestration/monitoring, and star-schema data modeling.
- 5 years of experience with Python/PySpark and SQL (proficient in PySpark, Python, and Spark SQL).
- 2-3 years of hands-on data engineering experience using Databricks as the main tool (meaning more than 60% of the time is spent in Databricks rather than using it only occasionally).
- 2-3 years of hands-on experience with different Databricks components (DLT, Workflows, Unity Catalog, SQL Warehouse, CI/CD) in addition to using notebooks.
- Experience with Microsoft Power BI.
- Proficiency in both spoken and written English is required.

Qualified individuals with a disability may request a reasonable accommodation if you are unable or limited in your ability to use or access the Hitachi Energy career site as a result of your disability. You may request reasonable accommodations by completing a general inquiry form on our website. Please include your contact information and specific details about your required accommodation to support you during the job application process.
Posted 2 weeks ago
8.0 - 12.0 years
14 - 18 Lacs
Pune
Work from Office
Job Description

Requirements:
- Knowledgeable and experienced with Microsoft Fabric.
- Design and implement end-to-end data solutions on Microsoft Azure, including data lakes, data warehouses, and ETL/ELT processes.
- Develop scalable and efficient data architectures that support large-scale data processing and analytics workloads.
- Ensure high performance, security, and compliance within Azure data solutions.
- Know various techniques (lakehouse, warehouse) and have experience implementing them.
- Evaluate and choose appropriate Azure services such as Azure SQL Database, Azure Synapse Analytics, Azure Data Lake Storage, Azure Databricks (configuration, costing, etc.), Unity Catalog, and Azure Data Factory; deep knowledge of, and hands-on experience with, these Azure data services is expected.
- Work closely with business and technical teams to understand and translate data needs into robust, scalable data architecture solutions.
- Experience with data governance, data privacy, and compliance requirements.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
- Provide expertise and leadership to the development team implementing data engineering solutions.
- Collaborate with data scientists, analysts, and other stakeholders to ensure data architectures align with business goals and data analysis requirements.
- Optimize cloud-based data infrastructure for performance, cost-effectiveness, and scalability.
- Analyze data workloads and recommend optimizations for performance tuning, cost management, and reduced complexity.
- Monitor and address any issues related to performance and availability in cloud-based data solutions.
- Experience in programming languages (e.g., SQL, Python, Scala).
- Hands-on experience using MS SQL Server, Oracle, or a similar RDBMS platform.
- Experience in Azure DevOps and CI/CD pipeline development.
- Hands-on experience working at a high level in architecture, data science, or a combination of the two.
- In-depth understanding of database structure principles.
- Distributed data processing of big-data batch or streaming pipelines.
- Familiarity with data visualization tools (e.g., Power BI, Tableau).
- Data modeling and strong analytics skills; the candidate must be able to take OLTP data structures and convert them into a star schema. Ideally, the candidate should have DBT experience along with data modeling experience.
- Problem-solving attitude; highly self-motivated, self-directed, and attentive to detail; ability to prioritize and execute tasks effectively.

Attitude and aptitude are highly important at Hitachi; we are a very collaborative group. We would like to see a blend of the following skills. Not all of these are required; however, Databricks and Spark are highly desirable:
- Azure SQL Data Warehouse
- Azure Data Factory
- Azure Data Lake
- Azure Analysis Services
- Databricks/Spark
- Python or Scala (Python preferred)
- Data modeling
- Power BI
- Database migration from legacy systems to new solutions
- Design of conceptual, logical, and physical data models using tools like ER Studio or Erwin

Qualifications: 8-12 years of experience.
Posted 2 weeks ago
8.0 - 12.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Job Description:
- 8-12 years' experience in .NET technologies.
- Hands-on service design, schema design, and application integration design.
- Hands-on software development using C# and .NET Core.
- Use of multiple cloud-native database platforms, including DynamoDB, SQL, ElastiCache, and others.
- Hands-on application design for high availability and resiliency.
- Hands-on problem resolution across a multi-vendor ecosystem.
- Conduct code reviews and peer reviews.
- Unit testing and unit-test automation, defect resolution, and software optimization.
- Actively engage with Client IT and Client Business during daily work sessions.
- Code deployment using CI/CD processes.
- Contribute to each step of the development process, from ideation to implementation to release, including rapid prototyping, running A/B tests, continuous integration, automated testing, and continuous delivery.
- Understand business requirements and technical limitations.
- Ability to learn new technologies and influence the team and leadership to constantly implement modern solutions.
- Experience using the Elasticsearch, Logstash, Kibana (ELK) stack for logging and analytics.
- Experience in container orchestration using Kubernetes.
- Knowledge of, and experience working with, public cloud AWS services.
- Knowledge of cloud architecture and design patterns.
- Ability to prepare documentation for microservices.
- Monitoring tools such as Datadog and Logstash.
- Excellent communication skills.
- Airline industry knowledge is preferred but not required.
Posted 2 weeks ago
8.0 - 12.0 years
11 - 15 Lacs
Chennai
Work from Office
Job Description:
- 8-12 years' experience in .NET technologies.
- Hands-on service design, schema design, and application integration design.
- Hands-on software development using C# and .NET Core.
- Use of multiple cloud-native database platforms, including DynamoDB, SQL, ElastiCache, and others.
- Hands-on application design for high availability and resiliency.
- Hands-on problem resolution across a multi-vendor ecosystem.
- Conduct code reviews and peer reviews.
- Unit testing and unit-test automation, defect resolution, and software optimization.
- Actively engage with Client IT and Client Business during daily work sessions.
- Code deployment using CI/CD processes.
- Contribute to each step of the development process, from ideation to implementation to release, including rapid prototyping, running A/B tests, continuous integration, automated testing, and continuous delivery.
- Understand business requirements and technical limitations.
- Ability to learn new technologies and influence the team and leadership to constantly implement modern solutions.
- Experience using the Elasticsearch, Logstash, Kibana (ELK) stack for logging and analytics.
- Experience in container orchestration using Kubernetes.
- Knowledge of, and experience working with, public cloud AWS services.
- Knowledge of cloud architecture and design patterns.
- Ability to prepare documentation for microservices.
- Monitoring tools such as Datadog and Logstash.
- Excellent communication skills.
- Airline industry knowledge is preferred but not required.
Posted 2 weeks ago
8.0 - 12.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Title: Oracle & MongoDB DBA
Experience: 8-12 Years
Location: Bangalore

Technical Skills:
- Hands-on expertise in data migration between databases, on-prem to AWS cloud RDS.
- Experience using AWS Database Migration Service (DMS).
- Experience in export/import of very large database schemas, full load plus CDC.
- Knowledge of, and experience with, Unix commands and writing task-automation shell scripts.
- Knowledge of, and experience with, different backup/restore methods and backup tools.
- Hands-on experience with core DBA skills: database monitoring, performance tuning, and DB refreshes.
- Hands-on experience in database support.
- Hands-on expertise in data migration between databases, on-prem to MongoDB Atlas.
- Experience in creating clusters and databases and creating users.
- Good command of DB query languages and database architecture.
- Experience in converting schemas from one DB to another is an added advantage.
- Experience in database and server consolidation.
- Strong hands-on experience in building logical data models, data quality, and data security; understand the application lifecycle and build service continuity documents.
- Responsible for building the knowledge base: run books, cheat sheets, DR drill books, and escalation procedures.
- Database refreshes from production to acceptance/development environments.
- Coordinate with the infrastructure/application teams to get the required information.
- Evidence gathering for audits.

Non-Technical Skills:
- Good team player.
- Ownership skills: should be an individual performer able to take on deliverables and handle fresh challenges.
- Service/customer orientation and strong oral and written communication skills are mandatory.
- Should be confident and able to speak with clients and onsite teams.
- Effective interpersonal, team-building, and communication skills.
- Ability to collaborate: communicate clearly and concisely to both laypeople and peers, follow instructions, and make a team stronger for your presence, not weaker.
- Ready to work in rotating shifts (morning, general, and afternoon shifts).
- Ability to see the bigger picture and differing perspectives; to compromise, balance competing priorities, and prioritize the user.
- Desire for continuous improvement of the worthy sort: always be learning and seeking improvement; avoid change aversion and excessive conservatism; equally avoid harmful perfectionism, "not-invented-here" syndrome, and damaging pursuit of the bleeding edge for its own sake.
- Learn things quickly while working outside the area of expertise.
- Analyze a problem and realize exactly what will be affected by even the smallest change you make in the database.
- Ability to communicate complex technology to a non-technical audience in a simple and precise manner.
Posted 2 weeks ago
8.0 - 13.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Title: Oracle & MongoDB DBA
Experience: 8-16 Years
Location: Bangalore

Must have:
- 4-year degree (Computer Science, Information Systems, or equivalent).
- 8+ years overall IT experience (5+ years as a DBA).

Technical Skills:
- Hands-on expertise in data migration between databases, on-prem to AWS cloud RDS.
- Experience using AWS Database Migration Service (DMS).
- Experience in export/import of very large database schemas, full load plus CDC.
- Knowledge of, and experience with, Unix commands and writing task-automation shell scripts.
- Knowledge of, and experience with, different backup/restore methods and backup tools.
- Hands-on experience with core DBA skills: database monitoring, performance tuning, and DB refreshes.
- Hands-on experience in database support.
- Hands-on expertise in data migration between databases, on-prem to MongoDB Atlas.
- Experience in creating clusters and databases and creating users.
- Good command of DB query languages and database architecture.
- Experience in converting schemas from one DB to another is an added advantage.
- Experience in database and server consolidation.
- Strong hands-on experience in building logical data models, data quality, and data security; understand the application lifecycle and build service continuity documents.
- Responsible for building the knowledge base: run books, cheat sheets, DR drill books, and escalation procedures.
- Database refreshes from production to acceptance/development environments.
- Coordinate with the infrastructure/application teams to get the required information.
- Evidence gathering for audits.

Non-Technical Skills:
- Good team player.
- Ownership skills: should be an individual performer able to take on deliverables and handle fresh challenges.
- Service/customer orientation and strong oral and written communication skills are mandatory.
- Should be confident and able to speak with clients and onsite teams.
- Effective interpersonal, team-building, and communication skills.
- Ability to collaborate: communicate clearly and concisely to both laypeople and peers, follow instructions, and make a team stronger for your presence, not weaker.
- Ready to work in rotating shifts (morning, general, and afternoon shifts).
- Ability to see the bigger picture and differing perspectives; to compromise, balance competing priorities, and prioritize the user.
- Desire for continuous improvement of the worthy sort: always be learning and seeking improvement; avoid change aversion and excessive conservatism; equally avoid harmful perfectionism, "not-invented-here" syndrome, and damaging pursuit of the bleeding edge for its own sake.
- Learn things quickly while working outside the area of expertise.
- Analyze a problem and realize exactly what will be affected by even the smallest change you make in the database.
- Ability to communicate complex technology to a non-technical audience in a simple and precise manner.

Skills:
- Primary competency: Data Engineering, Oracle Apps DBA (75%)
- Secondary competency: Data Engineering, MongoDB Apps DBA (25%)
Posted 2 weeks ago
4.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Title: Oracle & MongoDB DBA
Experience: 4-8 Years
Location: Bangalore

Must have:
- 4-year degree (Computer Science, Information Systems, or equivalent).
- 8+ years overall IT experience (5+ years as a DBA).

Technical Skills:
- Hands-on expertise in data migration between databases, on-prem to AWS cloud RDS.
- Experience using AWS Database Migration Service (DMS).
- Experience in export/import of very large database schemas, full load plus CDC.
- Knowledge of, and experience with, Unix commands and writing task-automation shell scripts.
- Knowledge of, and experience with, different backup/restore methods and backup tools.
- Hands-on experience with core DBA skills: database monitoring, performance tuning, and DB refreshes.
- Hands-on experience in database support.
- Hands-on expertise in data migration between databases, on-prem to MongoDB Atlas.
- Experience in creating clusters and databases and creating users.
- Good command of DB query languages and database architecture.
- Experience in converting schemas from one DB to another is an added advantage.
- Experience in database and server consolidation.
- Strong hands-on experience in building logical data models, data quality, and data security; understand the application lifecycle and build service continuity documents.
- Responsible for building the knowledge base: run books, cheat sheets, DR drill books, and escalation procedures.
- Database refreshes from production to acceptance/development environments.
- Coordinate with the infrastructure/application teams to get the required information.
- Evidence gathering for audits.

Non-Technical Skills:
- Good team player.
- Ownership skills: should be an individual performer able to take on deliverables and handle fresh challenges.
- Service/customer orientation and strong oral and written communication skills are mandatory.
- Should be confident and able to speak with clients and onsite teams.
- Effective interpersonal, team-building, and communication skills.
- Ability to collaborate: communicate clearly and concisely to both laypeople and peers, follow instructions, and make a team stronger for your presence, not weaker.
- Ready to work in rotating shifts (morning, general, and afternoon shifts).
- Ability to see the bigger picture and differing perspectives; to compromise, balance competing priorities, and prioritize the user.
- Desire for continuous improvement of the worthy sort: always be learning and seeking improvement; avoid change aversion and excessive conservatism; equally avoid harmful perfectionism, "not-invented-here" syndrome, and damaging pursuit of the bleeding edge for its own sake.
- Learn things quickly while working outside the area of expertise.
- Analyze a problem and realize exactly what will be affected by even the smallest change you make in the database.
- Ability to communicate complex technology to a non-technical audience in a simple and precise manner.
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
The Opportunity: We are seeking a highly skilled Database Developer & Administrator to join our team. The ideal candidate will possess extensive experience in database management, including backup management, query optimization, and performance tuning. This role will focus on both relational and NoSQL databases, with a strong emphasis on MySQL. Required Skills & Experience: Accountability: As the Database Administrator, you will be responsible for ensuring high availability, reliability, and security of MySQL databases. Backup management: safeguarding data through effective backup and recovery strategies. Improving query performance for faster data retrieval and sustaining optimal performance of the database environment. Creating scalable and normalized database structures aligned with business needs. Ensuring seamless operation of databases in cloud environments, minimizing database downtime and resolving issues swiftly. Scope: Act as the subject matter expert (SME) for MySQL database systems. Collaborate with software developers, system administrators, and DevOps teams to align database solutions with business goals. Document database architectures, procedures, and change logs. Ensure database systems comply with organizational security policies and data privacy regulations. Contribute to capacity planning and database scaling strategies. Stay updated with MySQL advancements, patches, and best practices. Outcomes: Manage and maintain the MySQL database infrastructure by monitoring database systems and proactively addressing issues to prevent downtime. Design and implement backup strategies to safeguard data, and test backup and recovery plans to ensure business continuity in case of failures. Work closely with developers to improve query efficiency and reduce load on servers. Tune system configurations, indexes, and queries to ensure high-speed transactions and minimal latency.
Collaborate with development teams to design scalable and efficient data storage structures along with creation and update of database schemas, ensuring normalization and referential integrity.
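The index-and-query tuning work described above can be sketched generically. This is a minimal illustration only; SQLite stands in for MySQL (where the equivalents are `EXPLAIN` and `CREATE INDEX`), and the `orders` table and its columns are invented for the example, not taken from the job description:

```python
import sqlite3

# In-memory database standing in for a MySQL instance; the orders table
# and its columns are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, a lookup on customer_id scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

# Adding an index on the filtered column is the classic tuning step.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

print("before:", plan_before[0][3])  # full-table scan
print("after:", plan_after[0][3])    # index search
```

Comparing the two query plans before and after index creation is exactly the kind of evidence a DBA gathers when reducing server load for developers.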
Posted 2 weeks ago
4.0 - 6.0 years
2 - 6 Lacs
Pune
Work from Office
MicroStrategy Reporting Engineer Key Responsibilities Design, develop and maintain MicroStrategy projects across environments Act as the customer-facing contact point between data engineering and the business, assisting analysts and users with how to use, navigate, and develop their own reports and dashboards on the MicroStrategy platform Build and maintain MicroStrategy as primary BI tool and be the driving force behind the adoption and effective use of MicroStrategy within every team Investigate, improve and optimize object creation for improved user experience. Influence internal and external stakeholders to design and adopt processes that elevate data integrity and facilitate self-help analytics and proper data governance Build and lead impactful and KPI-centric relationships with cross-functional team members Help create and maintain development standards (style guides, naming, etc.) and assist on query reviews for other BI developers or analysts Basic Qualifications Bachelor's degree in a technical field or equivalent technical knowledge and experience. At least 3+ years of experience using MicroStrategy (Developer tool for semantic model creation) Experience in creating, maintaining, and debugging MicroStrategy schema and application objects. Experience designing MicroStrategy reports, dossiers and documents Experience in creating, tuning, and maintaining MicroStrategy cubes Excellent SQL skills, with the ability to write and debug complex queries and perform query tuning High energy and action-oriented with a history of getting things done in complex, fast-moving environments Ability to communicate complex information clearly and concisely. Experience working on projects from feature definition to project deployment through the development lifecycle. Experience informing and assisting peers and other business stakeholders on various business initiatives. Preferred Qualifications Understanding of development processes and agile methodologies. 
Experience within retail, eCommerce is desired Experience with cloud data warehouses, particularly Snowflake is desired Effective analytical, troubleshooting, and problem-solving skills. Experience building data products incrementally and integrating and managing datasets from multiple sources. Experience with MicroStrategy administration tools such as Object Manager and Integrity Manager
Posted 2 weeks ago
10.0 - 15.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Job Title: Technical Architect. Experience Level: 10+ years. The Appian Architect is responsible for leading the design and implementation of enterprise-wide Appian solutions. This role requires a deep understanding of the Appian platform, including its core capabilities, data fabric, Appian AI, RPA, etc. The architect will work closely with key business stakeholders, IT teams, and Appian developers to ensure that Appian implementations align with business goals and IT standards, enhancing operational efficiency and delivering exceptional value. Roles and Responsibilities: Strategic Planning and Consultation:
o Serve as the primary Appian strategy advisor to business and IT leadership.
o Assess business requirements and translate them into effective Appian solutions.
o Lead architectural discussions, influencing decisions regarding Appian implementations.
o Evangelize the usage of reusable frameworks and artifacts; create knowledge/certification artifacts and an evaluation criteria guide.
Design and Implementation:
o Design scalable and sustainable Appian architectures, including integration with other enterprise systems.
o Oversee the development and customization of Appian applications using Appian Designer and other development tools.
o Experience with performance-compliant, sustainable design and solution architecture.
o Leverage modern technologies such as cloud capabilities from various platforms to build efficient solutions.
o Implement features using Appian's native out-of-the-box capabilities, plugins, and third-party components.
Governance and Best Practices:
o Develop and enforce Appian best practices and governance frameworks.
o Ensure solutions are built for performance, reliability, and scalability.
o Manage the Appian platform upgrade process, ensuring compatibility and minimal disruption.
Collaboration and Leadership:
o Lead cross-functional teams in the design, development, and deployment of Appian solutions.
o Facilitate collaboration between stakeholders, developers, and IT operations teams.
o Mentor and develop team members, enhancing their Appian capabilities.
Continuous Improvement:
o Stay abreast of Appian product updates, industry trends, and emerging technologies.
o Recommend and implement improvements to existing Appian solutions.
o Drive innovation by exploring new Appian modules and capabilities such as Appian AI (Email Classification, Document Classification and Extraction, Prompt Builder) and GenAI capabilities via plugins.
Skills and Qualifications: Technical Expertise:
o Extensive experience with Appian's core platform and development tools.
o Proficiency in integration technologies (REST, SOAP, JWT).
o Knowledge of cloud platforms such as AWS and Azure, their services and integrations, is an added advantage.
o Proven experience with key technologies relevant to Appian integration solutions, including SSO, SAML, SSL, LDAP, JDBC, ODBC, REST, etc.
o Excellent knowledge of enterprise security and architecture, middleware and discovery technologies, database design schemas and data modeling.
o Excellent problem-solving and decision-making skills.
o Excellent communication and stakeholder management skills.
Architectural Acumen:
o Strong ability to design scalable, high-performing Appian architectures.
o Experience with Appian application customization and configuration.
Experience & Educational Background:
o A bachelor's or master's degree in Computer Science, Information Technology, or a related field.
o Required certifications: Appian Senior or Lead Developer Certification.
o At least 5+ years of experience designing, developing and architecting on the Appian platform. Must have played the architect role in end-to-end execution of 3-4 Appian projects.
o Exposure to scalable design patterns using Java, J2EE, and microservices-based architecture.
Other Preferred Skills:
o Previous leadership role in an IT-focused consulting services company.
o Project management experience.
o Strong understanding of User Experience (UX) concepts as they relate to applications.
o Certified in an agile framework and the associated Scrum methodology.
o Low Code / No Code development experience in other technologies such as Mendix, OutSystems, etc.
Qualifications: Any Graduate.
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Gurugram
Work from Office
WNS is hiring CAT Modeling professionals for a global reinsurance client across the skill mix mentioned below. Kindly refer to the job description mentioned against the desired skill mix. 1. Portfolio Modeling (3+ years' experience in end-to-end portfolio rollups) 2. Regulatory Reporting (4+ years' experience in EDM/RDM/schema) 3. Model Validation (4+ years' experience in model validation with tools like RMS/AIR) 4. Technical Solution (3+ years' experience in SQL query writing for the CAT Modeling function). 1. Portfolio Modeling: Good understanding of the Cat Modeling process and workflows. Run vendor catastrophe modeling platforms (primarily RMS, AIR, Elements) for insureds and perform portfolio risk analyses. Working knowledge of RMS model scope across worldwide peril-regions regarding sub-perils, amplification, etc., along with a basic understanding of the cat-modelling four-box principle concerning exposure, hazard and vulnerability modules and translation of insurance and (re)insurance financial terms through coding in RMS and SQL. Assist clients in understanding the catastrophe risk of individual insureds through analytics based on catastrophe model results. Provide analytical support to catastrophe modeling team operations by sharing knowledge and information. Develop processes and scripts for process improvements. Provide timely and frequent feedback to team members. Prepare MIS reports. Train and mentor team members inducted into the pricing process. Assist in monthly post-bind and portfolio rollup activities. Ensure all SLAs are met. Communicate with onshore SPOCs at regular intervals. 2. Regulatory Reporting: Roles and Responsibilities: Good understanding of the Cat Modeling process and workflows.
Run vendor catastrophe modeling platforms (primarily RMS, AIR, Elements), including accumulation analysis for reporting needs, whenever required. Thorough knowledge of the RMS EDM-RDM schema. Ability to understand the requirements of regulatory submissions and deliver them accordingly. Understanding of Lloyd's RDS scenarios, including non-modelled scenarios. Working knowledge of regulatory reports such as LCM, RDS scenarios, and terror accumulations and reporting. Working knowledge of any other regulatory reports. Working knowledge of RMS model scope across worldwide peril-regions regarding sub-perils, amplification, etc., along with a basic understanding of the cat-modelling four-box principle concerning exposure, hazard and vulnerability modules and translation of insurance and (re)insurance financial terms. Provide analytical support to catastrophe modeling team operations by sharing knowledge and information. Develop processes and scripts for process improvements. Assist in portfolio rollup activities. Ensure all SLAs are met. Communicate with onshore SPOCs at regular intervals. 3. Model Validation: Perform model validation and provide recommendations on model use and/or required adjustments. Work with internal teams and external data providers on analysis, utilising available data including scientific information, claims and insured exposure. Contribute to and lead Group projects as required, liaising with other teams globally. Produce customised reports on exposure and modelled results. Evaluate re/insurance pricing for individual accounts and product classes. Analyse catastrophe reinsurance structures and strategies to support reinsurance placements. Assist with the analysis of real-time events and identify learnings from post-event reviews.
Strong analytical and numerical ability, in order to interrogate large datasets. Experience working with re/insurance catastrophe data and/or catastrophe modelling software. Excellent written and verbal communication skills, and the ability to explain technical concepts clearly. Intermediate/advanced Excel skills. Proactive attitude to identifying inefficient processes and developing improvements. Desirable Requirements: Knowledge of commercial insurance and/or the catastrophe modelling industry. Sound working knowledge of RMS/AIR and other vendor modelling platforms. Coding experience in a relevant language (e.g. SQL, VBA, R, C#). Experience using mapping software (e.g. GIS). 4. Technical Solution (SQL query): Catastrophe Modelling Analyst in the Accumulation Management department, working with the Technical Solutions team. The Technical Solutions team is focused on developing customized in-house tools and databases for the Accumulation Management team, to streamline processes and organize data in an efficient manner. Technical role with large potential for growth in responsibilities. Develop an understanding of existing catastrophe modelling processes, licensed software, and the various in-house tools used to automate processes. Maintain existing Accumulation Management tools. Debug errors in the code when users experience issues. Assist users with technical questions. Explain how tools work and deliver training sessions when required. Test new functionality prior to launch to ensure that tools are working as intended. Support the team in designing and developing new tools to automate processes. Update user guides when needed. Work with colleagues around the globe on ad-hoc projects. Qualifications: Bachelor's degree in Mathematics, Applied Mathematics, Statistics, Operations Research, or Actuarial Science.
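The SQL accumulation work mentioned for the Technical Solutions role can be sketched at toy scale. This is an invented illustration, not a real EDM query: the `exposure` table, its columns, and the figures are all hypothetical, with SQLite standing in for the SQL Server databases typically behind such tools:

```python
import sqlite3

# Toy exposure table (hypothetical schema) standing in for the kind of
# SQL written against a catastrophe-modelling database for accumulations.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE exposure (policy_id INTEGER, peril TEXT, region TEXT, tiv REAL)"
)
conn.executemany(
    "INSERT INTO exposure VALUES (?, ?, ?, ?)",
    [
        (1, "EQ", "California", 5_000_000.0),
        (2, "EQ", "California", 3_000_000.0),
        (3, "WS", "Florida", 7_500_000.0),
        (4, "WS", "Florida", 2_500_000.0),
    ],
)

# A typical accumulation query: total insured value (TIV) by peril-region.
rows = conn.execute(
    """
    SELECT peril, region, SUM(tiv) AS total_tiv
    FROM exposure
    GROUP BY peril, region
    ORDER BY total_tiv DESC
    """
).fetchall()

for peril, region, total in rows:
    print(f"{peril} / {region}: {total:,.0f}")
```

Grouping exposure by peril-region like this is the basic building block behind the terror-accumulation and portfolio-rollup reports described above.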
Posted 2 weeks ago
5.0 - 10.0 years
8 - 10 Lacs
Jaipur
Work from Office
We are looking for a highly skilled and experienced Ecommerce SEO Specialist to join our team and drive organic traffic, improve search engine rankings, and optimise our ecommerce website ShopLC.com. The ideal candidate will have a deep understanding of ecommerce platforms, technical SEO principles, on-page optimisation, website architecture, site speed optimisation, structured data, XML sitemaps, mobile optimisation, link building and analytics. You will collaborate with cross-functional teams to implement SEO best practices and ensure a strong organic presence for ShopLC across all the major search engines in the US. You will have a pivotal role in one of the business's most exciting and fast-growing departments, assisting a digital transformation that has the potential for a profound effect on the entire business. Role and responsibilities: Conduct comprehensive technical SEO audits to identify and resolve website issues affecting search engine visibility, crawlability, and user experience. Develop and execute comprehensive on-page and off-page SEO strategies for e-commerce, including optimizing category pages, improving internal linking, and building relevant external backlinks. Implement and optimise XML sitemaps, robots.txt files, and canonical tags to guide search engine crawlers and manage duplicate content issues. Identify and fix website errors, broken links, and 404 pages to improve user experience and search engine performance. Conduct keyword research and analysis specific to e-commerce products to identify high-value keywords and phrases. Optimize product pages for search engines, including meta tags, product descriptions, titles, and URLs, to improve organic search rankings and click-through rates. Implement and optimise structured data markup (schema.org) to enhance search engine visibility and improve rich snippets in search results. Ensure mobile-friendliness and responsiveness of the website.
Monitor website performance using SEO tools and platforms (e.g., Google Analytics, Google Search Console) and provide actionable insights and recommendations. Track, analyse, and report on key technical SEO metrics, such as website speed, crawl errors, indexation status, and mobile usability. Conduct keyword research and analysis to uncover SEO opportunities and improve website relevancy. Stay informed about emerging technologies and trends that may impact technical SEO, such as voice search and mobile-first indexing. Stay updated with the latest SEO trends, industry developments, and search engine algorithm updates. Core skills will include: Constantly reviewing success and looking for improvements. A great problem-solving mentality that is able to overcome obstacles and find solutions. An ambitious, energetic self-starter. Up for a challenge and ready to deal with the fast-paced, ever-changing nature of an ecommerce business. Takes responsibility and initiative for actions, projects and people. Qualifications and Experience: Bachelor's degree in marketing, computer science, or a related field. Minimum of 5 years of experience as an SEO expert, with a proven track record of optimising websites for search engine performance. Ecommerce exposure is a must. Extensive experience with technical website optimisation techniques, including website structure, internal linking, XML sitemaps, robots.txt, and canonical tags. Proficiency in using SEO tools and platforms, such as GA4, Google Search Console, Screaming Frog, SEMrush and other website auditing tools. Strong knowledge of SEO best practices, search engine algorithms, and ranking factors. Expertise in website speed optimisation techniques and performance analysis tools. Solid understanding of HTML, CSS, and JavaScript as they relate to technical SEO. Proficiency in mobile optimisation best practices and mobile-friendly website design. Familiarity with structured data markup (schema.org) and its implementation for enhanced search results. Up-to-date with the latest SEO trends, industry developments, and search engine algorithm updates. Excellent analytical skills with the ability to interpret data and make data-driven decisions. Strong communication and collaboration skills to work effectively with cross-functional teams.
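The structured data markup (schema.org) mentioned in this role is commonly emitted as JSON-LD in the page head. A minimal sketch follows; the product values are invented for illustration and are not from the listing:

```python
import json

# Hypothetical product record; in practice the values come from the catalog.
product = {
    "name": "Sterling Silver Pendant",
    "sku": "SKU-12345",
    "price": "49.99",
    "currency": "USD",
}

# Build a schema.org Product object in JSON-LD, the structured-data format
# search engines read to produce rich snippets on product pages.
json_ld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": product["name"],
    "sku": product["sku"],
    "offers": {
        "@type": "Offer",
        "price": product["price"],
        "priceCurrency": product["currency"],
        "availability": "https://schema.org/InStock",
    },
}

# Embed in the page head as a script tag of type application/ld+json.
snippet = (
    '<script type="application/ld+json">'
    + json.dumps(json_ld)
    + "</script>"
)
print(snippet)
```

Tools like Google's Rich Results Test validate exactly this kind of snippet once it is deployed on the product template.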
Posted 2 weeks ago
2.0 - 6.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Develop, test, and maintain robust backend services using Python and Django. Experience with Gen AI and API integration. Design and optimize database schemas using PostgreSQL to support business processes. Build and maintain RESTful APIs for seamless communication between backend and frontend applications. Collaborate with frontend developers to integrate Angular-based UI with backend services. Work with Celery and Redis for task queues and asynchronous processing (preferred). Ensure code quality, security, and performance optimization. Troubleshoot and debug issues to enhance application reliability. Stay updated with the latest industry trends and technologies. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Strong proficiency in Python and the Django web framework. Experience working with Angular (good to have). Solid knowledge of PostgreSQL and database schema design. Familiarity with RESTful API development. Working experience with Celery and Redis is a plus. Understanding of Open edX and Learning Management Systems (LMS) is an added advantage. Knowledge of JavaScript/TypeScript for frontend collaboration. Strong problem-solving skills and ability to work in an agile environment. Preferred technical and professional experience: Bachelor's/Master's degree in Computer Science, Engineering, or a related field. Experience working in EdTech or similar industries. Familiarity with cloud platforms (AWS, Azure, or GCP) is a bonus.
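The schema-design responsibility above can be sketched at miniature scale. This is an assumption-laden illustration: SQLite stands in for PostgreSQL, and the course/student/enrollment tables are invented LMS-flavored examples showing normalization and referential integrity, not anything from the posting:

```python
import sqlite3

# SQLite stands in for PostgreSQL; the tables are hypothetical illustrations
# of a normalized schema with enforced referential integrity.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs FK checks enabled

conn.executescript(
    """
    CREATE TABLE course (
        id INTEGER PRIMARY KEY,
        title TEXT NOT NULL UNIQUE
    );
    CREATE TABLE student (
        id INTEGER PRIMARY KEY,
        email TEXT NOT NULL UNIQUE
    );
    -- Junction table: one row per (student, course) pair, so course titles
    -- and student emails are stored once each (third normal form).
    CREATE TABLE enrollment (
        student_id INTEGER NOT NULL REFERENCES student(id),
        course_id INTEGER NOT NULL REFERENCES course(id),
        PRIMARY KEY (student_id, course_id)
    );
    """
)

conn.execute("INSERT INTO course (id, title) VALUES (1, 'Databases 101')")
conn.execute("INSERT INTO student (id, email) VALUES (1, 'a@example.com')")
conn.execute("INSERT INTO enrollment VALUES (1, 1)")

# Referential integrity: enrolling in a nonexistent course is rejected.
try:
    conn.execute("INSERT INTO enrollment VALUES (1, 999)")
    violated = False
except sqlite3.IntegrityError:
    violated = True
print("FK violation caught:", violated)
```

In PostgreSQL the same constraints are enforced by default, and Django's ORM generates an equivalent schema from `ForeignKey` fields.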
Posted 2 weeks ago
3.0 - 7.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Develop, test, and maintain robust backend services using Python and Django. Hands-on experience with Gen AI applications. Design and optimize database schemas using PostgreSQL to support business processes. Build and maintain RESTful APIs for seamless communication between backend and frontend applications. Collaborate with frontend developers to integrate Angular-based UI with backend services. Work with Celery and Redis for task queues and asynchronous processing (preferred). Ensure code quality, security, and performance optimization. Troubleshoot and debug issues to enhance application reliability. Stay updated with the latest industry trends and technologies. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Strong proficiency in Python and the Django web framework. Experience working with Angular (good to have). Solid knowledge of PostgreSQL and database schema design. Familiarity with RESTful API development. Working experience with Celery and Redis is a plus. Understanding of Open edX and Learning Management Systems (LMS) is an added advantage. Knowledge of JavaScript/TypeScript for frontend collaboration. Strong problem-solving skills and ability to work in an agile environment. Preferred technical and professional experience: Bachelor's/Master's degree in Computer Science, Engineering, or a related field. Experience working in EdTech or similar industries. Familiarity with cloud platforms (AWS, Azure, or GCP) is a bonus.
Posted 2 weeks ago
5.0 - 7.0 years
0 - 0 Lacs
bangalore
On-site
Adobe Campaign | 4+ years | Bangalore | Adobe Campaign Manager 8.6.2, JavaScript, HTML, CSS, JSON. 1. Good communication skills; able to communicate with business stakeholders and translate requirements. 2. Working experience creating schemas, input forms, and JavaScript functions in Adobe Campaign Classic. 3. Strong knowledge of integration/platform setups. 4. Working experience creating technical workflows to consume data from upstream systems or send data to downstream systems. 5. Experience understanding and implementing the integration of Adobe Campaign Classic with different systems, and channel setup enablement. 6. Working experience building workflow templates/delivery templates based on business use cases and producing solutions. 7. Troubleshooting knowledge of campaign workflow execution errors and monitoring system health check-ups. 8. Working experience creating campaigns with Email/SMS/Push Notification channels based on different use cases. 9. Hands-on experience deploying packages from one instance to another.
Posted 2 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Chennai
Work from Office
Hands-on experience in data modelling for both OLTP and OLAP systems. In-depth knowledge of conceptual, logical, and physical data modelling. Strong understanding of indexing, partitioning, and data sharding, with practical experience. Experience identifying and addressing factors affecting database performance for near-real-time reporting and application interaction. Proficiency with at least one data modelling tool (preferably DB Schema). Functional knowledge of the mutual fund industry is a plus. Familiarity with GCP databases such as AlloyDB, Cloud SQL, and BigQuery. Willingness to work from the Chennai customer site (office presence is mandatory), five days of on-site work each week. Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform. Experience: 5-8 Years
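The data-sharding concept called for above boils down to routing each row by a hash of its shard key. A minimal sketch, under assumptions of my own (the shard names and `customer-N` keys are invented; real systems typically use consistent hashing to ease resharding):

```python
import hashlib

# Hash-based sharding: route each row to one of N shards by hashing its
# shard key. Shard names are hypothetical placeholders.
SHARDS = ["shard_0", "shard_1", "shard_2", "shard_3"]

def shard_for(key: str) -> str:
    """Deterministically map a shard key (e.g. a customer id) to a shard."""
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

# The same key always lands on the same shard, so lookups by key
# touch exactly one partition.
keys = [f"customer-{i}" for i in range(1000)]
assignments = [shard_for(k) for k in keys]

# Rough balance check: each shard should receive a comparable share of keys.
counts = {s: assignments.count(s) for s in SHARDS}
print(counts)
```

The trade-off this sketch exposes is the one interviewers probe: hashing spreads load evenly but makes range scans cross every shard, which is why the shard key must match the dominant query pattern.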
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
The company, WNS (Holdings) Limited, a leading Business Process Management (BPM) company, collaborates with clients across various industries to create digital-led transformational solutions. WNS empowers businesses in multiple sectors such as Travel, Insurance, Banking and Financial Services, Manufacturing, Retail, Shipping and Logistics, Healthcare, and Utilities to envision their digital future and enhance outcomes through operational excellence. With a workforce of over 44,000 employees, WNS provides a wide range of BPM services in finance and accounting, procurement, customer interaction services, and human resources, tailored to address the unique challenges of each client. WNS is currently seeking CAT Modeling professionals for a global reinsurance client with expertise in Portfolio Modeling, Regulatory Reporting, Model Validation, and Technical Solution. The ideal candidates should possess specific experience levels in each of these areas, as detailed below: 1. **Portfolio Modeling**: - Understand the Cat Modeling process and workflows - Utilize vendor catastrophe modeling platforms (RMS, AIR, Elements) for insureds and conduct portfolio risk analyses - Demonstrate knowledge of RMS model scope and cat-modelling principles - Assist clients in understanding catastrophe risk through analytics - Provide analytical support, develop processes, and improve team operations - Ensure SLAs are met and communicate with onshore SPOCs regularly 2. **Regulatory Reporting**: - Understand Cat Modeling process and workflows - Run catastrophe modeling platforms for accumulation analysis and regulatory requirements - Thorough knowledge of RMS EDM-RDM schema and regulatory reports - Provide analytical support, develop processes, and assist in portfolio rollup activities - Ensure SLAs are met and maintain communication with onshore SPOCs 3. 
**Model Validation**: - Perform model validation, provide recommendations, and work with internal/external teams - Contribute to Group projects, produce customized reports, and analyze reinsurance structures - Evaluate pricing, analyze real-time events, and demonstrate strong analytical abilities - Excellent written and verbal communication skills, and proficiency in Excel - Desirable: Knowledge of commercial insurance, the catastrophe modeling industry, and coding experience 4. **Technical Solution (SQL query)**: - Work as a Catastrophe Modelling Analyst in the Accumulation Management department - Collaborate with the Technical Solutions team to develop customized tools and databases - Maintain existing tools, assist users, test new functionality, and support team projects - Bachelor's degree in Mathematics, Applied Mathematics, Statistics, Operations Research, or Actuarial Science. In summary, WNS is looking for CAT Modeling professionals with specific experience levels in Portfolio Modeling, Regulatory Reporting, Model Validation, and Technical Solution to join their global reinsurance client team. Successful candidates will contribute to various aspects of catastrophe risk analysis, regulatory reporting, model validation, and technical solutions, while ensuring operational excellence and effective communication with stakeholders.
Posted 2 weeks ago
2.0 - 6.0 years
8 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
":" This is a remote position. Job Description Gather, analyze, and document business and data requirements related to Guidewire implementation and enhancements. Collaborate with stakeholders (business users, technical teams, QA, etc.) to ensure proper understanding of data needs and processes. Work with Guidewiredata model and schema to extract relevant data for reporting, analytics, and integration. Define and validate source-to-target mappings (STTM) for data migration from legacy systems to Guidewire and vice versa. Support data profiling, cleansing, validation, and reconciliation activities. Assist in the design of ETL jobs and data pipelines in collaboration with data engineering teams. Create and maintain business process documentation, use cases, and user stories. Participate in data governance and quality initiatives, ensuring compliance with enterprise standards. Assist with test planning, test case creation, and user acceptance testing (UAT). ","
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
We're Hiring: Backend Engineer (Python) | 3+ Years Experience | Remote/Hybrid Pune | 35 Open Positions. Are you a Python engineer with a passion for building scalable, production-grade APIs? Do you thrive in high-velocity environments where Test Driven Development (TDD) is the norm? Join our team to build real-world services, such as Gmail and Jira, as modular, high-performance APIs. Role: Backend Engineer (Python). Experience Required: 3+ years. Location: Remote/Hybrid (Pune). Open Positions: 35. About the Role: This is a hands-on software engineering position focused on backend service implementation using Python. You'll be working on creating modular APIs for real-world applications, using a strict TDD approach, and building systems that are both fast and reliable. Key Responsibilities: Implement real-world services as modular, production-ready APIs. Follow a strict Test Driven Development (TDD) methodology: write tests first, code second. Build at speed without compromising reliability or maintainability. Design and iterate on scalable, well-structured database schemas. Maintain clear, developer-friendly documentation. Requirements: 3+ years of experience with production-grade Python. Hands-on experience with Test Driven Development (TDD). Proven track record of building and scaling large systems at high velocity. Strong understanding of database schema design and data modeling. Ability to write clean, correct, and maintainable code under tight timelines. Nice to Have: Familiarity with LLM (Large Language Model) function-calling protocols and service integration paradigms. Note: This is a software engineering role. It is not related to data annotation, data science, or analytics. Job Type: Full time. Job Location: Pune. Experience: 3+ years
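The "write tests first, code second" workflow above can be shown in miniature. The `slugify` function is an invented example chosen only to illustrate the TDD rhythm; in a real project the assertions would live in a test suite (e.g. pytest) and would be written, and seen to fail, before the implementation exists:

```python
# Test-first sketch: in TDD the assertions below are written before the
# implementation, pinning down the contract; the function is then written
# to make them pass. slugify is a hypothetical example, not from the role.

def slugify(title: str) -> str:
    """Turn a title into a URL-friendly slug."""
    # Lowercase alphanumerics; everything else becomes a separator.
    cleaned = "".join(c.lower() if c.isalnum() else " " for c in title)
    # Collapse runs of separators into single hyphens.
    return "-".join(cleaned.split())

# The "tests", written first — they define the behavior:
assert slugify("Hello World") == "hello-world"
assert slugify("TDD: write tests first!") == "tdd-write-tests-first"
assert slugify("  spaced   out  ") == "spaced-out"
print("all tests pass")
```

Once the tests pass, the red-green-refactor loop repeats for the next requirement, which is how the posting expects candidates to build APIs at speed without sacrificing reliability.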
Posted 2 weeks ago