
8586 Data Modeling Jobs - Page 35

JobPe aggregates results for easy access, but you apply directly on the original job portal.

10.0 - 12.0 years

14 - 19 Lacs

Mumbai

Work from Office

Job Overview: We are seeking an experienced Salesforce CRM Technical Architect with a strong background in designing and delivering scalable, enterprise-level Salesforce solutions. The ideal candidate has at least 10 years of Salesforce experience and can handle end-to-end project architecture, identify system gaps, and restructure solutions to optimize performance and alignment with business goals.

Key Responsibilities:
- Lead the architecture, design, and development of end-to-end Salesforce CRM solutions.
- Analyze and review existing implementations, identify technical gaps and performance bottlenecks, and propose improvements.
- Work closely with business stakeholders and cross-functional teams to translate business requirements into technical architecture and solution design.
- Oversee integration strategies with external systems and ensure data consistency and accuracy across platforms.
- Guide development teams on coding standards and best practices, and maintain code quality.
- Own and deliver project-level architectural decisions, including platform scaling, governance, and security.
- Provide leadership in troubleshooting and performance tuning of Salesforce systems.
- Mentor and guide developers and junior architects on architecture patterns and practices.

Key Skills & Qualifications:
- 10+ years of overall experience in the Salesforce ecosystem.
- Deep expertise in Salesforce Sales Cloud, Service Cloud, Experience Cloud, and platform customization.
- Strong experience with Apex, Visualforce, Lightning Web Components (LWC), and REST/SOAP APIs.
- Proficient in Salesforce data modeling, security, integration, and governance.
- Ability to conduct code reviews and technical assessments, and to enforce design patterns.
- Experience with tools such as VS Code, Git, Jira, and Jenkins.
- Salesforce Architect certifications are highly desirable (e.g., Application Architect, System Architect; CTA is a plus).
- Excellent problem-solving, communication, and leadership skills.

Preferred Certifications:
- Salesforce Certified Technical Architect (CTA) preferred
- Salesforce Application Architect / System Architect
- Salesforce Platform Developer I & II
- Salesforce Integration Architecture Designer

Posted 1 week ago

Apply

4.0 - 7.0 years

18 - 20 Lacs

Pune

Hybrid

Job Title: GCP Data Engineer
Location: Pune, India
Experience: 4 to 7 Years
Job Type: Full-Time

Job Summary: We are looking for a highly skilled GCP Data Engineer with 4 to 7 years of experience to join our data engineering team in Pune. The ideal candidate has strong experience with Google Cloud Platform (GCP), including Dataproc and Cloud Composer (Apache Airflow), and is proficient in Python, SQL, and Apache Spark. The role involves designing, building, and optimizing data pipelines and workflows to support enterprise-grade analytics and data science initiatives.

Key Responsibilities:
- Design and implement scalable, efficient data pipelines on GCP, leveraging Dataproc, BigQuery, Cloud Storage, and Pub/Sub.
- Develop and manage ETL/ELT workflows using Apache Spark, SQL, and Python.
- Orchestrate and automate data workflows using Cloud Composer (Apache Airflow); see the DAG sketch after this posting.
- Build batch and streaming data processing jobs that integrate data from various structured and unstructured sources.
- Optimize pipeline performance and ensure cost-effective data processing.
- Collaborate with data analysts, scientists, and business teams to understand data requirements and deliver high-quality solutions.
- Implement and monitor data quality checks, validation, and transformation logic.

Required Skills:
- Strong hands-on experience with Google Cloud Platform (GCP)
- Proficiency with Dataproc for big data processing and Apache Spark
- Expertise in Python and SQL for data manipulation and scripting
- Experience with Cloud Composer / Apache Airflow for workflow orchestration
- Knowledge of data modeling, warehousing, and pipeline best practices
- Solid understanding of ETL/ELT architecture and implementation
- Strong troubleshooting and problem-solving skills

Preferred Qualifications:
- GCP Data Engineer or Cloud Architect certification.
- Familiarity with BigQuery, Dataflow, and Pub/Sub.

Interested candidates can send their resume to pranitathapa@onixnet.com
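To make the Cloud Composer orchestration requirement concrete, here is a minimal Airflow DAG sketch in Python: it submits a PySpark job to Dataproc and then runs a placeholder data-quality check. The project, cluster, bucket, and DAG names are hypothetical, invented for illustration rather than taken from the posting.

```python
# A minimal sketch of a Cloud Composer / Apache Airflow DAG for this kind of
# role: submit a Spark job to Dataproc, then run a data-quality gate.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

PYSPARK_JOB = {
    "reference": {"project_id": "my-project"},       # hypothetical project
    "placement": {"cluster_name": "etl-cluster"},    # hypothetical cluster
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/transform.py"},
}

def check_quality(**context):
    # Placeholder check; a real task would query BigQuery row counts
    # or freshness metrics here.
    print("data-quality checks passed")

with DAG(
    dag_id="daily_ingest",                 # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    transform = DataprocSubmitJobOperator(
        task_id="spark_transform",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="my-project",
    )
    quality = PythonOperator(task_id="check_quality", python_callable=check_quality)

    # Run the quality gate only after the Dataproc job succeeds.
    transform >> quality
```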

Posted 1 week ago

Apply

7.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

We are looking for a skilled Senior Power BI Analyst to join our team at Apps Associates (I) Pvt. Ltd, with 7-10 years of experience in the IT Services & Consulting industry.

Roles and Responsibilities:
- Design and develop interactive dashboards using Power BI to provide data-driven insights.
- Collaborate with stakeholders to understand business requirements and develop reports.
- Develop and maintain databases, data models, and ETL processes to support reporting needs.
- Troubleshoot issues with existing reports and resolve them efficiently.
- Work with cross-functional teams to identify areas for process improvement and implement changes.
- Stay updated with the latest features and best practices in Power BI and data visualization.

Job Requirements:
- Strong understanding of data modeling, database design, and ETL processes.
- Proficiency in developing complex queries and writing efficient code.
- Excellent communication and collaboration skills, with the ability to work with stakeholders at all levels.
- Strong problem-solving skills, with the ability to analyze complex issues and develop creative solutions.
- Ability to work in a fast-paced environment, delivering high-quality results under tight deadlines.
- Experience working with large datasets, developing scalable solutions, and optimizing performance.

Posted 1 week ago

Apply

7.0 - 10.0 years

20 - 27 Lacs

Noida

Work from Office

Job Responsibilities:

Technical Leadership:
• Provide technical leadership and mentorship to a team of data engineers.
• Design, architect, and implement highly scalable, resilient, and performant data pipelines; experience with GCP technologies (e.g., Dataproc, Cloud Composer, Pub/Sub, BigQuery) is a plus.
• Guide the team in adopting best practices for data engineering, including CI/CD, infrastructure-as-code, and automated testing (a small test sketch follows this posting).
• Conduct code reviews and design reviews, and provide constructive feedback to team members.
• Stay up to date with the latest technologies and trends in data engineering.

Data Pipeline Development:
• Develop and maintain robust, efficient data pipelines to ingest, process, and transform large volumes of structured and unstructured data from various sources.
• Implement data quality checks and monitoring systems to ensure data accuracy and integrity.
• Collaborate with cross-functional teams and business stakeholders to understand data requirements and deliver data solutions that meet their needs.

Platform Building & Maintenance:
• Design and implement secure, scalable data storage solutions.
• Manage and optimize cloud infrastructure costs related to data engineering workloads.
• Contribute to the development and maintenance of data engineering tooling and infrastructure to improve team productivity and efficiency.

Collaboration & Communication:
• Effectively communicate technical designs and concepts to both technical and non-technical audiences.
• Collaborate effectively with other engineering teams, product managers, and business stakeholders.
• Contribute to knowledge sharing within the team and across the organization.

Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 7+ years of experience in data engineering and software development.
• 7+ years of experience coding in SQL and Python/Java.
• 3+ years of hands-on experience building and managing data pipelines in a cloud environment such as GCP.
• Strong programming skills in Python or Java, with experience developing data-intensive applications.
• Expertise in SQL and data modeling techniques for both transactional and analytical workloads.
• Experience with CI/CD pipelines and automated testing frameworks.
• Excellent communication, interpersonal, and problem-solving skills.
• Experience leading or mentoring a team of engineers.
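Since the posting calls out automated testing for data pipelines, here is a minimal sketch of what such a test can look like: a pytest-style unit test around a toy pandas transform. The transform (add_revenue) and its columns are hypothetical stand-ins, not part of the role description.

```python
# A minimal sketch, assuming pytest and pandas are installed, of a unit test
# for a data-pipeline transform that could run in CI/CD.
import pandas as pd

def add_revenue(df: pd.DataFrame) -> pd.DataFrame:
    """Toy transform under test: revenue = units * unit_price."""
    out = df.copy()
    out["revenue"] = out["units"] * out["unit_price"]
    return out

def test_add_revenue_computes_expected_values():
    df = pd.DataFrame({"units": [2, 3], "unit_price": [10.0, 5.0]})
    result = add_revenue(df)
    assert result["revenue"].tolist() == [20.0, 15.0]

def test_add_revenue_preserves_row_count():
    # Transforms should not silently drop or duplicate rows.
    df = pd.DataFrame({"units": [1], "unit_price": [9.99]})
    assert len(add_revenue(df)) == len(df)
```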

Posted 1 week ago

Apply

3.0 - 7.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Job Title: Planning IT Analyst

Dear Candidates, greetings from ExxonMobil! Please copy and paste the link below into your browser to apply for the position on the company website.

Link to apply: https://jobs.exxonmobil.com/job-invite/80815/

What role you will play in our team: We are seeking a skilled Planning IT Analyst to join our team and provide exceptional support for Workday Adaptive Planning as part of our Enterprise Planning Transformation. The ideal candidate will have a strong background in data analysis and visualization, system integrations, and a passion for leveraging technology to optimize business planning processes. The job location is Bengaluru, Karnataka.

What you will do:
- Collaborate globally to support key projects and ensure effective communication across time zones.
- Develop and maintain system integrations for seamless data flow with Workday Adaptive Planning.
- Utilize SQL skills to optimize integrations and configurations.
- Facilitate data mapping for accurate and efficient data transfer.
- Understand integration requirements and deliver solutions with cross-functional teams.
- Provide on-call support during peak business planning phases.
- Manage and maintain components such as sheets, formulas, access rules, reports, and dashboards within Workday Adaptive Planning.

Skills and Qualifications:
- Strong understanding of system integrations with data warehouses.
- Bachelor's or Master's degree from a recognized university in Computer/IT or other relevant engineering disciplines, with a minimum GPA of 7.0.
- Proficient in implementing data models, data mapping, generating comprehensive reports, and designing interactive dashboards.
- Intermediate or advanced SQL skills; ability to write complex queries, including joins, subqueries, and aggregate functions (see the sketch after this posting).
- Effective collaboration on team-based projects.
- Minimum 5 years of hands-on experience working as a Planning IT analyst.
- Experience working with SQL, Snowflake, or APIs.
- Advanced knowledge of web technologies, including backend REST APIs.

Preferred Qualifications/Experience:
- Strong understanding of financial planning and analysis processes.
- Experience in Workday Adaptive Planning or other Enterprise Performance Management (EPM) tools.
- Comprehensive understanding of APIs, including their design, development, and integration.
- Knowledgeable in annual planning and budgeting, forecasting, and variance analysis.
- Competent in Snowflake’s integration with other data tools and platforms.
- Familiar with scripting languages and frameworks such as Python, Snowpark, etc.
- Working knowledge of cloud and application security: authentication, SSO, etc.

Thanks,
Anita Bhati
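As referenced in the SQL qualification above, here is a small, self-contained illustration of the join / subquery / aggregate skills the posting asks for, runnable with Python's built-in sqlite3 module. The schema and figures are invented for the example.

```python
# A minimal sketch of joins, a subquery, and aggregation using sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dept (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE budget (dept_id INTEGER, year INTEGER, amount REAL);
    INSERT INTO dept VALUES (1, 'Planning'), (2, 'Operations');
    INSERT INTO budget VALUES (1, 2024, 120.0), (1, 2024, 80.0), (2, 2024, 50.0);
""")

# Join + aggregate, with a subquery keeping departments above the average spend.
rows = conn.execute("""
    SELECT d.name, SUM(b.amount) AS total
    FROM dept d
    JOIN budget b ON b.dept_id = d.id
    GROUP BY d.name
    HAVING SUM(b.amount) > (SELECT AVG(amount) FROM budget)
""").fetchall()
print(rows)  # [('Planning', 200.0)]
```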

Posted 1 week ago

Apply

8.0 - 13.0 years

10 - 14 Lacs

Noida, Bengaluru

Work from Office

We are looking for a Senior MongoDB Developer to join our technology team at Clarivate. The successful candidate will be responsible for the design, implementation, and optimization of MongoDB databases to ensure high performance and availability, both on-premises and in the cloud.

About you (experience, education, skills, and accomplishments):
- At least 8 years of experience managing NoSQL databases such as MongoDB (primary), Azure Cosmos DB, and AWS DocumentDB, including performance tuning and architecture design.
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience.
- Proven work experience as a MongoDB/Cosmos/DocumentDB performance, data modelling (document), and architecture expert.
- Experience with database architecture and design to support application development projects.
- Knowledge of MongoDB/Cosmos/DocumentDB features such as sharding and replication (important; see the sketch after this posting).
- Proficient understanding of code versioning tools such as Git.
- Experience with both on-premises and cloud-based database hosting.

It would be great if you also had:
- In-depth understanding of database architectures and data modelling principles.
- Good knowledge of modern DevOps practices and their adoption in database development.
- Experience with data migrations from relational to NoSQL.

What will you be doing in this role?
- Design and implement database schemas that represent and support business processes.
- Develop and maintain database architecture that supports scalability and high availability, both on-premises and in the cloud.
- Optimize database systems for performance efficiency.
- Monitor database performance and adjust parameters for optimal database speed.
- Implement effective security protocols, data protection measures, and storage solutions.
- Run diagnostic tests and quality assurance procedures.
- Document processes, systems, and their associated configurations.
- Work with DBAs and development teams to develop and maintain procedures for database backup and recovery.

About the team: A database management service team covers a wide range of tasks essential for maintaining, optimizing, and securing databases. One end of the spectrum consists of infrastructure DBAs; the other end is DB specialists (SMEs) with a DevOps mindset who partner heavily with the DBAs, application, and DevOps teams. We have a philosophy and open mindset of learning new technology that enhances the business objectives, and we encourage and support growth across the team to become deeply knowledgeable in other database technologies.

Hours of work: This is a full-time opportunity with Clarivate: 9 hours per day, including a lunch break.
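To make the sharding requirement concrete, here is a brief Python sketch using pymongo and MongoDB's standard admin commands (enableSharding, shardCollection, listShards). The connection URI, database, collection, and shard key are hypothetical, and the commands only succeed when run against a sharded cluster's mongos router.

```python
# A minimal sketch, assuming pymongo is installed and the client is connected
# to a sharded cluster's mongos router (not a standalone mongod).
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # hypothetical mongos URI

# Enable sharding for a database, then shard a collection on a hashed key
# so writes are spread evenly across shards.
client.admin.command("enableSharding", "inventory")
client.admin.command(
    "shardCollection",
    "inventory.items",
    key={"item_id": "hashed"},
)

# List the shards the cluster knows about, e.g. for a quick health check.
shards = client.admin.command("listShards")
print([s["_id"] for s in shards["shards"]])
```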

Posted 1 week ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Jaipur

Work from Office

Position Overview: We are seeking a skilled Data Engineer with 2-4 years of experience to design, build, and maintain scalable data pipelines and infrastructure. You will work with modern data technologies to enable data-driven decision making across the organisation.

Key Responsibilities:
- Design and implement ETL/ELT pipelines using Apache Spark and orchestration tools (Airflow/Dagster); a small PySpark sketch follows this posting.
- Build and optimize data models on Snowflake and cloud platforms.
- Collaborate with analytics teams to deliver reliable data for reporting and ML initiatives.
- Monitor pipeline performance, troubleshoot data quality issues, and implement testing frameworks.
- Contribute to data architecture decisions and work with cross-functional teams to deliver quality data solutions.

Required Skills & Experience:
- 2-4 years of experience in data engineering or a related field
- Strong proficiency with Snowflake, including data modeling, performance optimisation, and cost management
- Hands-on experience building data pipelines with Apache Spark (PySpark)
- Experience with workflow orchestration tools (Airflow, Dagster, or similar)
- Proficiency with dbt for data transformation, modeling, and testing
- Proficiency in Python and SQL for data processing and analysis
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services
- Understanding of data warehouse concepts, dimensional modeling, and data lake architectures

Preferred Qualifications:
- Experience with infrastructure-as-code tools (Terraform, CloudFormation)
- Knowledge of streaming technologies (Kafka, Kinesis, Pub/Sub)
- Familiarity with containerisation (Docker, Kubernetes)
- Experience with data quality frameworks and monitoring tools
- Understanding of CI/CD practices for data pipelines
- Knowledge of data catalog and governance tools
- Advanced dbt features, including macros, packages, and documentation
- Experience with table format technologies (Apache Iceberg, Apache Hudi)

Technical Environment:
- Data Warehouse: Snowflake
- Processing: Apache Spark, Python, SQL
- Orchestration: Airflow/Dagster
- Transformation: dbt
- Cloud: AWS/Azure/GCP
- Version Control: Git
- Monitoring: DataDog, Grafana, or similar
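As referenced in the responsibilities above, here is a compact PySpark sketch of a transform-and-validate pipeline step of the kind this role describes. The data, column names, and output path are hypothetical, and pyspark is assumed to be installed.

```python
# A minimal sketch of a Spark transform with a simple data-quality gate.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

orders = spark.createDataFrame(
    [("o1", "c1", 120.0), ("o2", "c1", 80.0), ("o3", None, 50.0)],
    ["order_id", "customer_id", "amount"],
)

# Data-quality gate: reject rows missing a customer key before aggregating.
clean = orders.filter(F.col("customer_id").isNotNull())
bad_rows = orders.count() - clean.count()
assert bad_rows <= 1, f"too many invalid rows: {bad_rows}"

per_customer = clean.groupBy("customer_id").agg(F.sum("amount").alias("total"))
per_customer.show()

# In a real pipeline the result would land in Snowflake or a lake table, e.g.:
# per_customer.write.mode("overwrite").parquet("s3://my-bucket/marts/per_customer")
```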

Posted 1 week ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Roles and Responsibilities:
- Design and develop interactive dashboards using Power BI to provide data-driven insights.
- Collaborate with stakeholders to understand business requirements and develop solutions.
- Develop and maintain databases, data models, and ETL processes to support reporting needs.
- Create reports, visualizations, and analytics to drive business decisions.
- Troubleshoot issues and optimize performance for improved efficiency.
- Work closely with cross-functional teams to ensure seamless integration of Power BI solutions.

Job Requirements:
- Strong understanding of data modeling, database design, and ETL concepts.
- Proficiency in developing complex queries and writing efficient code.
- Excellent communication skills to effectively collaborate with stakeholders.
- Ability to work independently and as part of a team to deliver high-quality results.
- Strong problem-solving skills to analyze complex issues and develop creative solutions.
- Experience working with large datasets and performing data analysis to drive business insights.

Posted 1 week ago

Apply

8.0 - 13.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project description: The Finance Market solutions team requires Senior Axiom Testers to work on the FM Re-platforming project.

Responsibilities:

Axiom testing for capital adequacy and credit risk calculation and reporting:
- Conduct end-to-end testing of LCR, NSFR, Leverage Ratio, Capital Conservation Buffer, Countercyclical Buffer, etc., generated by the Axiom Controller View solution.
- Ensure compliance with regulatory requirements for credit risk calculation and reporting, identifying any gaps in data or reporting logic.
- Collaborate with business analysts, data analysts, and the vendor (Axiom) to validate data sources, calculations, and report formats.

Test case development:
- Develop and maintain detailed test plans, test cases, and test scripts.
- Identify test data requirements and ensure test environments are accurately set up for Axiom testing scenarios.
- Create reusable test scripts to automate reporting tests for accuracy, completeness, and consistency.

Data validation and reconciliation:
- Validate data extraction, transformation, and loading (ETL) processes to ensure the integrity of credit risk calculation and reporting.
- Reconcile Axiom reports with source systems and historical reports to ensure accurate regulatory submissions (a small illustration follows this posting).

Defect management:
- Identify, log, and track defects using appropriate testing tools, ensuring prompt resolution with the development team.
- Collaborate with cross-functional teams to troubleshoot and resolve issues related to reporting functionality, calculations, and data integrity.

Regulatory compliance:
- Stay updated on the latest regulatory requirements for credit risk calculation and reporting.
- Ensure all testing activities align with applicable regulatory guidelines.

Collaboration and reporting:
- Communicate test results and provide regular progress updates to stakeholders, including risk managers, regulatory teams, and senior leadership.
- Assist in preparing and submitting documentation for audit and regulatory reviews.
- Work with the IT team to implement system enhancements and resolve any software-related issues impacting report generation.

Skills

Must have:
- Overall 8+ years of experience, of which 3-5 years working as a tester with the Axiom Controller View platform (ideally with some experience as a Business Analyst as well).
- Proven experience in testing and validating regulatory reports, particularly COREP and FINREP; experience in Capital/Liquidity/Finstat/PRA 110 is beneficial too.
- Experience with regulatory reporting frameworks.

Technical skills:
- Strong expertise in AxiomSL Controller View and its reporting functionalities.
- Proficiency in SQL and data validation techniques.
- Familiarity with ETL processes, data modelling, and financial risk reporting systems.
- Hands-on experience with test management tools (e.g., HP ALM, Jira, or similar).

Soft skills:
- Excellent problem-solving and analytical skills, with keen attention to detail.
- Strong communication skills, capable of articulating complex testing issues to both technical and non-technical stakeholders.
- Ability to work independently and collaboratively in a fast-paced, regulatory-driven environment.

Education: Bachelor's degree in Finance, Accounting, Business, Computer Science, or a related field.

Nice to have:
- A relevant certification (e.g., CFA, FRM, PRM) is a plus.
- Taxonomy knowledge.
- Familiarity with regulatory capital and liquidity frameworks (e.g., Basel III, CRD V/CRR II).
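As referenced above, here is an illustrative Python sketch of the report-to-source reconciliation step, using pandas. The frames, cell identifiers, values, and tolerance are hypothetical and stand in for real Axiom report cells and source-system aggregates.

```python
# A minimal sketch of reconciling report cells against source-system values.
import pandas as pd

source = pd.DataFrame(
    {"cell_id": ["R010_C010", "R020_C010"], "value": [1500.0, 320.0]}
)
axiom_report = pd.DataFrame(
    {"cell_id": ["R010_C010", "R020_C010"], "value": [1500.0, 318.0]}
)

merged = source.merge(axiom_report, on="cell_id", suffixes=("_src", "_rpt"))
merged["diff"] = (merged["value_src"] - merged["value_rpt"]).abs()

TOLERANCE = 0.01  # hypothetical materiality threshold
breaks = merged[merged["diff"] > TOLERANCE]
print(breaks)  # rows that would be logged as defects; here R020_C010
```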

Posted 1 week ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Bengaluru

Hybrid

We are looking for a Senior Healthcare Research and Data Analyst to join our Medtech Insights team in Bangalore. This is an amazing opportunity to work on syndicated market research reports and custom projects for the medical device space.

About you (experience, education, skills, and accomplishments):
- Master's degree (MA or MS in life sciences, or MBA), PharmD, or PhD
- At least 5 years of relevant job experience
- Advanced proficiency in Microsoft Excel, including pivot tables, advanced formulas such as VLOOKUP, and effective data organization strategies
- Experience working on data forecasting models
- Highly self-motivated and able to work independently or with a team to produce high-quality deliverables that require little rework
- Strong critical thinking and analytical skills
- Mature communicator capable of handling high-profile clients from Fortune 500 companies
- Strong presentation and written communication skills
- Strong business writing skills
- Experience leading projects while collaborating with different teams and information sources

It would be great if you also had:
- Well-developed expertise in market research methodologies or related job experience
- Ability to handle tight deadlines
- Excellent creative problem-solving skills
- Strong grasp of microeconomics and how businesses work
- Experience in, or strong interest in, converting unstructured datasets into broadly usable data models
- Business acumen that helps support sales success, ongoing improvements, and relationships with internal and external stakeholders
- An eye for detail and expertise in QC of datasets
- Experience with tools such as Excel, R, Python, or similar
- Experience with project management

What will you be doing in this role?
- Author and audit comprehensive market research reports that provide quantitative and qualitative insights on specific medical device markets around the world.
- Build market models by leveraging Clarivate's proprietary healthcare datasets, while incorporating public financial, regulatory, and government data.
- Interview industry stakeholders and medical professionals to contextualize data and form insights.
- Understand the worldview and pain points of Clarivate clients, working closely with them as a problem-solver. From Fortune 500 companies to smaller startups, you will converse with high-profile clients regularly and be expected to both present and defend conclusions to them.
- Collaborate cross-functionally on larger projects impacting multiple teams.
- Contribute your vision; influence the evolution of our products, data models, and data usage strategy.

About the team: The team consists of 48-50 people and reports to the Director, Medtech Insights. We have great skill sets in market research and forecasting, and we would love to speak with you if you have skills in data research and analysis and content generation.

Hours of work: Hybrid work mode, Monday to Friday, 12:00 PM to 9:00 PM IST.

Posted 1 week ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Noida

Work from Office

Clarivate is on the lookout for a Sr. Software Engineer, ML (machine learning), to join our Patent Service team in Noida. The successful candidate will focus on supporting machine learning (ML) projects: deploying, scaling, and maintaining ML models in production environments, and working closely with data scientists, ML engineers, and software developers to architect robust infrastructure, implement automation pipelines, and ensure the reliability and scalability of our ML systems. The ideal candidate should be eager to learn, equipped with strong hands-on technical and analytical thinking skills, have a passion for teamwork, and stay updated with the latest technological trends.

About you (experience, education, skills, and accomplishments):
- Bachelor's or Master's degree in Engineering (BE, ME, B.Tech, M.Tech, MCA, MS) with strong communication and reasoning abilities.
- Proven experience as a Machine Learning Engineer or in a similar position.
- Deep knowledge of math, probability, statistics, and algorithms.
- Outstanding analytical and problem-solving skills.
- Understanding of data structures, data modeling, and software architecture.
- Good understanding of ML concepts and frameworks (e.g., TensorFlow, Keras, PyTorch).
- Proficiency with Python and basic libraries for machine learning such as scikit-learn and pandas (see the sketch after this posting).
- Expertise in prompt engineering.
- Expertise in visualizing and manipulating big datasets.
- Working experience managing ML workloads in production.
- Experience implementing and/or practicing MLOps or LLMOps concepts.

Additionally, it would be advantageous if you have:
- Experience with Terraform or similar, and IaC in general.
- Familiarity with AWS Bedrock.
- Experience with OCR engines and solutions, e.g., AWS Textract, Google Cloud Vision.
- Interest in exploring and adopting data science methodologies and AI/ML technologies to optimize project outcomes.
- Experience working in a CI/CD setup with multiple environments, with the ability to manage code and deployments towards incrementally faster releases.
- Experience with RDBMS and NoSQL databases, particularly MySQL or PostgreSQL.

What will you be doing in this role? Overall, you will play a pivotal role in driving the success of development projects and achieving business objectives through innovative and efficient agile software development practices.
- Design and develop machine learning systems.
- Implement appropriate ML algorithms; analyze ML algorithms that could be used to solve a given problem and rank them by their probability of success.
- Run machine learning tests and experiments, perform statistical analysis and fine-tuning using test results, and train and retrain systems when necessary.
- Implement monitoring and alerting systems to track the performance and health of ML models in production.
- Ensure security best practices are followed in the deployment and management of ML systems.
- Optimize infrastructure for performance, scalability, and cost efficiency.
- Develop and maintain CI/CD pipelines for automated model training, testing, and deployment.
- Troubleshoot issues related to infrastructure, deployments, and performance of ML models.
- Stay up to date with the latest advancements in ML technologies and evaluate their potential impact on our workflows.

About the team: Our team comprises driven professionals who are deeply committed to leveraging technology to make a tangible impact in the patent services area. Joining us, you'll thrive in a multi-region, cross-cultural environment, collaborating on cutting-edge technologies with a strong emphasis on a user-centric approach.
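As referenced above, here is a minimal scikit-learn sketch of the experiment loop the role describes: compare candidate hyperparameters with cross-validation, then evaluate the best model on held-out data. The dataset is a toy bundled with scikit-learn, not anything from the posting.

```python
# A minimal sketch, assuming scikit-learn is installed, of model selection
# ("running ML tests and experiments") plus held-out evaluation.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Cross-validated search over a small hyperparameter grid.
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [50, 100], "max_depth": [None, 8]},
    cv=3,
)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("test accuracy:", accuracy_score(y_test, search.predict(X_test)))
```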

Posted 1 week ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Bengaluru

Hybrid

About Clarivate: Clarivate provides innovative data and analytical solutions to the largest biopharmaceutical and medical technology companies in the world. Clarivate's Medtech Data team harnesses real-world healthcare data and identifies meaningful insights from large data and metadata sources to help medical device companies make some of their most important business decisions.

Who are you? You are passionate about data and have at least 2 years of hands-on experience wrangling large data using SQL/Python. You are also an effective communicator who can explain complex ideas using clear and concise language, including in writing. You are comfortable collaborating with a diverse group of internal colleagues, including subject matter experts, product managers, sales leaders, technical experts, and other client-facing analysts. You are solution-oriented and understand the importance of timely execution while juggling multiple priorities.

What will you do?
- Understand the worldview and pain points of Clarivate customers, working closely with stakeholders as a problem-solver.
- Effectively build, mine, and manage multiple datasets required for market modelling and analysis.
- Maintain existing Medtech product catalogs.
- Research and understand novel device markets, including major competitors, uses, and product segmentation.
- Evaluate data outputs for market trends and draw insights; correct any potential errors and issues.
- Identify opportunities and issues for data analysis and experiments, with a bias towards driving customer delight.
- Work with clients to understand business requirements and provide data-driven insights.
- Contribute your vision; influence the evolution of our products, data models, and data usage strategy.

What do you know?
- You have strong quantitative foundations, as evidenced by your educational and professional background.
- You are more than proficient with Excel and SQL and want to continue to improve your skills.
- You have a background and/or interest in life sciences and are keen to learn a lot more about medical devices and supplies in the Latin American region.
- You know how to deliver an effective presentation.

Requirements:
- Expertise in understanding data variables and connecting the dots across various datasets.
- Expertise in handling and manipulating large datasets.
- Proficiency in written and oral communication; must be able to communicate complex quantitative ideas to internal and external stakeholders.
- Should have handled short- and long-term projects end to end.

Skills:
- Expertise in MS SQL.
- Strong MS Office and Excel skills are mandatory.
- Analytical skills.
- Ability to do secondary research and synthesize the findings.
- Problem-solving ability.

Education: Any graduate/postgraduate; B.E. or M.Sc. in Biotechnology, Medical Electronics, or Pharmacy preferred.

Preferred (good to have) skills:
- Understanding of Medtech data and healthcare in the Latin American countries.
- Exposure to Medtech or Claims/Pharmacy data.
- Experience working with data from data vendors.
- Expertise in any one (or more) of the data analysis tools/languages such as R/Python is a plus.

Work mode: Hybrid, Monday to Friday, 12:00 pm to 8:00 pm.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Hyderabad, Bengaluru, Secunderabad

Work from Office

Job Summary: The Business Analyst (BA) acts as a liaison between either a development group and an external client/vendor, or a development group and an internal business sponsor. The BA is responsible for the collection/analysis, authoring, and communication of requirements in order to satisfy the client's/sponsor's needs. Depending on the nature of the project, the BA may also undertake a combination of project management, technical implementation, and testing analyst tasks. The BA is expected to have either a strong proprietary knowledge base or multiple years of previous BA experience in requirements analysis and authoring. The Lead Business Analyst is additionally expected to exhibit strong leadership in many areas, including defining and improving standards, mentoring other Senior and Associate BAs, and implementation/project management of critical projects.

Experience, Education, Skills, and Accomplishments:
- Bachelor's degree with a minimum of 5 years of experience as a technical business analyst, or 6 years of relevant experience.
- Process engineering and workflow design, high-level technical design, requirements gathering, and specification elaboration.
- Authoring back-end specifications/technical requirements (databases, APIs, services, service integration, JSON/message queues).

It would be great if you also have:
- JIRA, Confluence
- Data mapping, standardization, and migration
- New product/green-field development projects
- Thriving in an environment where the BA drives the SDLC from ideation through requirements/specifications, supports dev and QA, supports UAT, and supports/guides business implementation
- AWS
- Process automation

What will you be doing in this role?
- Develops and improves business processes within the technology and business organizations; understands client requirements, specifying and analyzing these to a sufficient level of detail to ensure clarity of definition.
- Collects and writes formal specifications and communicates business requirements between development and the client to design and implement business solutions.
- Responsible for building and maintaining relationships with data providers, e.g., exchanges and contributors.
- Responsible for the collection, analysis, and documentation of a client's business needs and requirements.
- Participates in short-term planning sessions with a client to improve a business process within an assigned client area.
- Uses a structured requirements process to assess near-term needs.
- Uses a structured change management process to shepherd projects from requirements gathering through design, testing, implementation, client transition, and ongoing systems maintenance.
- Provides business process and systems analysis and requirements specifications; consults on development and testing management for implementing technology-based editorial business solutions, focusing on increasing productivity, data accuracy, automation, and efficiency while reducing redundancy and costs.
- Responsible for modeling and analyzing client and system processes in order to identify opportunities for technological improvements, process synergies, and business efficiencies.
- Identifies, recommends, and develops methods, tools, and metrics for client business process and systems operational support.
- Provides client systems support in order to resolve issues, and contributes to ongoing systems maintenance and evolution.
- Identifies business priorities and advises the client on options.
- Ensures change management and communication of change are done in a systematic way for projects where initial requirements may evolve during the lifecycle of the project.
- Responsible for generating systems documentation for operational support and end-user information; conducts operational and end-user training and supports transitions of operational support to the client.
- Develops relationships with a client by being proactive, displays a thorough understanding of their business, and provides innovative business solutions. Works with clients to ensure smooth transitions to new systems and/or business processes. Recommends metrics to ensure customer satisfaction.

Posted 1 week ago

Apply

6.0 - 8.0 years

1 - 6 Lacs

Kolkata, Pune

Work from Office

Job Title: Developer
Work Location: Pune (MH), Kolkata (WB)
Skills Required: Oracle, SQL, Databricks
Experience Range: 6-8 Years
Job Description: Financial crime experience; SQL; data modeling; system analysis; Databricks engineering and architecture; database administration; project and resource planning and management.
Essential Skills: Financial crime experience, SQL, data modeling, system analysis, Databricks engineering and architecture, database administration, project and resource planning and management.
Desirable Skills: PySpark

Posted 1 week ago

Apply

7.0 - 12.0 years

14 - 24 Lacs

Hyderabad

Remote

- 7+ years of overall experience, with 3+ years in Celonis EMS implementation and process mining projects.
- Strong expertise in data modeling, event log extraction, ETL, and Celonis PQL scripting.
- Hands-on experience integrating Celonis with Oracle, Salesforce, or ServiceNow systems.
- Proven ability to build dashboards, KPIs, and action flows to identify inefficiencies and drive process improvements.
- Skilled in stakeholder management, requirement gathering, and delivering value-driven solutions using Celonis.
- Excellent communication and analytical skills to work across business and technical teams.

Posted 1 week ago

Apply

8.0 - 12.0 years

14 - 19 Lacs

Hyderabad

Work from Office

The Role: We're looking for an expert in advanced NetApp L3-level support, CVO ONTAP, and ITIL processes (Change Management, Problem Management), with experience implementing automation. The role requires advanced troubleshooting skills and the ability to optimize NAS systems for high-performance operations.

Office Location: Hyderabad (Work from Office)

Key Responsibilities:
- Lead SQL development: Write and maintain complex queries, stored procedures, triggers, functions, and manage views for high-performance data operations.
- Database optimization: Design and implement efficient indexing and partitioning strategies to improve query speed and system performance (a small indexing illustration follows this posting).
- Advanced Python programming: Use Python to build ETL pipelines, automate data workflows, and conduct data wrangling and analysis.
- Data modeling and architecture: Develop scalable data models and work closely with engineering to ensure alignment with database design best practices.
- Business insights: Translate large, complex datasets into clear, actionable insights to support strategic business decisions.
- Cross-functional collaboration: Partner with product, tech, and business stakeholders to gather requirements and deliver analytics solutions.
- Mentorship and review: Guide junior analysts and ensure adherence to coding standards, data quality, and best practices.
- Visualization and reporting: Create dashboards and reports using tools like Power BI, Tableau, or Python-based libraries (e.g., Plotly, Matplotlib).
- Agile and version control: Operate in Agile environments with proficiency in Git, JIRA, and continuous integration for data workflows.
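As referenced in the optimization responsibility above, here is a short, self-contained sketch using Python's built-in sqlite3 module of how adding an index changes a query plan. The table, column names, and data are hypothetical, and the exact plan text varies by SQLite version.

```python
# A minimal sketch: inspect a query plan before and after creating an index.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i, i % 100, "2024-01-01") for i in range(1000)],
)

def plan(sql):
    """Return SQLite's query-plan rows for a statement."""
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"
print(plan(query))  # expect a full table scan, e.g. 'SCAN events'

conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
print(plan(query))  # expect an index search using idx_events_user
```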

Posted 1 week ago

Apply

8.0 - 13.0 years

9 - 13 Lacs

Noida, Bengaluru

Hybrid

We are looking for a Lead Database Developer (Oracle) to join our technology team at Clarivate. The Lead Database Developer will be responsible for overseeing the design, development, and maintenance of high-performance databases using Oracle and PostgreSQL.

About you (experience, education, skills, and accomplishments):
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience.
- At least 8 years of experience in Oracle database environments and PostgreSQL.
- At least 8 years of experience in database performance tuning, query optimization, and capacity planning.
- At least 4 years of experience in cloud-based database services, such as AWS RDS.
- Solid understanding of data security, backup, and recovery procedures.
- Strong understanding of relational database concepts, including primary/foreign keys, many-to-many relationships, and complex join operations.
- Experience in system analysis and design, problem-solving, support, and troubleshooting.
- Experience with cloud database platforms (e.g., AWS RDS, Azure) is a plus.

It would be great if you also had:
- In-depth understanding of database architecture and data modelling principles.
- Good knowledge of NoSQL database solutions, and of AWS and Azure DB solutions and services.

What will you be doing in this role?
- Database design and development: Collaborate with the development team to design and implement efficient database structures that meet the organization's requirements. Develop and maintain database schemas, tables, views, stored procedures, and functions.
- Database performance tuning: Monitor and analyze database performance, identifying and resolving bottlenecks or other performance issues. Optimize queries, indexes, data schemas, and database configurations to enhance system performance. Build proactive service engagement, understanding the database workloads and data model, and use ASH, AWR, and other diagnostic tools to ensure a continually stable and performant database.
- Data security and integrity: Working with DevOps, implement and maintain database security measures to protect sensitive data from unauthorized access or breaches.
- Data integration and ETL: Develop and maintain data integration processes, including Extract, Transform, and Load (ETL) workflows. Collaborate with developers to ensure accurate and timely data extraction, transformation, and loading across various systems.
- Database documentation and reporting: Create and maintain comprehensive documentation for database environments, including data dictionaries, schemas, configurations, and standard operating procedures. Generate regular reports on database performance, capacity, and usage.
- Database backup, recovery, and disaster planning: Working in collaboration with DBAs and DevOps, develop and maintain database backup and recovery strategies to ensure data integrity and availability. Ensure regular backups, test data recovery processes, and create disaster recovery plans.
- Collaboration and support: Work closely with cross-functional teams, including developers, system administrators, and business stakeholders, to understand their database requirements and provide technical support.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Hybrid

The Senior Solutions Analyst acts as a liaison between either a development group and an external client/vendor, or a development group and an internal business sponsor. The Solutions Analyst is responsible for the collection/analysis, authoring, and communication of requirements in order to satisfy the client's/sponsor's needs. Depending on the nature of the project, the role may also undertake a combination of project management, technical implementation, and testing analyst tasks.

Experience, Education, Skills, and Accomplishments:
- Bachelor's degree with a minimum of 5 years of experience as a technical business analyst.
- Process engineering and workflow design, high-level technical design, requirements gathering, and specification elaboration.
- Authoring back-end specifications/technical requirements (databases, APIs, services, service integration, JSON/message queues).

It would be great if you also have:
- JIRA, Confluence
- Data mapping, standardization, and migration
- New product/green-field development projects
- Thriving in an environment where the BA drives the SDLC from ideation through requirements/specifications, supports dev and QA, supports UAT, and supports/guides business implementation
- AWS

What will you be doing in this role?
- Develops and improves business processes within the technology and business organizations; understands client requirements, specifying and analyzing these to a sufficient level of detail to ensure clarity of definition.
- Collects and writes formal specifications and communicates business requirements between development and the client to design and implement business solutions.
- Responsible for building and maintaining relationships with data providers, e.g., exchanges and contributors.
- Responsible for the collection, analysis, and documentation of a client's business needs and requirements.
- Participates in short-term planning sessions with a client to improve a business process within an assigned client area.
- Uses a structured requirements process to assess near-term needs.
- Uses a structured change management process to shepherd projects from requirements gathering through design, testing, implementation, client transition, and ongoing systems maintenance.
- Provides business process and systems analysis and requirements specifications; consults on development and testing management for implementing technology-based editorial business solutions, focusing on increasing productivity, data accuracy, automation, and efficiency while reducing redundancy and costs.
- Responsible for modeling and analyzing client and system processes in order to identify opportunities for technological improvements, process synergies, and business efficiencies.

Posted 1 week ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role: Data Analyst (PySpark/SQL)
Location: Bengaluru
Type: Hybrid
Position: Full Time

We are looking for a Data Analyst with strong expertise in PySpark and SQL.

Roles and Responsibilities: Develop expertise in SQL queries for complex data analysis, and troubleshoot issues related to data extraction, manipulation, transformation, mining, processing, wrangling, reporting, modeling, and classification (see the sketch after this posting).

Desired Candidate Profile: 4-9 years of experience in data analytics with a strong background in the PySpark programming language.
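As referenced above, here is an illustrative sketch of the mixed PySpark/SQL analysis this role centres on: register a DataFrame as a temp view, then analyse it with Spark SQL, including a window function. The data and names are hypothetical, and pyspark is assumed to be installed.

```python
# A minimal sketch combining PySpark DataFrames with Spark SQL analysis.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sales_analysis").getOrCreate()

sales = spark.createDataFrame(
    [("2024-01", "north", 100.0), ("2024-01", "south", 60.0), ("2024-02", "north", 90.0)],
    ["month", "region", "amount"],
)
sales.createOrReplaceTempView("sales")

# Spark SQL for the analysis itself: each region's share of the monthly total.
result = spark.sql("""
    SELECT month,
           region,
           amount,
           amount / SUM(amount) OVER (PARTITION BY month) AS month_share
    FROM sales
    ORDER BY month, region
""")
result.show()
```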

Posted 1 week ago

Apply

4.0 - 5.0 years

3 - 7 Lacs

Chennai

Work from Office

The Finance Market solutions team requires Axiom Testers to work on the FM Re-platforming project.

Responsibilities:

Axiom testing for capital adequacy and credit risk calculation and reporting:
- Conduct end-to-end testing of LCR, NSFR, Leverage Ratio, Capital Conservation Buffer, Countercyclical Buffer, etc., generated by the Axiom Controller View solution.
- Ensure compliance with regulatory requirements for credit risk calculation and reporting, identifying any gaps in data or reporting logic.
- Collaborate with business analysts, data analysts, and the vendor (Axiom) to validate data sources, calculations, and report formats.

Test case development:
- Develop and maintain detailed test plans, test cases, and test scripts.
- Identify test data requirements and ensure test environments are accurately set up for Axiom testing scenarios.
- Create reusable test scripts to automate reporting tests for accuracy, completeness, and consistency.

Data validation and reconciliation:
- Validate data extraction, transformation, and loading (ETL) processes to ensure the integrity of credit risk calculation and reporting.
- Reconcile Axiom reports with source systems and historical reports to ensure accurate regulatory submissions.

Defect management:
- Identify, log, and track defects using appropriate testing tools, ensuring prompt resolution with the development team.
- Collaborate with cross-functional teams to troubleshoot and resolve issues related to reporting functionality, calculations, and data integrity.

Regulatory compliance:
- Stay updated on the latest regulatory requirements for credit risk calculation and reporting.
- Ensure all testing activities align with applicable regulatory guidelines.

Collaboration and reporting:
- Communicate test results and provide regular progress updates to stakeholders, including risk managers, regulatory teams, and senior leadership.
- Assist in preparing and submitting documentation for audit and regulatory reviews.
- Work with the IT team to implement system enhancements and resolve any software-related issues impacting report generation.

Skills

Must have:
- Overall 4 to 5 years of experience, of which 2-3 years working as a tester with the Axiom Controller View platform.
- Proven experience in testing and validating regulatory reports, particularly COREP and FINREP; experience in Capital/Liquidity/Finstat/PRA 110 is beneficial too.
- Experience with regulatory reporting frameworks.

Technical skills:
- Strong expertise in AxiomSL Controller View and its reporting functionalities.
- Proficiency in SQL and data validation techniques.
- Familiarity with ETL processes, data modelling, and financial risk reporting systems.
- Hands-on experience with test management tools (e.g., HP ALM, Jira, or similar).

Soft skills:
- Excellent problem-solving and analytical skills, with keen attention to detail.
- Strong communication skills, capable of articulating complex testing issues to both technical and non-technical stakeholders.
- Ability to work independently and collaboratively in a fast-paced, regulatory-driven environment.

Education: Bachelor's degree in Finance, Accounting, Business, Computer Science, or a related field.

Nice to have:
- A relevant certification (e.g., CFA, FRM, PRM) is a plus.
- Taxonomy knowledge.
- Familiarity with regulatory capital and liquidity frameworks (e.g., Basel III, CRD V/CRR II).

Posted 1 week ago

Apply

4.0 - 5.0 years

7 - 12 Lacs

Bengaluru

Work from Office

The Finance Market solutions team requires Axiom Testers to work on the FM Re-platforming project.

Responsibilities:

Axiom testing for capital adequacy and credit risk calculation and reporting:
- Conduct end-to-end testing of LCR, NSFR, Leverage Ratio, Capital Conservation Buffer, Countercyclical Buffer, etc., generated by the Axiom Controller View solution.
- Ensure compliance with regulatory requirements for credit risk calculation and reporting, identifying any gaps in data or reporting logic.
- Collaborate with business analysts, data analysts, and the vendor (Axiom) to validate data sources, calculations, and report formats.

Test case development:
- Develop and maintain detailed test plans, test cases, and test scripts.
- Identify test data requirements and ensure test environments are accurately set up for Axiom testing scenarios.
- Create reusable test scripts to automate reporting tests for accuracy, completeness, and consistency.

Data validation and reconciliation:
- Validate data extraction, transformation, and loading (ETL) processes to ensure the integrity of credit risk calculation and reporting.
- Reconcile Axiom reports with source systems and historical reports to ensure accurate regulatory submissions.

Defect management:
- Identify, log, and track defects using appropriate testing tools, ensuring prompt resolution with the development team.
- Collaborate with cross-functional teams to troubleshoot and resolve issues related to reporting functionality, calculations, and data integrity.

Regulatory compliance:
- Stay updated on the latest regulatory requirements for credit risk calculation and reporting.
- Ensure all testing activities align with applicable regulatory guidelines.

Collaboration and reporting:
- Communicate test results and provide regular progress updates to stakeholders, including risk managers, regulatory teams, and senior leadership.
- Assist in preparing and submitting documentation for audit and regulatory reviews.
- Work with the IT team to implement system enhancements and resolve any software-related issues impacting report generation.

Skills

Must have:
- Overall 4 to 5 years of experience, of which 2-3 years working as a tester with the Axiom Controller View platform.
- Proven experience in testing and validating regulatory reports, particularly COREP and FINREP; experience in Capital/Liquidity/Finstat/PRA 110 is beneficial too.
- Experience with regulatory reporting frameworks.

Technical skills:
- Strong expertise in AxiomSL Controller View and its reporting functionalities.
- Proficiency in SQL and data validation techniques.
- Familiarity with ETL processes, data modelling, and financial risk reporting systems.
- Hands-on experience with test management tools (e.g., HP ALM, Jira, or similar).

Soft skills:
- Excellent problem-solving and analytical skills, with keen attention to detail.
- Strong communication skills, capable of articulating complex testing issues to both technical and non-technical stakeholders.
- Ability to work independently and collaboratively in a fast-paced, regulatory-driven environment.

Education: Bachelor's degree in Finance, Accounting, Business, Computer Science, or a related field.

Nice to have:
- A relevant certification (e.g., CFA, FRM, PRM) is a plus.
- Taxonomy knowledge.
- Familiarity with regulatory capital and liquidity frameworks (e.g., Basel III, CRD V/CRR II).

Posted 1 week ago

Apply

3.0 - 8.0 years

15 - 19 Lacs

Mumbai

Work from Office

Project description: As an Engineer within the Public Markets Technology Department, you will play a pivotal role in developing and enhancing best-in-class applications that support global investment and co-investment strategies. This role involves close collaboration with both technology and business teams to modernize and evolve the technology landscape, enabling strategic and operational excellence.

Responsibilities:
- PMWB/Cosmos: Provide operational support for PMs in London and Hong Kong for Order Entry and Position Management. Currently in scope are EPM, SSG, and AE, with future expansion for LRA in London and Australia.
- EPM Validator: Onboard, enhance, and maintain fund performance data for users in London and Hong Kong.
- EPM Explorer: NAV and performance loading configuration changes; this requires Python code and config changes.
- Batch: Cosmos and PMWB SOD support; EPM Risk Data; expand monitoring of Data Fabric CMF pipelines to anticipate abnormal run times.
- Support: Partner with the London ARS-FICC and FCT teams to make judgement calls and address failures independently of the Toronto office.

Skills

Must have:
- University degree in Engineering or Computer Science preferred.
- 3+ years of experience in software development.
- Strong knowledge of and demonstrated experience with Python is a must.
- Experience with Java is an asset.
- Experience with relational and non-relational databases is an asset.
- Demonstrated experience developing applications on AWS; AWS certification is preferred.
- Experience in the capital markets industry is nice to have, including knowledge of various financial products and derivatives.
- Strong desire to learn how the business operates and how technology helps it achieve its goals.
- Must have an entrepreneurial attitude, and be able to work in a fast-paced environment and manage competing priorities.
- Experience working in an Agile environment.
- Ability and willingness to adapt and contribute as needed to ensure the team meets its goals.

Nice to have: Public Markets Technology

Posted 1 week ago

Apply

8.0 - 13.0 years

5 - 10 Lacs

Chennai

Work from Office

Project description: The Finance Market solutions team requires Senior Axiom Testers to work on the FM Re-platforming project.

Responsibilities:

Axiom testing for capital adequacy and credit risk calculation and reporting:
- Conduct end-to-end testing of LCR, NSFR, Leverage Ratio, Capital Conservation Buffer, Countercyclical Buffer, etc., generated by the Axiom Controller View solution.
- Ensure compliance with regulatory requirements for credit risk calculation and reporting, identifying any gaps in data or reporting logic.
- Collaborate with business analysts, data analysts, and the vendor (Axiom) to validate data sources, calculations, and report formats.

Test case development:
- Develop and maintain detailed test plans, test cases, and test scripts.
- Identify test data requirements and ensure test environments are accurately set up for Axiom testing scenarios.
- Create reusable test scripts to automate reporting tests for accuracy, completeness, and consistency.

Data validation and reconciliation:
- Validate data extraction, transformation, and loading (ETL) processes to ensure the integrity of credit risk calculation and reporting.
- Reconcile Axiom reports with source systems and historical reports to ensure accurate regulatory submissions.

Defect management:
- Identify, log, and track defects using appropriate testing tools, ensuring prompt resolution with the development team.
- Collaborate with cross-functional teams to troubleshoot and resolve issues related to reporting functionality, calculations, and data integrity.

Regulatory compliance:
- Stay updated on the latest regulatory requirements for credit risk calculation and reporting.
- Ensure all testing activities align with applicable regulatory guidelines.

Collaboration and reporting:
- Communicate test results and provide regular progress updates to stakeholders, including risk managers, regulatory teams, and senior leadership.
- Assist in preparing and submitting documentation for audit and regulatory reviews.
- Work with the IT team to implement system enhancements and resolve any software-related issues impacting report generation.

Skills

Must have:
- Overall 8+ years of experience, of which 3-5 years working as a tester with the Axiom Controller View platform (ideally with some experience as a Business Analyst as well).
- Proven experience in testing and validating regulatory reports, particularly COREP and FINREP; experience in Capital/Liquidity/Finstat/PRA 110 is beneficial too.
- Experience with regulatory reporting frameworks.

Technical skills:
- Strong expertise in AxiomSL Controller View and its reporting functionalities.
- Proficiency in SQL and data validation techniques.
- Familiarity with ETL processes, data modelling, and financial risk reporting systems.
- Hands-on experience with test management tools (e.g., HP ALM, Jira, or similar).

Soft skills:
- Excellent problem-solving and analytical skills, with keen attention to detail.
- Strong communication skills, capable of articulating complex testing issues to both technical and non-technical stakeholders.
- Ability to work independently and collaboratively in a fast-paced, regulatory-driven environment.

Education: Bachelor's degree in Finance, Accounting, Business, Computer Science, or a related field.

Nice to have:
- A relevant certification (e.g., CFA, FRM, PRM) is a plus.
- Taxonomy knowledge.
- Familiarity with regulatory capital and liquidity frameworks (e.g., Basel III, CRD V/CRR II).

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Project description
The contextualization platform enables large-scale data integration and entity matching across heterogeneous sources. The current engineering focus is to modernize the architecture for better scalability and orchestration compatibility, refactor core services, and lay the foundation for future AI-based enhancements. This is a pivotal development initiative with clear roadmap milestones and direct alignment with a multi-year digital transformation strategy.

We are looking for a skilled and motivated Senior Backend Engineer with strong expertise in Kotlin to join a newly established scrum team responsible for enhancing a core data contextualization platform. This service plays a central role in associating and matching data from diverse sources - time series, equipment, documents, 3D objects - into a unified data model. As a Senior Backend Engineer, you will lead backend development efforts to modernize and scale the platform by integrating with an updated data architecture and orchestration framework. This is a high-impact role contributing to a long-term roadmap focused on scalable, maintainable, and secure industrial software.

Key Responsibilities:
- Design, develop, and maintain scalable, API-driven backend services using Kotlin.
- Align backend systems with modern data modeling and orchestration standards.
- Collaborate with engineering, product, and design teams to ensure seamless integration across the broader data platform.
- Implement and refine RESTful APIs following established design guidelines.
- Participate in architecture planning, technical discovery, and integration design for improved platform compatibility and maintainability.
- Conduct load testing, improve unit test coverage, and contribute to reliability engineering efforts.
- Drive software development best practices including code reviews, documentation, and CI/CD process adherence.
- Ensure compliance with multi-cloud design standards and use of infrastructure-as-code tooling (Kubernetes, Terraform).

Qualifications:
- 7+ years of backend development experience, with a strong focus on Kotlin.
- Proven ability to design and maintain robust, API-centric microservices.
- Hands-on experience with Kubernetes-based deployments, cloud-agnostic infrastructure, and modern CI/CD workflows.
- Solid knowledge of PostgreSQL, Elasticsearch, and object storage systems.
- Strong understanding of distributed systems, data modeling, and software scalability principles.
- Excellent communication skills and ability to work in a cross-functional, English-speaking environment.
- Bachelor's or Master's degree in Computer Science or related discipline.

Bonus Qualifications:
- Experience with Python for auxiliary services, data processing, or SDK usage.
- Knowledge of data contextualization or entity resolution techniques (a toy sketch follows this posting).
- Familiarity with 3D data models, industrial data structures, or hierarchical asset relationships.
- Exposure to LLM-based matching or AI-enhanced data processing (not required but a plus).
- Experience with Terraform, Prometheus, and scalable backend performance testing.

About the role and key responsibilities:
- Develop Data Fusion - a robust, state-of-the-art SaaS for industrial data.
- Solve concrete industrial data problems by designing and implementing delightful APIs and robust services on top of Data Fusion.
- Work with application teams to ensure a delightful user experience that helps the user solve complex real-world problems.
- Work with distributed open-source software such as Kubernetes, Kafka, Spark, and similar to build scalable and performant solutions.
- Help shape the culture and methodology of a rapidly growing company.

GlobalLogic offers a culture of caring, learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust organization where integrity is key. Join us to be part of a trusted digital engineering partner creating innovative digital products and experiences.
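Since Python is listed above as a bonus skill for data processing, here is a toy sketch of the kind of name-based entity matching the platform performs; the normalisation and greedy string-similarity approach are an assumption for illustration, not the platform's actual algorithm:

```python
from difflib import SequenceMatcher


def normalise(name: str) -> str:
    """Normalise identifiers so '21-PT-1019' and '21 PT 1019' compare equal."""
    return "".join(ch for ch in name.lower() if ch.isalnum())


def match(series_tags: list[str], equipment_tags: list[str],
          threshold: float = 0.8) -> list[tuple[str, str, float]]:
    """Greedily pair each time-series tag with its best-scoring equipment tag."""
    pairs = []
    for ts in series_tags:
        scored = [
            (eq, SequenceMatcher(None, normalise(ts), normalise(eq)).ratio())
            for eq in equipment_tags
        ]
        best, score = max(scored, key=lambda item: item[1])
        if score >= threshold:
            pairs.append((ts, best, round(score, 2)))
    return pairs


# Example: match sensor time-series tags to equipment identifiers.
print(match(["21-PT-1019.PV", "21-TT-2042.PV"], ["21-PT-1019", "21-TT-2042"]))
```

A production contextualization pipeline would layer richer signals (asset hierarchies, units, document references, 3D relationships) on top of this kind of string-similarity baseline.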

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Thane, Maharashtra

On-site

As a seasoned Salesforce Developer at our fast-growing international IT company, you will be responsible for developing and maintaining custom Salesforce applications using Apex, LWC, Triggers, and Visualforce. Your role will involve building scalable, secure, and reusable components that adhere to Salesforce best practices. You will integrate Salesforce with internal and external systems through REST/SOAP APIs and middleware (a minimal REST sketch follows this posting). Throughout the development lifecycle, you will participate in technical design, development, testing, deployment, and support phases. Collaboration is key, as you will work closely with functional teams to gather requirements and translate them into technical designs.

In addition to your development responsibilities, you will also mentor junior developers and conduct peer code reviews. Ensuring code quality through unit testing, performance tuning, and adherence to governance policies will be essential. You will also support deployment activities and troubleshoot any post-deployment issues that may arise.

To excel in this role, you should hold a Bachelor's degree in Computer Science, Engineering, or a related field, along with at least 6 years of hands-on experience in Salesforce platform development. Your expertise should encompass Apex, SOQL, SOSL, Lightning Web Components (LWC), and Visualforce. A strong understanding of Salesforce architecture, data modeling, security, and performance optimization is crucial. Experience in building integrations using Salesforce APIs and working with external systems is highly valued. Familiarity with tools such as Data Loader, Workbench, Developer Console, and VS Code with Salesforce CLI is expected. Proficiency in version control using Git and CI/CD tools like Copado, Gearset, or Jenkins will be beneficial. Holding a Salesforce Platform Developer I certification is mandatory for this role, with Platform Developer II certification being preferred.

Join us at NTT DATA Business Solutions and be a part of a team that transforms SAP solutions into value. If you have any questions regarding this opportunity, please feel free to reach out to our Recruiter, Pragya Kalra, at Pragya.Kalra@nttdata.com.
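For the REST-based integration work mentioned above, here is a minimal Python sketch of querying Salesforce's standard REST API (the query resource under /services/data). The instance URL, access token, and API version are placeholders, and a real integration would obtain credentials through an OAuth flow rather than hard-coding them:

```python
import requests  # third-party HTTP client

# Placeholder values -- a real integration would obtain these via an OAuth flow.
INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # hypothetical instance
ACCESS_TOKEN = "00D...session_token"                      # hypothetical token


def soql_query(soql: str) -> list[dict]:
    """Run a SOQL query through the Salesforce REST API and return the records."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v58.0/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]


if __name__ == "__main__":
    for account in soql_query("SELECT Id, Name FROM Account LIMIT 5"):
        print(account["Id"], account["Name"])
```

The same /services/data/<version>/ base path also exposes the sobjects resources used for record creation and updates, which is where write-side integrations typically live.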

Posted 1 week ago

Apply