2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
The IQVIA Digital Data team is growing and looking for super curious, passionate, and driven individuals to join the team. Our people are our greatest asset and we're committed to creating an environment where we all thrive doing what we love. IQVIA Digital is the world's most comprehensive platform for healthcare marketing and analytics. It is changing the way healthcare marketing is done by leveraging the latest cloud-native solutions, efficient big data pipelines, processes, and technologies. We work with the largest pharmaceutical brands and media agencies in the US. We empower media planning, buying, and analytics teams with the tools they need to do their job, and do it well. By simplifying workflows that used to take days into seconds, integrating functionality that used to require multiple vendors into one, and providing faster and deeper insights than anyone in the industry, we are helping healthcare marketers cut their costs, move faster, and drive measurable results.

We are looking for a driven and dynamic Senior Data Engineer who will be responsible for expanding and maintaining our data warehouse, developing scalable data products, and helping orchestrate terabytes of data flowing through our platform. This individual will work directly with a group of cross-functional engineers and product owners on our reporting and statistical aggregations, leveraging best-practice engineering standards to ensure secure and successful data solutions.

About the Job
- Construct data pipelines using Airflow and Cloud Functions to meet business requirements set by the Reporting Product and Engineering teams (see the sketch below)
- Maintain and optimize table schemas, views, and queries in our data warehouse and databases
- Perform ad-hoc analysis to troubleshoot stakeholder issues surrounding data and provide insights into feature usage
- Document data architecture and integration efforts to give other team members a clear understanding of the data platform
- Provide guidance on data best practices when building out new product lines
- Mentor a team of engineers

Must have
- Experience with data task orchestration (Airflow, cron, Prefect, etc.) with dependency mapping
- Data analysis and data modeling
- Strong experience in Python, SQL, and shell scripting
- Experience interacting with APIs, SFTP, and cloud storage locations (e.g., GCS, S3)
- Analytical problem-solving skills, including the ability to analyze application logs to troubleshoot issues
- Familiarity with cloud computing (GCP a plus)
- Experience developing and implementing statistical models

Nice to have
- Experience with JavaScript
- Hands-on work with Airflow
- Exposure to producer/consumer messaging systems
- Has led a small team of developers

IQVIA is a leading global provider of clinical research services, commercial insights, and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
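As a rough illustration of the orchestration work this posting describes, here is a minimal Airflow DAG sketch. The DAG id, task names, and empty callables are hypothetical placeholders, not part of the role; a real pipeline would add retries, alerting, and actual extract/aggregate logic:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_media_spend():
    """Pull raw delivery data from a vendor API or a GCS drop (placeholder)."""
    ...


def aggregate_for_reporting():
    """Roll the raw rows up into the reporting-layer schema (placeholder)."""
    ...


with DAG(
    dag_id="media_spend_daily",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_media_spend)
    aggregate = PythonOperator(task_id="aggregate", python_callable=aggregate_for_reporting)

    # Dependency mapping: aggregate runs only after extract succeeds
    extract >> aggregate
```

The `>>` operator is how Airflow expresses the dependency mapping the "Must have" list asks about: downstream tasks wait on upstream success rather than on wall-clock timing, as a plain cron job would.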
Posted 1 week ago
3.0 - 7.0 years
7 - 11 Lacs
Pune
Work from Office
Join a dynamic engineering team to contribute to the development and implementation of scalable digital commerce solutions. This role involves working across the full stack, with a focus on Shopify and other commerce platforms, ensuring performant, secure, and maintainable solutions for global clients.

Job Description:

Key Responsibilities
Full Stack Development: Develop and maintain eCommerce platforms using Shopify and other technologies such as Adobe Commerce or custom stacks. Implement and support headless commerce architectures using Shopify Hydrogen, Storefront API, and GraphQL (see the sketch below). Build responsive frontend interfaces using React, Next.js, or Angular. Design backend services and APIs with Node.js, Express, or similar frameworks.
Integration & Cloud: Integrate third-party systems including payment gateways, ERP, CMS, and analytics tools. Collaborate on deployment strategies using cloud platforms like AWS, GCP, or Azure. Support CI/CD pipelines and DevOps best practices.
Code Quality & Collaboration: Follow best practices in coding, testing, and documentation. Work closely with senior engineers, architects, and designers to deliver high-quality features. Participate in code reviews and knowledge-sharing sessions.
Client & Team Interaction: Communicate technical solutions clearly to stakeholders. Collaborate with cross-functional teams in agile environments. Take ownership of deliverables and contribute to sprint planning and estimation.

Qualifications & Skills
Experience: 3+ years of professional experience in full stack development. Hands-on experience with eCommerce platforms, especially Shopify (Shopify+, Hydrogen, Storefront API). Exposure to Adobe Commerce, SAP Commerce, or custom commerce platforms is a plus.
Technical Skills: Proficient in modern frontend frameworks: React.js, Next.js, or Angular. Skilled in backend development with Node.js and Express.js; bonus for Java or .NET exposure. Good understanding of REST/GraphQL APIs, authentication, and data modeling. Familiarity with Git, CI/CD tools, and DevOps workflows. Basic experience with cloud services (AWS/GCP/Azure) for deployments and hosting.
Mindset & Soft Skills: Strong problem-solving and debugging skills. Detail-oriented, quality-conscious, and eager to learn. Team player with good communication and collaboration abilities. Passionate about eCommerce technology and user experience.

Location: Pune
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
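To make the headless-commerce piece concrete, here is a minimal sketch of querying the Shopify Storefront GraphQL API from Python. The shop subdomain and access token are placeholders; in a headless build the same query would typically be issued from the Hydrogen/React frontend rather than a script:

```python
import requests

SHOP = "example-store"                 # placeholder shop subdomain
TOKEN = "storefront-access-token"      # placeholder Storefront API token
URL = f"https://{SHOP}.myshopify.com/api/2024-07/graphql.json"

QUERY = """
{
  products(first: 5) {
    edges { node { title handle } }
  }
}
"""

resp = requests.post(
    URL,
    json={"query": QUERY},
    headers={"X-Shopify-Storefront-Access-Token": TOKEN},
    timeout=10,
)
resp.raise_for_status()

# Print the first few products returned by the storefront catalog
for edge in resp.json()["data"]["products"]["edges"]:
    print(edge["node"]["title"], "->", edge["node"]["handle"])
```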
Posted 1 week ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Country: India
Number of Openings*: 1
Approved ECMS RQ#: 533568
Duration of contract*: 6 Months
Total Yrs. of Experience*: 8+ years
Relevant Yrs. of experience*: 8+ years

Detailed JD (Roles and Responsibilities):
We are seeking a highly skilled and experienced Database Developer to join our team. The ideal candidate will have a strong background in SQL, SQL Server, BigQuery, data modelling, SSIS, and ETL processes. You will be responsible for designing, developing, and maintaining robust database solutions that support business operations and analytics.

Key Responsibilities:
> Design and implement efficient database solutions and models to store and retrieve company data.
> Develop and optimize SQL queries, stored procedures, and functions.
> Work with SQL Server and BigQuery to manage large datasets and ensure data integrity (see the sketch below).
> Build and maintain ETL pipelines using SSIS and other tools.
> Collaborate with data analysts, software developers, and business stakeholders to understand data requirements.
> Perform data profiling, cleansing, and transformation to support analytics and reporting.
> Monitor database performance and implement improvements.
> Ensure security and compliance standards are met in all database solutions.

Required Skills & Qualifications:
> 8-12 years of hands-on experience in database development.
> Strong proficiency in SQL and SQL Server.
> Experience with Google BigQuery and cloud-based data solutions.
> Expertise in data modelling and relational database design.
> Proficient in SSIS and ETL development.
> Solid understanding of performance tuning and optimization techniques.
> Excellent problem-solving and analytical skills.
> Strong communication and collaboration abilities.

Mandatory skills*: SQL, SQL Server, BigQuery, SSIS
Desired skills*: Data Modelling, ETL
Domain*: Payments
Client name (for internal purpose only)*: NatWest
Approx. vendor billing rate (INR/Day): 10000 INR/Day
Work Location*: Chennai or Bangalore or Gurgaon
Background check process to be followed*: Yes
Before onboarding / After onboarding*: Before Onboarding
BGV Agency*: Any NASSCOM approved
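For the BigQuery side of the role, a minimal sketch of a parameterized query using the official `google-cloud-bigquery` client follows; the project, dataset, and column names are hypothetical, and credentials are assumed to come from the environment (application-default credentials):

```python
from google.cloud import bigquery

client = bigquery.Client()  # picks up application-default credentials

# Parameterized queries avoid string concatenation and make plans cacheable
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("min_amount", "INT64", 100)]
)

sql = """
    SELECT merchant_id, SUM(amount) AS total_amount
    FROM `my_project.payments.transactions`   -- hypothetical dataset/table
    WHERE amount >= @min_amount
    GROUP BY merchant_id
    ORDER BY total_amount DESC
"""

for row in client.query(sql, job_config=job_config).result():
    print(row.merchant_id, row.total_amount)
```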
Posted 1 week ago
4.0 - 7.0 years
20 - 32 Lacs
Bengaluru
Work from Office
ECMS ID/Title:
Number of Openings: 3
Duration of contract: 6 months
No. of years experience: Relevant 5+ years and Total 8+ years
Detailed job description - Skill Set: Attached
Mandatory Skills*: Power BI / UI developer
Good to Have Skills: Power BI
Vendor billing range: 6000-9000/Day
Remote option available: Hybrid mode
Work location: Chennai
Start date: Immediate
Client interview / F2F applicable: Yes
Background check process to be followed: Before onboarding
BGV Agency: Pre (1 month BGV)

Master's degree or equivalent experience. Minimum of 5 years of experience in data visualization, UI development, or related roles. Relevant certifications in Power BI, data analysis, or related technologies are a plus. Proven track record of developing Power BI dashboards and reports to meet business requirements. Strong attention to detail and commitment to delivering high-quality user interfaces and design solutions. Experience in user testing and incorporating feedback to improve design and functionality. Ability to analyze complex data sets and derive meaningful insights to support business decision-making. Experience collaborating with data scientists, analysts, and business stakeholders as well as development, operations, and security teams to deliver data-driven solutions. Ability to work effectively in a team-oriented environment. Strong communication skills to articulate technical concepts to non-technical stakeholders. Demonstrated ability to identify and resolve technical issues efficiently. Innovative mindset with a focus on continuous improvement and automation. Ability to adapt to new technologies and methodologies in a fast-paced environment. The person must have the ability to work in a multicultural environment; have excellent process, functional, communication, teamwork, and interpersonal skills; and be willing to work in a team environment to support other technical staff as needed. The person should have a high tolerance for ambiguity.

The Power BI / UI Developer needs to be well versed with:
- Creating complex Power BI reports and dashboards with advanced data visualization techniques.
- DAX (Data Analysis Expressions) for creating custom calculations in Power BI.
- Power Query for data transformation and manipulation.
- Data modeling concepts and best practices in Power BI.
- Integrating Power BI with various data sources, including Azure SQL Database and Azure Data Lake.
- Designing and implementing user-friendly UI/UX for Power BI dashboards.
- PowerApps for building custom business applications.
- Azure Synapse Analytics for handling big data workloads.
- Azure Data Factory for orchestrating data workflows.
- Azure Blob Storage for storing and managing large datasets.
- Implementing row-level security in Power BI for data protection.
- Version control systems like Git for managing Power BI projects.
- REST APIs for integrating Power BI with other applications (see the sketch after this list).
- Power Automate for automating workflows and processes.
- SQL for querying and managing data in Azure databases.
- Azure Active Directory for managing user access and authentication.
- Troubleshooting and optimizing Power BI performance issues.
- Designing visually appealing and functional user interfaces and reports.
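As one concrete example of the REST-API integration point above, the sketch below queues a dataset refresh through the Power BI REST API. The workspace and dataset IDs are placeholders, and obtaining the Azure AD bearer token (e.g., via MSAL) is assumed to happen elsewhere:

```python
import requests

TOKEN = "azure-ad-bearer-token"   # placeholder; acquired via Azure AD / MSAL
GROUP_ID = "workspace-guid"       # placeholder workspace (group) id
DATASET_ID = "dataset-guid"       # placeholder dataset id

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes"
)

resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"}, timeout=30)
resp.raise_for_status()  # a 202 Accepted response means the refresh was queued
print("Refresh queued:", resp.status_code)
```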
Posted 1 week ago
3.0 - 6.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Country: India
Number of Openings*: 1
Approved ECMS RQ#: 533573
Duration of contract*: 6 Months
Total Yrs. of Experience*: 8+ years
Relevant Yrs. of experience*: 8+ years

Detailed JD (Roles and Responsibilities):
We are looking for a seasoned GCP Engineer with 8-10 years of experience in cloud infrastructure and automation. The ideal candidate will hold a GCP Architecture Certification and possess deep expertise in Terraform, GitLab, shell scripting, and a wide range of GCP services including Compute Engine, Cloud Storage, Dataflow, BigQuery, and IAM. You will be responsible for designing, implementing, and maintaining scalable cloud solutions that meet business and technical requirements.

Key Responsibilities:
> Design and implement secure, scalable, and highly available cloud infrastructure on Google Cloud Platform.
> Automate infrastructure provisioning and configuration using Terraform.
> Manage CI/CD pipelines using GitLab for efficient deployment and integration.
> Develop and maintain shell scripts for automation and system management tasks.
> Utilize GCP services such as Compute Engine, Cloud Storage, Dataflow, BigQuery, and IAM to support data and application workflows.
> Ensure compliance with security policies and manage access controls using IAM.
> Monitor system performance and troubleshoot issues across cloud environments.
> Collaborate with cross-functional teams to understand requirements and deliver cloud-based solutions.

Required Skills & Qualifications:
> 8-12 years of experience in cloud engineering or infrastructure roles.
> GCP Architecture Certification is mandatory.
> Strong hands-on experience with Terraform and infrastructure-as-code practices.
> Proficiency in GitLab for version control and CI/CD.
> Solid experience in shell scripting for automation.
> Deep understanding of GCP services: Compute Engine, Cloud Storage, Dataflow, BigQuery, and IAM.
> Strong problem-solving skills and ability to work independently.
> Excellent communication and collaboration skills.

Mandatory skills*: SQL, SQL Server, BigQuery, SSIS
Desired skills*: Data Modelling, ETL
Domain*: Payments
Client name (for internal purpose only)*: NatWest
Approx. vendor billing rate (INR/Day): 10000 INR/Day
Work Location*: Chennai or Bangalore or Gurgaon
Background check process to be followed*: Yes
Before onboarding / After onboarding*: Before Onboarding
BGV Agency*: Any NASSCOM approved
Mode of Interview (Telephonic/Face to Face/Skype)*: Teams virtual followed by F2F
WFO / WFH / Hybrid: Hybrid
Any Certification (Mandatory): GCP Architecture Certification
Shift Time:
Business travel required (Yes/No): No
Client BTP / SHTP: UK
Posted 1 week ago
5.0 - 8.0 years
7 - 8 Lacs
Bengaluru
Work from Office
We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you'll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.

Role Overview
An Azure Data Engineer specializing in Databricks is responsible for designing, building, and maintaining scalable data solutions on the Azure cloud platform, with a focus on leveraging Databricks and related big data technologies. The role involves close collaboration with data scientists, analysts, and software engineers to ensure efficient data processing, integration, and delivery for analytics and business intelligence needs.

Key Responsibilities
- Design, develop, and maintain robust and scalable data pipelines using Azure Databricks, Azure Data Factory, and other Azure services (see the sketch below).
- Build and optimize data architectures to support large-scale data processing and analytics.
- Collaborate with cross-functional teams to gather requirements and deliver data solutions tailored to business needs.
- Ensure data quality, integrity, and security across various data sources and pipelines.
- Implement data governance, compliance, and best practices for data security (e.g., encryption, RBAC).
- Monitor, troubleshoot, and optimize data pipeline performance, ensuring reliability and scalability.
- Document technical specifications, data pipeline processes, and architectural decisions.
- Support and troubleshoot data workflows, ensuring consistent data delivery and availability for analytics and reporting.
- Automate data tasks and deploy production-ready code using CI/CD practices.
- Stay updated with the latest Azure and Databricks features, recommending improvements and adopting new tools as appropriate.

Required Skills and Qualifications
- Bachelor's degree in computer science, engineering, or a related field.
- 5+ years of experience in data engineering, with hands-on expertise in Azure and Databricks environments.
- Proficiency in Databricks, Apache Spark, and Spark SQL.
- Strong programming skills in Python and/or Scala.
- Advanced SQL skills and experience with relational and NoSQL databases.
- Experience with ETL processes, data warehousing concepts, and big data technologies (e.g., Hadoop, Kafka).
- Familiarity with Azure services: Azure Data Lake Storage (ADLS), Azure Data Factory, Azure SQL Data Warehouse, Cosmos DB, Azure Stream Analytics, Azure Functions.
- Understanding of data modeling, schema design, and data integration best practices.
- Strong analytical, problem-solving, and troubleshooting abilities.
- Experience with source code control systems (e.g., Git) and technical documentation tools.
- Excellent communication and collaboration skills; ability to work both independently and as part of a team.

Preferred Skills
- Experience with automation, unit testing, and CI/CD pipelines.
- Certifications in Azure Data Engineering or Databricks are advantageous.

Soft Skills
- Flexible, self-starter, and proactive in learning and adopting new technologies.
- Ability to manage multiple priorities and work to tight deadlines.
- Strong stakeholder management and teamwork capabilities.
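A minimal PySpark sketch of the kind of Databricks pipeline this role describes, assuming a hypothetical ADLS landing path and target table; on Databricks the `spark` session is provided for you, so the `getOrCreate()` line matters only when running elsewhere:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # pre-created as `spark` on Databricks

# Read raw CSV drops from a hypothetical ADLS Gen2 container
raw = (spark.read
       .option("header", "true")
       .csv("abfss://landing@examplelake.dfs.core.windows.net/orders/"))

# Cast and aggregate into a reporting-friendly shape
daily = (raw
         .withColumn("amount", F.col("amount").cast("double"))
         .groupBy("order_date")
         .agg(F.sum("amount").alias("daily_revenue")))

# Persist as a managed Delta table for BI consumption (hypothetical name)
(daily.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("analytics.daily_revenue"))
```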
Posted 1 week ago
0.0 - 2.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Database design, SQL query optimization, data modeling, and troubleshooting database issues.

Responsibilities
- Database Development and Design: Create, maintain, and optimize SQL databases, including tables, views, stored procedures, and functions.
- SQL Query Optimization: Write efficient and performant SQL queries, and analyze existing queries for performance improvements (see the sketch after this listing).
- Data Modeling: Design and implement data models to effectively represent and manage data within the database.
- Troubleshooting and Debugging: Identify and resolve issues with database queries and applications.
- Collaboration: Work with other developers, business analysts, and stakeholders to understand requirements and integrate database solutions.
- Database Administration: Perform tasks related to database backup, recovery, and security.
- Data Analysis and Reporting: Generate and analyze reports from SQL databases to support decision-making.
- Staying Up-to-Date: Keep abreast of emerging database technologies and best practices.

Skills
- SQL Programming: Proficient in SQL syntax and database management systems (DBMS) like MySQL, Oracle, or Microsoft SQL Server.
- Data Modeling: Understanding of database design principles and data modeling techniques.
- SQL Query Optimization: Ability to write and optimize SQL queries for performance.
- Problem-Solving: Strong analytical and problem-solving skills to troubleshoot database issues.
- Communication: Ability to communicate effectively with other developers and stakeholders.

Experience
- Proven experience in SQL development or related roles.
- Bachelor's degree in Computer Science, Information Technology, or a related field is often preferred.
- Relevant work experience in SQL development, database design, or database administration.
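A small self-contained sketch of the query-optimization workflow: create an index that supports a common lookup, then confirm with the execution plan that the engine uses it. SQLite stands in here so the example runs anywhere; the table and column names are illustrative, and on SQL Server the equivalent check would be an actual execution plan:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, amount REAL);
    -- A covering-ish index that supports per-customer lookups
    CREATE INDEX idx_orders_customer ON orders(customer_id);
""")

# Parameterized query; EXPLAIN QUERY PLAN shows whether the index is used
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = ?",
    (42,),
).fetchall()
print(plan)  # expect a SEARCH using idx_orders_customer, not a full SCAN
```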
Posted 1 week ago
0.0 - 2.0 years
8 - 12 Lacs
Bengaluru
Work from Office
We are looking for an experienced data engineer to join our team. You will use various methods to transform raw data into useful data systems. For example, you'll create algorithms and conduct statistical analysis. Overall, you'll strive for efficiency by aligning data systems with business goals. To succeed in this data engineering position, you should have strong analytical skills and the ability to combine data from different sources. Data engineer skills also include familiarity with several programming languages and knowledge of machine learning methods.

Responsibilities
- Analyze and organize raw data
- Build data systems and pipelines (see the sketch below)
- Interpret trends and patterns
- Conduct complex data analysis and report on results
- Prepare data for prescriptive and predictive modeling
- Build algorithms and prototypes
- Combine raw information from different sources
- Explore ways to enhance data quality and reliability
- Identify opportunities for data acquisition
- Develop analytical tools and programs

Requirements
- Previous experience as a data engineer or in a similar role
- Technical expertise with data models, data mining, and segmentation techniques
- Knowledge of programming languages (e.g., Java and Python)
- Hands-on experience with SQL database design
- Degree in Computer Science, IT, or a similar field; a Master's is a plus

The focus will be on building out our Python ETL processes and writing superb SQL. You will use agile software development processes to make iterative improvements to our back-end systems, model front-end and back-end data sources to help draw a more comprehensive picture of user flows throughout the system and enable powerful data analysis, and build data pipelines that clean, transform, and aggregate data from disparate sources.
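A compact pandas sketch of the extract-transform-load loop the posting emphasizes: combine two hypothetical sources, clean the join key, and aggregate into an analysis-ready table. File names and columns are placeholders:

```python
import pandas as pd

# Extract: raw rows from two disparate (hypothetical) sources
crm = pd.read_csv("crm_export.csv")
web = pd.read_json("web_events.json", lines=True)

# Transform: normalize the join key and drop obviously bad rows
crm["email"] = crm["email"].str.strip().str.lower()
web = web.dropna(subset=["email"])

# Combine and aggregate into one tidy table for analysis
merged = crm.merge(web, on="email", how="inner")
summary = merged.groupby("campaign", as_index=False)["revenue"].sum()

# Load: persist the cleaned aggregate for downstream reporting
summary.to_csv("campaign_revenue.csv", index=False)
```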
Posted 1 week ago
5.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Pune, Chennai
Hybrid
Role & Responsibilities
We are looking for a skilled Data Modeller with 5 to 8 years of hands-on experience in designing and maintaining robust data models for enterprise data solutions. The ideal candidate has a strong foundation in dimensional, relational, and semantic data modelling and is ready to expand into data engineering technologies and practices. This is a unique opportunity to influence enterprise-wide data architecture while growing your career in modern data engineering.

Required Skills & Experience:
- 5 to 8 years of experience in data modelling with tools such as Erwin, ER/Studio, dbt, PowerDesigner, or equivalent.
- Strong understanding of relational databases, star/snowflake schemas, normalization, and denormalization (see the star-schema sketch after this list).
- Experience working with SQL, stored procedures, and performance tuning of data queries.
- Exposure to data warehousing concepts and BI tools (e.g., Tableau, Power BI, Looker).
- Familiarity with data governance, metadata management, and data cataloging tools.
- Excellent communication and documentation skills.
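To make the star-schema requirement concrete, here is a minimal physical-model sketch: denormalized dimension tables keyed into a central fact table. SQLite is used only so the DDL runs anywhere; the table and column names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimensions: descriptive attributes, deliberately denormalized for query speed
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT, year INT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, sku TEXT, category TEXT);

    -- Fact: one row per sale, with foreign keys into each dimension (the star's points)
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        amount      REAL
    );
""")
```

A snowflake variant would further normalize the dimensions (e.g., splitting `category` into its own table), trading some query simplicity for reduced redundancy.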
Posted 1 week ago
5.0 - 9.0 years
10 - 17 Lacs
Chennai
Hybrid
Position Description:
Job Title: Salesforce Data Cloud
Experience: 5 to 7 Years

Job Summary: We are looking for a skilled Salesforce Data Cloud Developer with strong experience in both development and administration. The ideal candidate will be responsible for designing and implementing scalable solutions on Salesforce Data Cloud, managing platform configurations, and working closely with business stakeholders to gather and understand requirements.

Key Responsibilities:
- Design, develop, and deploy custom solutions using Salesforce Data Cloud
- Perform administrative tasks such as user management, security settings, and data configuration
- Collaborate with business teams to gather, analyze, and translate requirements into technical solutions
- Build and maintain data models, integrations, and automation workflows
- Ensure data integrity, security, and compliance with governance standards
- Troubleshoot and resolve issues related to performance, data quality, and system behavior
- Stay updated with Salesforce releases and recommend best practices

Required Skills:
- Strong hands-on experience with Salesforce Data Cloud and the core Salesforce platform
- Solid understanding of data modelling and integration patterns
- Solid understanding of Data Streams, Data Lakes, Data Models, Data Transforms, and Data Analysis
- Experience working with segments, activations, and calculated insights
- Experience with Salesforce administration tasks and declarative tools
- Excellent communication skills to interact with business users and translate needs into solutions
- Salesforce certifications

Skills Required: Salesforce
Experience Required: 3 to 6 years
Experience Preferred: As listed under Required Skills above
Education Required: Bachelor's Degree
Posted 1 week ago
8.0 - 13.0 years
27 - 42 Lacs
Kolkata, Pune, Chennai
Hybrid
Job Description
The role requires you to design and implement data modeling solutions using relational, dimensional, and NoSQL databases. You will work closely with data architects to design bespoke databases using a mixture of conceptual, physical, and logical data models.

Job title: Data Modeler
Location (hybrid role): Bangalore, Chennai, Gurgaon, Pune, Kolkata
Interviews: 3 rounds of 30-45 minute video-based Teams interviews
Employment Type: Permanent full time with Tredence
Total Experience: 9-13 years
Required Skills: Data Modeling, Dimensional Modeling, Erwin, Data Management, RDBMS, SQL/NoSQL, ETL

What we look for:
- BE/B.Tech or equivalent.
- The data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.
- 9-13 years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others) required.
- Experience in team management, communication, and presentation.
- Be responsible for the development of conceptual, logical, and physical data models, and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL).
- Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices.
- The candidate must be able to work independently and collaboratively.

Responsibilities
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
Posted 1 week ago
10.0 - 13.0 years
0 - 3 Lacs
Chennai, Bengaluru
Work from Office
Position: MicroStrategy Reporting
Work Mode: Hybrid
Location: Chennai, Bangalore
Experience: 10 years
Primary Skills: MicroStrategy, SQL, ETL concepts/data modeling
Posted 1 week ago
5.0 - 10.0 years
35 - 40 Lacs
Bengaluru
Work from Office
As a Senior Data Engineer, you will proactively design and implement data solutions that support our business needs while adhering to data protection and privacy standards. In addition, you will be required to manage the technical delivery of the project, lead the overall development effort, and ensure timely and quality delivery.

Responsibilities:
- Data Acquisition: Proactively design and implement processes for acquiring data from both internal systems and external data providers. Understand the various data types involved in the data lifecycle, including raw, curated, and lake data, to ensure effective data integration.
- SQL Development: Develop advanced SQL queries within database frameworks to produce semantic data layers that facilitate accurate reporting. This includes optimizing queries for performance and ensuring data quality.
- Linux Command Line: Utilize Linux command-line tools and functions, such as bash shell scripts, cron jobs, grep, and awk, to perform data processing tasks efficiently. This involves automating workflows and managing data pipelines.
- Data Protection: Ensure compliance with data protection and privacy requirements, including regulations like GDPR. This includes implementing best practices for data handling and maintaining the confidentiality of sensitive information.
- Documentation: Create and maintain clear documentation of designs and workflows using tools like Confluence and Visio. This ensures that stakeholders can easily communicate and understand technical specifications.
- API Integration and Data Formats: Collaborate with RESTful APIs and AWS services (such as S3, Glue, and Lambda) to facilitate seamless data integration and automation. Demonstrate proficiency in parsing and working with various data formats, including CSV and Parquet, to support diverse data processing needs (see the sketch after this listing).

Key Requirements:
- 5+ years of experience as a Data Engineer, focusing on ETL development.
- 3+ years of experience in SQL and writing complex queries for data retrieval and manipulation.
- 3+ years of experience with the Linux command line and bash scripting.
- Familiarity with data modelling in analytical databases.
- Strong understanding of backend data structures, with experience collaborating with data engineers (Teradata, Databricks, AWS S3 Parquet/CSV).
- Experience with RESTful APIs and AWS services like S3, Glue, and Lambda.
- Experience using Confluence for tracking documentation.
- Strong communication and collaboration skills, with the ability to interact effectively with stakeholders at all levels.
- Ability to work independently and manage multiple tasks and priorities in a dynamic environment.
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.

Good to Have:
- Experience with Spark.
- Understanding of data visualization tools, particularly Tableau.
- Knowledge of data clean room techniques and integration methodologies.
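A short sketch of the S3/Parquet and semantic-layer work described above, reading a Parquet drop straight from S3 with pandas (which requires the `s3fs` and `pyarrow` packages). The bucket, path, and columns are placeholders:

```python
import pandas as pd

# Read a Parquet file directly from S3 (needs s3fs installed; path is hypothetical)
df = pd.read_parquet("s3://example-curated-bucket/daily/events.parquet")

# Build a small semantic layer: one tidy table that reporting can query directly
semantic = (
    df.assign(event_date=pd.to_datetime(df["event_ts"]).dt.date)
      .groupby(["event_date", "channel"], as_index=False)["impressions"]
      .sum()
)
print(semantic.head())
```

In a production pipeline the same read/aggregate step would typically run inside a Glue job or Lambda, with AWS credentials supplied by the execution role rather than locally.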
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Bengaluru
Work from Office
Data Modeler - SA

Role Overview
We are looking for an experienced Data Modeler with a strong foundation in dimensional data modeling and a proven ability to design and maintain conceptual, logical, and physical data models. The ideal candidate will have a minimum of 5+ years of experience in data modeling and architecture, preferably within the banking or financial services industry.

Key Responsibilities
- Design, develop, and maintain dimensional data models to support analytics and reporting.
- Design conceptual, logical, and physical data models.
- Utilize AWS services for scalable data model design.
- Align data models with business rules and governance standards.
- Collaborate with business stakeholders, data architects, and engineers to ensure data models align with business rules and data governance standards.
- Translate business requirements into scalable and efficient data models.
- Maintain comprehensive documentation for data models, metadata, and data dictionaries.
- Ensure consistency and integrity of data models across systems and platforms.
- Partner with data engineering teams to implement models in AWS-based environments, including Redshift, Glue, and Lake Formation.

Required Skills and Qualifications
- 5+ years of experience in data modeling, with a focus on dimensional modeling and data warehouse design.
- Proficiency in developing conceptual, logical, and physical data models.
- Strong understanding of data governance, data quality, and metadata management.
- Hands-on experience with AWS services such as Redshift, Glue, and Lake Formation.
- Familiarity with data modeling tools (e.g., ER/Studio, ERwin, or similar).
- Excellent communication skills and ability to work with cross-functional teams.

Preferred Qualifications
- Experience in the banking or financial services sector.
- Knowledge of data lake architecture and modern data stack tools.
- AWS or data modeling certifications are a plus.
Posted 1 week ago
7.0 - 12.0 years
25 - 27 Lacs
Pune, Bengaluru
Hybrid
Role: Data Analyst / Senior Data Analyst
Experience: 7+ years
Location: Bangalore/Pune

Responsibilities:
- Define and obtain the source data required to successfully deliver insights and use cases.
- Determine the data mapping required to join multiple data sets together across multiple sources.
- Create methods to highlight and report data inconsistencies, allowing users to review and provide feedback (see the sketch after this listing).
- Propose suitable data migration sets to the relevant stakeholders.
- Assist teams with processing the data migration sets as required.
- Assist with the planning, tracking, and coordination of the data migration team, the migration run-book, and the scope for each customer.

Role Requirements:
- Strong Data Analyst with Financial Services experience.
- Knowledge of and experience using data models and data dictionaries in a Banking and Financial Markets context.
- Knowledge of one or more of the following domains (including market data vendors): Party/Client; Trade; Settlements; Payments; Instrument and pricing; Market and/or Credit Risk.
- Demonstrate a continual desire to implement "strategic" or "optimal" solutions and, where possible, avoid workarounds or short-term tactical solutions.
- Work with stakeholders to ensure that negative customer and business impacts are avoided.
- Manage stakeholder expectations and ensure that robust communication and escalation mechanisms are in place across the project portfolio.
- Good understanding of the control requirements surrounding data handling.

Experience/Skillset
Must have:
- Excellent analytical skills and commercial acumen.
- Proficient in Python, PySpark, and SQL.
- Good understanding of the control requirements surrounding data handling.
- Experience of big data programmes preferable.
- Strong verbal and written communication skills.
- Strong self-starter with strong change delivery skills who enjoys the challenge of delivering change within tight deadlines.
- Ability to manage multiple priorities.
- Business analysis skills, defining and understanding requirements.
- Knowledge of and experience using data models and data dictionaries in a Banking and Financial Markets context.
- Can write SQL queries and navigate databases, especially Hive; comfortable with CMD, PuTTY, and Notepad++.
- Enthusiastic and energetic problem solver keen to join an ambitious team.
- Good knowledge of SDLC and formal Agile processes, a bias towards TDD, and a willingness to test products as part of the delivery cycle.
- Ability to communicate effectively in a multi-programme environment across a range of stakeholders.
- Attention to detail.

Good to have:
- Knowledge and experience in Data Quality & Governance.
- For Spark: working experience using Scala (preferable) or Java for Spark.
- For Senior DAs: a proven track record of managing small delivery-focused data teams.
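A minimal PySpark sketch of the inconsistency-reporting task above: flag trades whose counterparty is missing from the party master using a left-anti join. The Hive table and column names are hypothetical:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

parties = spark.table("ref.party")   # hypothetical party master in Hive
trades = spark.table("ops.trade")    # hypothetical trade table

# left_anti keeps only trade rows with no matching party_id in the master
orphans = trades.join(parties, trades.party_id == parties.party_id, "left_anti")

print(orphans.count(), "trades reference an unknown party")
orphans.select("trade_id", "party_id").show(20, truncate=False)
```

The resulting exception list is exactly the kind of artifact users can review and feed back on before a migration set is finalized.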
Posted 1 week ago
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Job Summary
We are seeking a skilled Data Modeler to join our data management team. The Data Modeler will design, implement, and maintain conceptual, logical, and physical data models to support business intelligence, analytics, and operational systems. This role involves collaborating with cross-functional teams to ensure data models align with organizational goals, optimize data storage and retrieval, and maintain data integrity and consistency. The ideal candidate will have strong technical expertise in data modeling tools, database management systems, and a deep understanding of business processes.

Years of experience needed: 5-12 years of hands-on experience in data modeling, including conceptual, logical, and physical models. Experience with data warehousing, ETL processes, and business intelligence environments is preferred.

Technical Skills:
- Proficiency in data modeling tools such as ER/Studio, ERwin, PowerDesigner, Lucidchart, or IBM InfoSphere Data Architect.
- Strong knowledge of relational database management systems (RDBMS) like SQL Server, Oracle, MySQL, PostgreSQL, or NoSQL databases.
- Familiarity with SQL, T-SQL, Python, or other programming languages for data manipulation and automation.
- Experience with data warehousing concepts, ETL processes, and dimensional modeling (e.g., star/snowflake schemas).
- Understanding of data governance, metadata management, and data quality practices.

Responsibilities:
- Design and Develop Data Models: Create conceptual, logical, and physical data models to support business applications, analytics, and reporting requirements, using modeling techniques such as Entity-Relationship (ER) diagrams, UML, or dimensional modeling.
- Collaborate with Stakeholders: Work with business analysts, data architects, and other stakeholders to gather and analyze data requirements and translate them into effective data structures.
- Optimize Data Systems: Evaluate and optimize existing data models for performance, scalability, and usability, ensuring reduced data redundancy and efficient data flows.
- Maintain Data Integrity: Implement data governance practices, including defining naming conventions, standards, and metadata management to ensure consistency, accuracy, and security of data.
- Develop and Document: Create and maintain data dictionaries, metadata repositories, and documentation for data models to ensure clarity and accessibility across the organization.
- Support Data Integration: Collaborate with ETL developers, data engineers, and database administrators to design data flows, source-to-target mappings, and integration processes.
- Troubleshoot and Enhance: Analyze and resolve data model performance issues, conduct data quality assessments, and recommend improvements to data architecture and processes.
- Stay Current: Keep up to date with industry trends, best practices, and emerging technologies in data modeling, database management, and analytics.

Qualification:
Education: Master's or bachelor's degree in Computer Science, Information Systems, Data Science, Applied Mathematics, or a related field.
Posted 1 week ago
3.0 - 5.0 years
5 - 9 Lacs
Mumbai
Work from Office
Role Purpose
The purpose of the role is to liaise with and bridge the gap between the customer and the Wipro delivery team: to comprehend and analyze customer requirements and articulate them aptly to delivery teams, thereby ensuring the right solutioning for the customer.

Do

1. Customer requirements gathering and engagement
- Interface and coordinate with client engagement partners to understand the RFP/RFI requirements.
- Detail out scope documents, functional and non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured.
- Construct workflow charts and diagrams, study system capabilities, and write specifications after thorough research and analysis of customer requirements.
- Engage and interact with the internal team (project managers, pre-sales team, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs.
- Understand and communicate the financial and operational impact of any changes.
- Hold a periodic cadence with customers to seek clarifications and feedback on the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes in the design.
- Empower customers through demonstration and presentation of the proposed solution/prototype.
- Maintain relationships with customers to optimize business integration and lead generation.
- Ensure ongoing reviews and feedback from customers to improve and deliver better value (services/products) to the customers.

2. Engage with the delivery team to ensure the right solution is proposed to the customer
a. Hold a periodic cadence with the delivery team to:
- Provide them with customer feedback/inputs on the proposed solution.
- Review the test cases to check 100% coverage of customer requirements.
- Conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer.
- Deploy and facilitate new change requests to cater to customer needs and requirements.
- Support the QA team with periodic testing to ensure solutions meet business needs by giving timely inputs/feedback.
- Conduct integration testing and user acceptance demos to validate implemented solutions and ensure a 100% success rate.
- Use data modelling practices to analyze the findings and design and develop improvements and changes.
- Ensure 100% utilization by studying system capabilities and understanding business specifications.
- Stitch together the entire response/solution proposed for the RFP/RFI before it is presented to the customer.
b. Support the Project Manager/delivery team in delivering the solution to the customer
- Define and plan project milestones, phases, and the different elements involved in the project along with the principal consultant.
- Drive and challenge the presumptions of delivery teams on how they will successfully execute their plans.
- Ensure customer satisfaction through quality deliverables on time.

3. Build domain expertise and contribute to the knowledge repository
- Engage and interact with other BAs to share expertise and increase domain knowledge across the vertical.
- Write whitepapers/research papers and points of view, and share them with the consulting community at large.
- Identify and create use cases from different projects/accounts that can be brought to the Wipro level for business enhancements.
- Conduct market research for content and development to provide the latest inputs into projects, thereby ensuring customer delight.

Mandatory Skills: Business Analysis
Experience: 3-5 Years
Posted 1 week ago
8.0 - 10.0 years
5 - 9 Lacs
Navi Mumbai
Work from Office
The candidate should have 8 to 10 years of total experience in the Storage & Backup technology domain and be able to provide consultancy and recommendations on storage in the areas below:
- Recommend the definition and assignment of tier profiles based on their performance, availability, recoverability, and serviceability characteristics.
- Recommend application data placement on storage tiers per the profiles.
- Recommend tiering and archival approaches based on aging, I/O, access, and usage.
- Recommend a thin provisioning approach.
- Recommend best practices for backup and restore.
- Recommend file system capacity standards, replication systems, and archiving.
- Recommend storage compaction and de-duplication capabilities to reduce the storage footprint.
- Recommend file system folder management.
- Conduct periodic tests to validate the integrity of the data replication solutions, such as a failover test to the replicated system, and validate functionality.
- Update the asset inventory database in the CMDB (the provisioned asset management tool) in case of hardware part replacement, following the approved change management process.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
NTT DATA is looking for a Business Consulting- Functional DB consultant to join the team in Pune, Maharashtra (IN-MH), India. As a part of our inclusive and forward-thinking organization, you will be responsible for various key tasks related to database management in the domain of Capital Markets-Wealth Management. Your primary responsibilities will include gathering and analyzing requirements from stakeholders to translate them into database needs. You will design and implement database schemas, data models, and structures to support business processes effectively. Additionally, you will focus on optimizing database performance, efficiency, and scalability through various techniques. Data migration, integration, security maintenance, troubleshooting, documentation, and collaboration with technical teams will also be crucial aspects of your role. To excel in this position, you must have strong expertise in database concepts, design principles, and various systems like Oracle, SQL Server, and PostgreSQL. Proficiency in SQL, data modeling, business acumen, communication, problem-solving skills, and experience with cloud-based technologies (AWS, Azure, Google Cloud) will be highly beneficial. Project management skills to lead database projects from planning to execution will also be required. NTT DATA is a trusted global innovator of business and technology services, serving Fortune Global 100 companies with a commitment to innovation and long-term success. With a diverse team and a wide range of services including consulting, data and artificial intelligence, industry solutions, and application management, we are dedicated to helping organizations navigate the digital future confidently and sustainably. Join us in our mission to innovate, optimize, and transform for success. Visit us at us.nttdata.com.,
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
The Technical Architect plays a crucial role in supporting the full implementation lifecycle within an evolving ecosystem of clients and partners. As a Technical Architect, you will be responsible for designing and building Salesforce industry-specific solutions for the Telco and/or Media Industries, particularly utilizing Salesforce Communication and/or Media Cloud. You will maintain a comprehensive understanding of the cloud-computing ecosystem and become a product expert in Salesforce Industry Communication Cloud and Media Cloud applications. Collaborating closely with internal teams and clients, you will architect technology solutions to meet client needs, ensuring that software products are leveraged correctly and adhere to best practices. Your responsibilities will include working with Delivery Managers, Functional Solution Architects, and development staff to design technology solutions, collaborating with internal stakeholders to maximize customer value, leading and mentoring development teams, and being directly involved in the low-level design, development, and support of Salesforce-related projects. In addition, you will be responsible for hands-on application configuration and customization, developing proof of concepts, creating detailed design documentation using UML diagrams, ensuring that systems meet business unit expectations, and delivering CRM and workflow solutions using Salesforce/Apex, Visualforce, Lightning, LWC, and J2EE technologies. As a Technical Architect, you will advocate for and implement best practice development methodologies, maintain awareness of Salesforce Industries products, learn new technologies, and contribute to internal initiatives to grow the consulting practice. This role may involve travel to customer locations. To be successful in this role, you should have over 10 years of experience in developing technology solutions, collaborating effectively within development teams, and preferably utilizing Agile development techniques. You should have extensive experience and understanding of the Communication and/or Media Sector, with a proven track record of successful design and implementation of customer projects. Technical skills required for this role include hands-on experience with Salesforce Communication Cloud and/or Media Cloud modules, Salesforce/Apex, Apex Design Patterns, Triggers, Workflow Alerts and Actions, Process Builders, Visualforce, Lightning, LWC, data modeling, and process modeling tools. Additionally, experience with platform security, data architecture and management, architectural design patterns, DevOps, and release management is essential. Desired certifications and qualifications for this role include Salesforce Industries Omnistudio Developer, Salesforce Industries CPQ Developer, Salesforce Industries Media Cloud Accredited Professional, Salesforce Certifications (Admin, PD1, and PD2), Integration Architecture, Identity and Access Management, and Data Architecture and Management. A solid understanding of Communication and/or Media industry regulations and certifications such as Certified Scrum Master or Certified Product Owner are also preferred. A Bachelor's or Master's degree in Computer Science, Software Engineering, Business, or a related field is desirable.,
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
NTT DATA is looking for a Salesforce Developer (Agentforce, Experience Cloud) to join the team in Pune, Maharashtra (IN-MH), India. As a Senior Salesforce Developer, your responsibilities will include leading the design, development, and implementation of AI-driven solutions using Salesforce Einstein features for Sales and Experience Cloud. You will collaborate with architects and AI engineers to design solutions for predictive analytics, customer insights, and automation. Integrating third-party applications and data sources with Salesforce via APIs will be a key aspect of your role.

You will work closely with product managers, data scientists, and developers to ensure successful implementation of AI/ML solutions aligned with business goals. This includes implementing automated workflows and intelligent data-driven decision systems within Sales Cloud and Experience Cloud. You will be responsible for designing and developing scalable solutions in Salesforce Agentforce and Experience Cloud environments, building and customizing Lightning components, Apex classes, Visualforce pages, Flows, and APIs to support agent and customer portals.

Furthermore, you will design and implement seamless integrations with core systems such as policy administration, claims, billing, and document management via REST/SOAP APIs, middleware (e.g., MuleSoft, Boomi), or event-driven architectures. Developing custom components, templates, and branded sites within Experience Cloud to support self-service and agent experiences will also be part of your responsibilities. Gathering requirements from stakeholders and translating them into technical solutions, maintaining and optimizing data models, sharing rules, and user roles for different Experience Cloud personas, and ensuring best practices in coding, testing, and DevOps deployment strategies are also key aspects of the role. Troubleshooting production issues, providing ongoing support for Salesforce-based applications, participating in Agile ceremonies, and maintaining technical documentation are part of the responsibilities as well.

To be successful in this role, you should have 7+ years of overall experience in Salesforce development, with at least 2+ years of hands-on experience in Salesforce AI solutions development. Required qualifications include expertise in Salesforce Einstein AI features; customizing applications using Apex, LWC, and Lightning Flows; data modeling; data integration using REST APIs; handling large datasets within Salesforce; machine learning algorithms; Salesforce architecture; NLP, image recognition, and sentiment analysis; AI governance, model explainability, and data ethics; and leveraging structured and unstructured data for ML and NLP solutions.

Preferred skills include Salesforce certifications in Einstein Analytics and Discovery Consultant, Salesforce Platform Developer II, or Salesforce Certified AI Associate; experience with Salesforce Data Cloud and Salesforce Agentforce; Apex, Lightning Web Components (LWC), SOQL, and Flow Builder; integrating Salesforce with external systems; designing and managing Experience Cloud sites; and Salesforce certifications like Platform Developer I/II, Experience Cloud Consultant, and Integration Architecture Designer.

Join NTT DATA, a trusted global innovator of business and technology services, and be part of a diverse team committed to helping clients innovate, optimize, and transform for long-term success. Visit us at us.nttdata.com.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a Software Engineer at Carelon Global Solutions India, you will play a crucial role in our team as an AWS and Snowflake Developer. Reporting to the Team Lead, your primary responsibility will involve the development and maintenance of our data infrastructure, ensuring its optimal performance, and supporting data analytics initiatives.

Your key responsibilities will include:
- Designing, developing, and maintaining scalable data pipelines and ETL processes using AWS services and Snowflake.
- Implementing data models, data integration, and data migration solutions.
- Working on Scala-to-Snowpark conversion (see the sketch after this listing).
- Experience in cloud migration projects would be advantageous.
- Hands-on experience with AWS services such as Lambda, Step Functions, Glue, and S3 buckets.
- Certification in Python is a plus.
- Knowledge of job metadata and ETL step metadata creation, migration, and execution.
- Expertise in Snowflake.
- Familiarity with Elevance Health OS would be a plus.

Qualifications:
- Full-time IT engineering or equivalent degree, preferably in computers.

Experience:
- Minimum of 2 years as an AWS, Snowflake, and Python developer.

Skills and Competencies:
- Design, develop, and maintain scalable data pipelines and ETL processes using AWS services and Snowflake.
- Manage and optimize Snowflake environments for efficient performance and cost-effectiveness.
- Collaborate with data analysts, data scientists, and stakeholders to deliver solutions meeting business needs.
- Monitor and optimize the performance of data pipelines and Snowflake queries.
- Ensure data security and compliance with regulations and standards.
- Proficiency in SQL, Python, or Scala, data modeling, data integration tools, and ETL processes.
- Experience with version control systems like Git and CI/CD pipelines.

At Carelon, we offer a world of limitless opportunities to our associates, focusing on learning and development, innovation, well-being, rewards, and recognition. Our inclusive culture celebrates diversity and different ways of working. If you require reasonable accommodation due to a disability during the interview process, feel free to request it. Join us as a Software Engineer at Carelon Global Solutions and be part of a team committed to improving lives, simplifying healthcare, and expecting more.
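A minimal Snowpark-for-Python sketch of the kind of transformation a Scala-to-Snowpark conversion might land on. The connection parameters, source table, and target table are all placeholders; in practice credentials would come from a secrets store, not literals:

```python
from snowflake.snowpark import Session
from snowflake.snowpark import functions as F

# Connection parameters are placeholders for illustration only
session = Session.builder.configs({
    "account": "acme-xy12345",
    "user": "etl_user",
    "password": "***",
    "warehouse": "ETL_WH",
    "database": "ANALYTICS",
    "schema": "PUBLIC",
}).create()

claims = session.table("RAW.CLAIMS")  # hypothetical source table

# Truncate each service date to its month, then total the paid amounts
monthly = (claims
           .with_column("SERVICE_MONTH", F.date_trunc("month", F.col("SERVICE_DATE")))
           .group_by("SERVICE_MONTH")
           .agg(F.sum("PAID_AMOUNT").alias("TOTAL_PAID")))

# Persist the aggregate; Snowpark pushes all of this down as SQL in Snowflake
monthly.write.save_as_table("CURATED.CLAIMS_MONTHLY", mode="overwrite")
```

Because Snowpark builds a lazy query plan and executes it inside Snowflake, migrated Scala/Spark logic generally translates transformation by transformation, as here.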
Posted 1 week ago
4.0 - 12.0 years
0 Lacs
karnataka
On-site
This is a full-time hybrid role for an Oracle VBCS Developer based in Bengaluru, with some work-from-home flexibility. As an Oracle VBCS Developer, you will be responsible for developing and maintaining applications using Oracle Visual Builder Cloud Service (VBCS). Your duties will include creating data models, managing databases, producing Oracle Reports, and performing ETL (Extract Transform Load) tasks. Collaboration with cross-functional teams is essential to ensure that software development projects meet business needs. You will work for an MNC client in a permanent role with a hybrid work model across India. The ideal candidate should have 4 to 12 years of experience and be proficient in Oracle Visual Builder Cloud Service (VBCS). You must have a minimum of 4 years of relevant experience in VBCS and Redwood. Proficiency in Visual Builder/Oracle JET, Java Script, REST API, HTML, CSS, SQL is required. Functional knowledge in CRM, SFA, and PRM is necessary. Additionally, you should have expertise in configuring business processes, security mechanisms like job roles, and the usage of different authentication mechanisms such as Bearer Token, Basic Auth, OAuth, and JWT. Experience in OICS (Oracle Integration Cloud) and Redwood will be considered an advantage. You should be capable of reviewing existing configurations and scripts and creating functional/technical documents. Skills in Data Modeling, Databases, and the ability to create and manage Oracle Reports are crucial. Excellent communication and teamwork skills are essential for this role. A Bachelor's degree in Computer Science, Information Technology, or a related field is required. Previous experience in the IT services industry is advantageous.,
Posted 1 week ago
2.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As an experienced IT professional with a passion for data and technology, your role will involve ensuring that data accurately reflects business requirements and targets. Collaborating closely with the Procurement & Logistic department and external providers in an agile environment, you will leverage your deep understanding of technology stack capabilities to facilitate engagements and solve impediments for delivering data use cases to drive business value and contribute to the vision of becoming a data-driven company. You will play a crucial role in the energy transformation at Siemens Energy ABP Procurement team, working alongside a diverse team of innovative and hardworking data enthusiasts and AI professionals. Your impact will be significant, with responsibilities including service operation and end-to-end delivery management, interacting with business users and key collaborators, developing and maintaining data architecture and governance standards, designing optimized data architecture frameworks, providing guidance to developers, ensuring data quality, and collaborating with various functions to translate user requirements into technical specifications. To excel in this role, you should bring 8 to 10 years of IT experience with a focus on ETL tools and platforms, proficiency in Snowflake SQL Scripting, JavaScript, PL/SQL, and data modeling for relational databases. Experience in data warehousing, data migration, building data pipelines, and working with AWS, Azure & GCP data services is essential. Additionally, familiarity with Qlik, Power BI, and a degree in computer science or IT are preferred. Strong English skills, intercultural communication abilities, and a background in international collaboration are also key requirements. Joining the Value Center ERP team at Siemens Energy, you will be part of a dynamic group dedicated to driving digital transformation in manufacturing and contributing to the achievement of Siemens Energy's objectives. This role offers the opportunity to work on innovative projects that have a substantial impact on the business and industry, enabling you to be a part of the energy transition and the future of sustainable energy solutions. Siemens Energy is a global leader in energy technology, with a commitment to sustainability and innovation. With a diverse team of over 100,000 employees worldwide, we are dedicated to meeting the energy demands of the future in a reliable and sustainable manner. By joining Siemens Energy, you will contribute to the development of energy systems that drive the energy transition and shape the future of electricity generation. Diversity and inclusion are at the core of Siemens Energy's values, celebrating uniqueness and creativity across over 130 nationalities. The company provides employees with benefits such as Medical Insurance and Meal Card options, supporting a healthy work-life balance and overall well-being. If you are ready to make a difference in the energy sector and be part of a global team committed to sustainable energy solutions, Siemens Energy offers a rewarding and impactful career opportunity.,
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
coimbatore, tamil nadu
On-site
As a Data Architect, you will play a crucial role in designing, implementing, and maintaining scalable data architecture solutions aligned with business objectives. You will collaborate with cross-functional teams to translate business requirements into data models and architecture, ensuring data quality, integrity, security, and compliance across all systems. Your expertise in data modeling, ETL processes, and big data solutions will be essential in integrating disparate data sources for seamless data flow and access. Your responsibilities will include developing and maintaining data pipelines, creating complex data visualizations, and recommending tools to enhance the overall data strategy. You must possess a Bachelor's degree in Computer Science or a related field, along with relevant IT certifications in data management or software architecture. With a minimum of 5 years of hands-on experience in software design and data architecture projects, you should have a background in roles such as Database Administrator, Data Analyst, or Data Engineer. Proficiency in data modeling techniques, ETL tools, programming languages like Python or Java, and business intelligence tools such as Tableau and Power BI is required. Familiarity with big data technologies, Agile methodologies, and data security principles is essential. Your strong analytical and problem-solving abilities, along with excellent communication and stakeholder management skills, will be key to your success in this role. You should be ready to work onsite in Bengaluru for an initial 3-month engagement, with the possibility of a 6-month contract extension. Your ability to work in a day shift from Monday to Friday, along with morning shifts, will be crucial. If you possess the required qualifications, experience, and skills mentioned above and are eager to take on this challenging role, we look forward to receiving your application.,
Posted 1 week ago