10.0 - 15.0 years
0 Lacs
karnataka
On-site
Job Description: With over 24 years of experience in building innovative digital products and solutions, Tavant is a frontrunner in driving digital innovation and tech-enabled transformation across various industries in North America, Europe, and Asia-Pacific. Powered by Artificial Intelligence and Machine Learning algorithms, Tavant helps customers improve operational efficiency, productivity, speed, and accuracy. The challenging workplace at Tavant fosters diverse and competitive teams that are constantly seeking tomorrow's technology and bright minds to create it. The company values not only what is done but also how it is done, encouraging individuals to bring their talent and ambition to make a difference.

Tavant Technologies is currently looking for a highly skilled Data Quality Technical Architect with expertise in Informatica's Intelligent Data Management Cloud (IDMC). The ideal candidate will be responsible for designing, implementing, and maintaining data quality solutions to ensure the integrity, accuracy, and reliability of data across various systems and platforms. As a Data Quality Technical Architect, you will collaborate closely with data engineers, data analysts, and business stakeholders to define data quality standards and best practices, leading initiatives that enhance data management processes.

Key Responsibilities:
- Design and implement data quality frameworks and solutions using Informatica IDMC.
- Develop data quality rules and metrics to assess the integrity and reliability of data.
- Collaborate with cross-functional teams to identify and address data quality issues.
- Create and maintain documentation for data quality processes and standards.
- Provide guidance and mentorship to junior data quality engineers and architects.
- Monitor data quality performance and report findings to stakeholders.
- Stay up-to-date with industry trends and advancements in data quality technologies and practices.
Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience in data quality management and data governance.
- Strong expertise in Informatica IDMC and its data quality capabilities.
- Proficiency in SQL and experience with relational databases.
- Familiarity with data integration tools and ETL processes.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills to collaborate with various teams.
- Experience with data profiling, cleansing, and validation techniques.

Tools Required:
- Informatica Intelligent Data Management Cloud (IDMC)
- SQL and database management systems (e.g., Oracle, SQL Server, MySQL)
- Data profiling tools
- Data visualization tools (e.g., Tableau, Power BI)
- ETL tools and processes
- Data governance frameworks and tools

If you are passionate about ensuring data quality and possess the technical expertise to drive data quality initiatives, we welcome you to apply for the Data Quality Technical Architect position at Tavant Technologies.
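The data-quality rules and metrics this role describes are configured inside Informatica IDMC's own tooling, but the underlying idea is easy to sketch. The following plain-Python illustration is only a stand-in for IDMC, and every name and record in it is invented for the example; it computes two common metrics, completeness and validity:

```python
# Illustrative only: real rules would live in Informatica IDMC, not Python.

def completeness(records, field):
    """Fraction of records with a non-empty value for `field`."""
    if not records:
        return 1.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def validity(records, field, predicate):
    """Fraction of *filled* values that satisfy `predicate`."""
    values = [r[field] for r in records if r.get(field) not in (None, "")]
    if not values:
        return 1.0
    return sum(1 for v in values if predicate(v)) / len(values)

# Invented sample data: one missing email, one malformed one.
customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "not-an-email"},
]

email_completeness = completeness(customers, "email")              # 2 of 3 filled
email_validity = validity(customers, "email", lambda v: "@" in v)  # 1 of 2 valid
```

A real rule set would compare such metrics against thresholds and route failing records to remediation, which is the part IDMC handles declaratively.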
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
Are you ready to experiment, learn, and implement? Then this is your place to be. Join us on a new adventure where your expertise in SQL Server, SSIS, ETL processes, and data integration can change the dynamics of our organization and revolutionize the paradigm. We are waiting for you because we stand for selection, not rejection.

OptiSol is your answer to a stress-free, balanced lifestyle. We are your home away from home, where a career is nurtured at both ends. Certified as a GREAT PLACE TO WORK for 4 consecutive years, we are known for our culture. We believe in open communication and accessible leadership. We celebrate diversity and promote work-life balance with flexible policies where you can thrive personally and professionally.

You are solid with SQL Server and ETL-SSIS, making data moves easy. You are experienced in data migration, ensuring everything transitions smoothly. You are great at handling clients and leading teams, with strong communication skills to keep everyone on the same page.

What do we expect?
- SQL Server knowledge is a must-have; it's the foundation.
- Oracle databases experience is a nice bonus but not a dealbreaker.
- SSIS is a must for handling ETL processes smoothly.
- Familiar with Power BI? Great, but not a requirement.
- Have some experience with data analysis, profiling, and quality assurance? Perfect.
- Strong collaboration skills to work well with cross-functional teams and stakeholders.

What You'll Bring to the Table:
- Analyze and profile data from both source and target systems to spot any gaps, inconsistencies, or areas needing transformation.
- Design and carry out strategies for data mapping, cleansing, transformation, and validation.
- Implement and keep track of ETL processes for data migration using SSIS and other tools.
- Dive deep into data systems from acquired companies, identifying overlaps and areas for improvement during integration.
- Work with tech teams to move, clean, and integrate data into SQL Server, streamlining everything.
- Develop and apply strategies to validate data, ensuring its accuracy and completeness post-migration.
- Set up and monitor data quality KPIs, making sure everything stays on track during and after migration.
- Communicate findings and progress clearly to stakeholders, creating reports using tools like Power BI when needed.

Core benefits you'll gain:
- Hands-on opportunity to work with ETL tools and data migration processes, boosting your technical expertise.
- Gain valuable experience in system integration and optimizing data management across different platforms.
- Improve your collaboration skills by working closely with business teams and technical experts to meet project goals.
- Strengthen your problem-solving and data analysis skills, ensuring smooth migrations and high data quality.

Apply and take your seat with us as an OptiSolite!
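Post-migration validation of the kind described above usually reduces to reconciling row counts and checksums between source and target. A minimal sketch, using sqlite3 in place of the SQL Server databases the role actually works with (the table and column names are invented for the example):

```python
# Hedged sketch: reconcile a migrated table by count plus an
# order-independent checksum. sqlite3 stands in for SQL Server here.
import sqlite3

def table_fingerprint(conn, table):
    """Return (row count, order-independent checksum) for a table."""
    rows = conn.execute(f"SELECT * FROM {table}").fetchall()  # sketch only; don't interpolate untrusted names
    checksum = 0
    for row in rows:
        checksum ^= hash(row)  # XOR makes the checksum independent of row order
    return len(rows), checksum

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE orders (id INTEGER, amount REAL)")

# Same data, deliberately loaded in a different order on each side.
source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.5)])
target.executemany("INSERT INTO orders VALUES (?, ?)", [(2, 20.5), (1, 10.0)])

src_fp = table_fingerprint(source, "orders")
tgt_fp = table_fingerprint(target, "orders")
migration_ok = src_fp == tgt_fp  # counts and checksums agree despite row order
```

A production KPI dashboard would track these fingerprints per table over the migration window rather than as a one-off boolean.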
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
As a SQL Database Developer, your primary responsibility will be to develop, optimize, and maintain MSSQL databases to support business operations. You will design, build, and manage cloud database solutions to ensure efficient data storage and retrieval. Data migration between on-premise and cloud databases will be a crucial aspect of your role, where you will be required to ensure data integrity and security throughout the process. Your expertise in developing and optimizing complex SQL queries, stored procedures, functions, and triggers will be essential for data analysis and reporting purposes. Additionally, you will work on implementing reporting automation solutions using BI tools, SQL, and scripting techniques to enhance data-driven decision-making within the organization. Collaborating with business stakeholders to understand data requirements and provide actionable insights will be a key part of your daily tasks.

With over 10 years of experience in SQL development, database management, and data analysis, you will be expected to demonstrate proficiency in MSSQL Server development, including query optimization and performance tuning. Your hands-on experience with cloud databases such as AWS RDS, Azure SQL, or Google Cloud SQL will be highly valuable. A strong understanding of data migration strategies, automated reporting solutions, ETL processes, and data warehousing concepts is essential for success in this role. Your ability to analyze large datasets, attention to detail, and problem-solving skills will contribute to ensuring data quality, governance, and security through the implementation of best practices. Effective communication skills and the capacity to work independently under strict deadlines while managing multiple tasks will be key attributes that you will bring to the team.
If you are passionate about working with structured and unstructured business data efficiently and providing meaningful insights, this position offers a dynamic opportunity to utilize your skills and experience effectively.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
kochi, kerala
On-site
As a skilled professional in the field of technical program management and data analytics, you will play a crucial role in managing large-scale technical programs encompassing data warehousing, ETL processes, BI reporting (Power BI/MSBI), and cloud migration projects. Your responsibilities will involve defining, recommending, and enforcing technical frameworks and architectures that align with the goals of customers and the organization. You will guide solution teams in design, development, and deployment best practices to ensure scalability, security, and performance of the solutions.

In this role, you will oversee requirement gathering, validate technical specifications, and customize solutions to meet the specific business needs. Integration efforts across multiple technology stacks and platforms will be under your purview to ensure interoperability and data consistency. Implementing robust governance, risk management processes such as RAID logs, and quality assurance practices for technical delivery will be essential in ensuring project success. Additionally, you will be responsible for managing technical resource planning, estimation, and utilization in alignment with project objectives and financial targets. Collaboration with technology officers and stakeholders from client organizations will be necessary to align on solution roadmaps and adoption strategies. Your role will also involve driving continuous innovation through the adoption of emerging tools, automation, and cloud-native technologies. Monitoring project health through key performance indicators such as delivery timelines, defect rates, system performance, and customer satisfaction will be a critical aspect of your responsibilities.

Your technical expertise should include a strong understanding of Data Warehousing concepts, ETL processes, and analytics platforms. Hands-on experience with Power BI, MSBI (SSIS, SSAS, SSRS), and modern BI toolchains will be required.
Proficiency in cloud platforms such as Azure, AWS, or GCP, along with knowledge of cloud migration strategies, is essential. You should be able to design scalable data architectures and data lakes, and be familiar with Agile, DevOps, CI/CD pipelines, and automation frameworks. An understanding of API integrations, microservices, and modern software engineering best practices is crucial. You should also be capable of evaluating and integrating third-party tools and accelerators to optimize solutions effectively. Your role will be instrumental in driving technical excellence, ensuring project success, and fostering innovation in the dynamic field of data analytics and technical program management.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
Join a dynamic leader in the cloud data engineering sector, specializing in advanced data solutions and real-time analytics for enterprise clients. This role offers an on-site opportunity in India to work on cutting-edge AWS infrastructures where innovation is at the forefront of business transformation. The ideal candidate for this role is a professional with 4+ years of proven experience in AWS data engineering, Python, and PySpark. You will play a crucial role in designing, optimizing, and maintaining scalable data pipelines that drive business intelligence and operational efficiency.

As part of your responsibilities, you will design, develop, and maintain robust AWS-based data pipelines using Python and PySpark. You will implement efficient ETL processes, ensuring data integrity and optimal performance across AWS services such as S3, Glue, EMR, and Redshift. Collaboration with cross-functional teams to integrate data engineering solutions within broader business-critical applications will be a key aspect of your role. Additionally, you will troubleshoot and optimize existing data workflows to ensure high availability, scalability, and security of cloud solutions. It is essential to exercise best practices in coding, version control, and documentation to maintain a high standard of engineering excellence.

The required skills and qualifications for this role include 4+ years of hands-on experience in AWS data engineering with expertise in Python and PySpark. Proficiency in developing and maintaining ETL processes using AWS services like S3, Glue, EMR, and Redshift is a must. Strong problem-solving skills, deep understanding of data modeling, data warehousing concepts, and performance optimization are essential.
Preferred qualifications include experience with AWS Lambda, Airflow, or similar cloud orchestration tools; familiarity with containerization, CI/CD pipelines, and infrastructure-as-code tools such as CloudFormation and Terraform; and AWS certifications or equivalent cloud credentials.

In this role, you will work in a collaborative, fast-paced environment that rewards innovation and continuous improvement. You will have opportunities for professional growth and skill development through ongoing projects and training. Additionally, you will benefit from competitive compensation and the ability to work on transformative cloud technology solutions.
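The pipelines described here run on Glue/EMR with PySpark, but the transform logic itself is easy to illustrate. This sketch deliberately uses plain Python instead of PySpark so it is self-contained; the record shapes and field names are invented, and it shows the usual cast-validate-aggregate step with bad records dropped rather than failing the whole job:

```python
# Plain-Python stand-in for a Glue/PySpark transform (names invented).
raw_events = [
    {"user": "a", "amount": "10.5", "country": "IN"},
    {"user": "b", "amount": "oops", "country": "IN"},  # unparseable record
    {"user": "a", "amount": "4.5",  "country": "IN"},
]

def transform(records):
    """Cast amounts to float, drop unparseable rows, sum spend per user."""
    totals = {}
    for r in records:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter store
        totals[r["user"]] = totals.get(r["user"], 0.0) + amount
    return totals

spend = transform(raw_events)
```

In PySpark the same logic would be a cast with error handling followed by a `groupBy` aggregation, distributed across the cluster instead of a single loop.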
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Kinaxis RapidResponse Consultant in the enterprise supply-chain software and consulting sector, you will be responsible for leading end-to-end RapidResponse implementations on-site. This includes tasks such as requirements gathering, solution design, configuration, testing, cutover, and hypercare for S&OP/IBP, demand & supply planning use-cases. Your role will involve designing and building RapidResponse models, rulesets, schedules, and scenario-based what-if engines, as well as translating business processes into robust data and planning models.

You will also be expected to develop and validate data integration pipelines, which includes ETL/data mapping, SQL transforms, flat-file and API-based integrations (REST/SOAP), ensuring end-to-end data quality and lineage. Additionally, creating analytics, dashboards, and KPIs inside RapidResponse, implementing alerts, exception management, and executive reporting to facilitate faster decision-making will be part of your responsibilities.

Furthermore, you will drive testing and deployment activities, including unit, system, and UAT testing, deliver training and user documentation, and provide on-site go-live support and hypercare. Collaboration with cross-functional client teams, mentoring junior consultants, and recommending continuous improvements and best practices are vital aspects of this role.

To be successful in this position, you must have at least 4+ years of hands-on Kinaxis RapidResponse implementation experience, a strong understanding of supply-chain domains, proficiency in SQL, data mapping, and ETL processes, client-facing consulting experience, and a bachelor's degree in Engineering, Supply Chain, Computer Science, Business, or a related field. Additionally, preferred qualifications include Kinaxis RapidResponse certification or formal training, familiarity with Kinaxis adapters/MDS, JavaScript within RapidResponse, and Python-based automation for data processing.
In return, you can expect competitive compensation with opportunities for certification, professional development, and rapid career progression within supply-chain consulting. This role offers high-impact, client-facing responsibilities in a fast-paced transformation practice, providing exposure to large-scale global projects and senior stakeholders. You will work in a collaborative, mentor-driven environment that values practical problem-solving and continuous improvement.

If you are an experienced RapidResponse implementation professional with availability for on-site assignments in India, please submit your CV highlighting Kinaxis RapidResponse projects, supply-chain domain experience, and availability for consideration. Shortlisted candidates will be contacted for an initial technical and behavioral interview.
Posted 1 month ago
12.0 - 16.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Data Analytics Lead, you will be responsible for overseeing the design, development, and implementation of data analysis solutions to meet business needs. Working closely with business stakeholders and the Aviation Subject Matter Expert (SME), you will define data requirements, project scope, and deliverables. Your role will involve driving the design and development of analytics data models and data warehouse designs, as well as developing and maintaining data quality standards and procedures.

In this position, you will manage and prioritize data analysis projects to ensure timely completion. You will also be expected to identify opportunities to improve data analysis processes and tools, collaborating with Data Engineers and Data Architects to ensure that data solutions align with the overall data platform architecture. Additionally, you will evaluate and recommend new data analysis tools and technologies, contributing to the development of best practices for data analysis. Participating in project meetings, you will provide input on data-related issues, risks, and requirements.

The ideal candidate for this role should have at least 12 years of experience as a Data Analytics Lead, with a proven track record of leading or mentoring a team. Extensive experience with cloud-based data modeling and data warehousing solutions, particularly using Azure Databricks, is required. Proficiency in data visualization tools such as Power BI is also essential. Furthermore, the role calls for experience in data analysis, statistical modeling, and machine learning techniques. Proficiency in analytical tools like Python, R, and libraries such as Pandas and NumPy for data analysis and modeling is expected. Strong expertise in Power BI for data visualization, data modeling, and DAX queries, along with experience in implementing Row-Level Security in Power BI, is highly valued.
The successful candidate should also demonstrate proficiency in SQL Server and query optimization, expertise in application data design and process management, and extensive knowledge of data modeling. Hands-on experience with Azure Data Factory, Azure Databricks, SSIS (SQL Server Integration Services), and SSAS (SQL Server Analysis Services) is required. Additionally, familiarity with big data technologies such as Hadoop, Spark, and Kafka for large-scale data processing, as well as an understanding of data governance, compliance, and security measures within Azure environments, will be advantageous.

Overall, this role offers the opportunity to work on medium-complex data models, understand application data design and processes quickly, and contribute to the optimization of data analysis processes within the organization.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
gandhinagar, gujarat
On-site
The Senior/Lead Database Developer will play a critical role in the design, development, and maintenance of database systems. With over 8 years of experience in SQL Server, ETL processes, API development, and a solid understanding of MongoDB and AWS, the ideal candidate will be detail-oriented and passionate about database technologies.

Your responsibilities will include designing, developing, and maintaining robust SQL Server databases to support various applications. You will also be tasked with implementing ETL processes for efficient data integration, developing APIs for seamless data exchange, and optimizing database performance to ensure data integrity and security. Collaboration with cross-functional teams to deliver database solutions that align with business requirements is essential. Technical expertise in MongoDB and contribution to NoSQL database solutions, as well as utilizing AWS services for database infrastructure and cloud-based solutions, are key aspects of the role. Additionally, you will be responsible for troubleshooting, debugging, and optimizing databases, and staying updated on industry trends and best practices in database development.

Requirements for this role include a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 8+ years of hands-on experience in SQL Server, ETL tools, API development, MongoDB, and AWS services. Strong problem-solving skills, attention to detail, excellent communication, and teamwork abilities are also expected. The ability to work in a fast-paced environment and manage multiple priorities, plus experience in data warehousing, cloud platforms, Agile methodologies, and managing remote teams, are preferred skills.

This position is located in Gandhinagar with the following shift timings: 5:30 PM to 2:30 AM or 6:30 PM to 3:30 AM. If you have experience in Contact center domain projects, it would be a valuable asset to bring to the team at Etech Insights.
Posted 1 month ago
0.0 years
2 - 2 Lacs
in
On-site
About the job: We are looking for a talented Data Engineer to join our team at Aaptatt! You will have the opportunity to work on cutting-edge technologies like Python, SQL, ETL processes, Snowflake, and Databricks to build and maintain our data infrastructure.

Key responsibilities:
1. Knowledge in data engineering and support activities.
2. Strong expertise in SQL (advanced queries, optimization, indexing).
3. Proficiency in Python and shell scripting for automation.
4. Knowledge of data warehousing concepts (star schema, partitioning, etc.).
5. Good understanding of monitoring tools.
6. Good to have: technical knowledge of ETL tools (Informatica) and cloud data platforms (Snowflake, Databricks).

If you are a passionate Data Engineer with a strong foundation in Python, SQL, ETL processes, Snowflake, and Databricks, we would love to hear from you! Join us at Aaptatt and be part of a dynamic team that is shaping the future of data-driven decision-making. Apply now and take your career to the next level!

Who can apply: Only those candidates can apply who are Computer Science Engineering students.

Salary: ₹ 2,00,100 - 2,16,000 /year
Experience: 0 year(s)
Deadline: 2025-09-12 23:59:59
Other perks: 5 days a week, Health Insurance
Skills required: Python, SQL, ETL processes, Snowflake and Databricks

About Company: Aaptatt is a universal company that has provided services in the cloud and DevOps areas for the last 4 years. Our team works in a unique manner that not only delivers projects on time but also ensures the highest standards of quality.
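The "advanced queries, optimization, indexing" expectation above can be shown concretely: the plan for the same query changes from a full table scan to an index search once a suitable index exists. A small sqlite3 illustration (the schema and data are invented for the example):

```python
# Demonstrate the effect of an index on a query plan using sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i, i % 100, "2024-01-01") for i in range(1000)],
)

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"

# Before the index: SQLite must scan every row.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
# After the index: SQLite searches the index instead.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# The last column of each plan row is the human-readable detail string.
before_text = " ".join(row[-1] for row in plan_before)
after_text = " ".join(row[-1] for row in plan_after)
```

The same principle carries over to Snowflake and Databricks, where clustering keys and partitioning (the "star schema, partitioning" concepts listed above) play the role the index plays here.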
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior SQL & Reporting Automation Specialist at our client, a US-based financial services organization, you will be responsible for leading data architecture, reporting automation, and cloud migration initiatives. Your passion for building scalable data systems and driving insights through automation will be key in this role. This full-time hybrid position based in India offers the flexibility of some work from home.

Your primary tasks will involve designing, developing, and maintaining SQL databases, as well as creating and automating reporting solutions to ensure data integrity and accuracy. You will be writing complex SQL queries, optimizing database performance, developing ETL processes, and generating automated reports. Collaboration with cross-functional teams to address business reporting needs and conducting data analysis to support decision-making will also be part of your responsibilities.

To excel in this role, you should possess proficiency in SQL database design, development, and maintenance, along with expertise in writing complex SQL queries, optimizing database performance, and developing and automating reporting solutions. Knowledge of ETL processes, data analysis techniques, and experience with reporting tools like Power BI, Tableau, or similar are essential. Strong analytical and problem-solving skills, excellent communication, and collaboration abilities are also required.

If you have a Bachelor's degree in Computer Science, Information Technology, or a related field and are comfortable working in a hybrid environment that combines on-site and remote work, this opportunity could be your next big career move.
Posted 1 month ago
0.0 - 4.0 years
0 Lacs
karnataka
On-site
Unacademy aims to build the world's largest online knowledge repository for multi-lingual education. We use technology to empower great educators and create a community of self-learners. Our vision is to partner with the brightest minds and democratise education for everyone looking to learn. Join us in our journey to change the future of education.

Key Responsibilities:
- Collect, clean, and organize data from various internal and external sources.
- Perform exploratory data analysis to identify trends, patterns, and anomalies.
- Create and maintain reports, dashboards, and visualizations using tools like Excel, Power BI, or Tableau.
- Assist in data quality checks and validation to ensure accuracy and completeness.
- Work with stakeholders to understand business requirements and provide data-driven solutions.
- Document processes, findings, and recommendations in a clear and structured manner.
- Support the data team in ad-hoc analysis and ongoing projects.

Required Skills & Qualifications:
- Currently pursuing or recently completed a degree in Data Science, Statistics, Computer Science, Mathematics, Economics, or a related field.
- Strong knowledge of Excel and basic understanding of SQL.
- Familiarity with data visualization tools (e.g., Power BI, Tableau, Google Data Studio).
- Basic understanding of statistical methods and data analysis concepts.
- Good problem-solving skills with attention to detail.
- Strong communication skills to present findings effectively.

Preferred Skills (Good to Have):
- Experience with Python or R for data analysis.
- Knowledge of ETL processes and database management.
- Understanding of business KPIs and metrics.
Posted 1 month ago
9.0 - 13.0 years
0 Lacs
maharashtra
On-site
As a Data Analyst & Visualization Specialist at Citi, you will be responsible for designing and building high-performance analytical dashboards to be used by senior stakeholders. You will collaborate with business and IT teams to create impactful analytical solutions using a variety of data sources. Your role will involve enhancing data quality processes to ensure accuracy and completeness of reports and dashboards. Additionally, you will lead reengineering efforts and provide sophisticated analysis to define problems and develop innovative solutions.

Your qualifications for this role include 9+ years of experience in the Banking or Finance industry, with a broad understanding of the business technology landscape. You should possess strong analytical skills and be proficient in SQL, Python, Tableau, or Power BI. Experience with ETL processes, data formats, and troubleshooting technical issues is also required. Furthermore, you should have a good foundation in key Data Science concepts and various libraries.

As part of the Data Governance Job Family at Citi, you will play a vital role in managing and implementing successful projects. Effective communication, collaboration, and adaptability are key traits for success in this role. A Bachelor's/University degree or equivalent experience is required for this position. If you are looking for a challenging opportunity to work with a dynamic team and contribute towards achieving business objectives, this role at Citi might be the right fit for you.
Posted 1 month ago
8.0 - 13.0 years
0 Lacs
karnataka
On-site
You will be joining MRI Software as a Data Engineering Leader responsible for designing, building, and managing data integration solutions. Your expertise in Azure Data Factory and Azure Synapse Analytics, as well as data warehousing, will be crucial for leading technical implementations, mentoring junior developers, collaborating with global teams, and engaging with customers and stakeholders to ensure seamless and scalable data integration.

Your key responsibilities will include leading and mentoring a team of data engineers, designing and implementing Azure Synapse Analytics solutions, optimizing ETL pipelines and Synapse Spark workloads, and ensuring data quality, security, and governance best practices. You will also collaborate with business stakeholders to develop data-driven solutions.

To excel in this role, you should have 8-13 years of experience in Data Engineering, BI, or Cloud Analytics, with expertise in Azure Synapse, Data Factory, SQL, and ETL processes. Strong leadership, problem-solving, and stakeholder management skills are essential, and knowledge of Power BI, Python, or Spark would be advantageous. Deep knowledge of data modelling techniques, ETL pipeline development, Azure resource cost management, and data governance practices will also be key to your success. Additionally, you should be proficient in writing complex SQL queries, implementing best security practices for Azure components, and have experience in Master Data and metadata management. Your ability to manage a complex business environment, lead and support team members, and advocate for Agile practices will be highly valued. Experience in change management, data warehouse architecture, dimensional modelling, and data integrity validation will further strengthen your profile.
Collaboration with Product Owners and data engineers to translate business requirements into effective dimensional models, strong SQL skills, and the ability to extract, clean, and transform raw data for dimensional modelling are essential aspects of this role. Desired skills include Python, real-time data streaming frameworks, and AI and Machine Learning data pipelines. A degree in Computer Science, Software Engineering, or a related field is required for this position.

In return, you can look forward to learning leading technical and industry standards, hybrid working arrangements, an annual performance-related bonus, and other benefits that foster an engaging, fun, and inclusive culture at MRI Software.

MRI Software is a global Proptech leader dedicated to empowering real estate companies with innovative applications and hosted solutions. With a flexible technology platform and a connected ecosystem, MRI Software caters to the unique needs of real estate businesses worldwide. Operating across multiple regions, MRI Software boasts nearly five decades of expertise, a diverse team of over 4000 professionals, and a commitment to Equal Employment Opportunity.
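The dimensional-modelling work mentioned above — extracting, cleaning, and transforming raw data into dimension and fact tables — can be sketched minimally. In this illustration (the table shapes and records are invented, not from any MRI Software system), surrogate keys are assigned to a product dimension and the facts are rewritten to reference them:

```python
# Minimal sketch: split denormalized rows into a dimension and a fact table.
raw_sales = [
    {"product": "Widget", "category": "Tools", "qty": 3},
    {"product": "Gadget", "category": "Toys",  "qty": 1},
    {"product": "Widget", "category": "Tools", "qty": 2},
]

dim_product = {}   # (product, category) natural key -> surrogate key
fact_sales = []
for row in raw_sales:
    key = (row["product"], row["category"])
    sk = dim_product.setdefault(key, len(dim_product) + 1)
    # Fact rows carry the surrogate key instead of repeating attributes.
    fact_sales.append({"product_sk": sk, "qty": row["qty"]})
```

This is the same shape a star schema takes in a warehouse: attributes live once in the dimension, measures live in the fact, and joins happen on the surrogate key.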
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
ahmedabad, gujarat
On-site
We are looking for a skilled and strategic BI Solutions Manager to take charge of our Business Intelligence and Automation team. Your primary focus will be on leveraging your technical expertise and leadership skills to drive insights, streamline processes, and shape a data-driven future for the organization.

To excel in this role, you should possess a Bachelor's or Master's degree in Computer Science, Data Analytics, Information Systems, or a related field, along with at least 6-10 years of experience in BI/Analytics, including 2+ years in a leadership or managerial role. Your proficiency in tools like Power BI, Power Apps, Microsoft Power Platform, and Python for automation and data engineering tasks will be crucial. Additionally, a solid understanding of Azure cloud services, SQL, data warehousing, ETL processes, and data modeling is required.

Your responsibilities will include leading, mentoring, and growing a team of BI developers and automation specialists, overseeing the development and maintenance of dashboards and reports, guiding the development of Power Apps and custom business solutions, and managing and optimizing Azure-based data solutions. You will collaborate with stakeholders to identify automation and analytics opportunities, evaluate and implement BI tools, and establish a roadmap to scale BI and automation solutions across the organization. Moreover, you will ensure data governance, security, and compliance best practices, mentor team members for continuous learning and innovation, and stay updated with the latest BI trends and technologies to keep the solutions cutting-edge. Tracking KPIs to measure BI adoption, performance, and ROI will also be part of your role.

If you have the required qualifications and experience, please send your updated resume to sagar.raisinghani@advantmed.com along with details of your Total Experience, Relevant Experience, Current Designation, Current CTC, Expected CTC, Notice Period, and Current Location.
About Advantmed: Advantmed LLC is a healthcare information management company based in California, founded in 2005. We specialize in providing healthcare organizations with innovative risk adjustment and quality improvement solutions that drive better financial and clinical outcomes.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Database Administrator (DBA) at Domo, you will play a crucial role in managing critical components of our SaaS platform in India. Leveraging the power of AWS and other Cloud Data Warehouse providers, you will be responsible for designing and implementing efficient data models, maintaining data pipelines, and optimizing data storage and indexing strategies to enhance query performance. Your expertise in Data Warehouses, cloud technologies, and performance optimization techniques will be essential in collaborating with cross-functional teams to deliver innovative data solutions. Key Responsibilities: - Data Modeling and Engineering: Design and implement efficient data models, create data pipelines, and optimize data storage and indexing strategies. - Cloud Integration: Integrate Cloud Data Warehouses with AWS and Azure services, configure and manage cloud resources, and implement automated deployment processes. - Performance Optimization: Analyze query performance, tune queries, implement caching strategies, and monitor resource utilization. - Security and Compliance: Implement security measures, monitor security policies, conduct audits, and ensure compliance with industry standards. - Backup and Recovery: Implement backup and recovery strategies, test recovery procedures, and implement disaster recovery plans. - Troubleshooting and Support: Diagnose and resolve database issues, provide technical support, and collaborate with other teams for smooth operations. Job Requirements: - Bachelor's degree in computer science or related field. - 5+ years of experience as a Database Administrator. - Proficiency in SQL, Python, data modeling, and ETL processes. - Knowledge of cloud platforms (AWS and Azure) and security best practices. - Strong analytical, problem-solving, communication, and interpersonal skills. - Experience with infrastructure as code (IaC) tools. - Availability for on-call support and out-of-band requests. 
Location: Pune, Maharashtra, India
Benefits & Perks:
- Medical insurance, maternity and paternity leave policies.
- Baby bucks, Haute Mama, annual leave, holidays, sick leaves, Sodexo Meal Pass.
- Health and Wellness Benefit, One-time Technology Benefit.
- Corporate National Pension Scheme, Employee Assistance Programme.
- Marriage leaves, Bereavement leaves.
Domo is an equal opportunity employer.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
maharashtra
On-site
As a Senior AWS Data Engineer with 8+ years of experience, you will be responsible for designing and implementing robust, scalable, and efficient data pipelines and architectures on AWS. Your deep understanding of data engineering principles and extensive experience with AWS services will be instrumental in developing data models and schemas that support business intelligence and analytics requirements.
You will use AWS services such as S3, Redshift, EMR, Glue, Lambda, and Kinesis to build and optimize data solutions, implementing data security and compliance measures with AWS IAM, KMS, and other security services. Your role will also involve designing and developing ETL processes to ingest, transform, and load data from various sources into data warehouses and lakes. Ensuring data quality and integrity through validation, cleansing, and transformation will be a key part of your responsibilities. You will optimize data storage and retrieval performance through indexing, partitioning, and other techniques, and monitor and troubleshoot data pipelines to ensure high availability and reliability.
Collaborating with cross-functional teams, including data scientists, analysts, and business stakeholders, you will work to understand data requirements and deliver solutions. Providing technical leadership and mentorship to junior data engineers and team members, and identifying opportunities to automate and streamline data processes for increased efficiency, are also essential parts of this role. Participation in on-call rotations to support critical systems and services is expected.
Required Qualifications:
- Experience in software development and data engineering, with demonstrable hands-on experience in Python and PySpark.
- Proven experience with cloud platforms such as AWS, Azure, or Google Cloud, along with a good understanding of data modeling, data architecture, ETL processes, and data warehousing concepts.
- Experience with cloud-native ETL platforms such as Snowflake and/or Databricks, and with big data technologies and services such as AWS EMR, Redshift, Lambda, and S3.
- Proven experience with Cloud DevOps practices and CI/CD tools such as Jenkins or GitLab for data engineering platforms.
- Good knowledge of SQL and NoSQL databases, including performance tuning and optimization.
- Experience with declarative infrastructure-provisioning tools such as Terraform, Ansible, or CloudFormation.
- Strong analytical skills to troubleshoot issues and optimize data processes, working both independently and collaboratively.
Preferred Qualifications:
- Knowledge of the machine learning model lifecycle, language models, and cloud-native MLOps pipelines and frameworks.
- Familiarity with data visualization tools and data integration patterns.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You are invited to join our dynamic Technology Service Delivery team in Hyderabad as a Quality Assurance Analyst II, Support Ops. In this role, you will play a crucial part in ensuring the accuracy, integrity, and quality of the work carried out by our Data Analysts, Data Engineers, and Client Platform Engineers. Your strong understanding of data processes, analytics, and quality assurance methodologies will be instrumental in validating data-related tasks and outputs within a legal services environment. Please note that this role requires you to be available from 6 PM IST to 2 AM IST.
Your day-to-day responsibilities will include:
- Developing advanced Python scripts and conducting comprehensive quality assurance on data updates, inserts, and modifications.
- Validating and QA-ing complex SQL and Python scripts.
- Reviewing and ensuring the quality of data migrations, custom reports, and self-service reporting systems.
- Verifying the accuracy and effectiveness of data visualization tools, techniques, metrics, and dashboards.
- Conducting thorough QA on ad-hoc data requests, test scenarios, and data analysis in a production environment.
- Validating web form builds and electronic data collection templates.
- Reviewing and verifying daily tickets submitted by business units for various data-related tasks.
- Conducting audits to ensure compliance with data quality standards, metadata management, and data lineage practices.
- Collaborating with data analysts and engineers to resolve quality issues and improve processes.
- Contributing to the establishment and implementation of process improvements, documenting QA processes, and maintaining up-to-date quality assurance documentation.
- Providing feedback on data quality management principles, including metadata, lineage, and business definitions.
- Assisting in validating data cleansing and manipulation processes to ensure data is in the correct format.
To be successful in this role, you should have:
- A Bachelor's degree in Computer Science, Information Systems, or a related field.
- Strong proficiency in Python and SQL.
- At least 5 years of experience in quality assurance, preferably in a data-intensive legal services environment.
- A strong understanding of data processes, analytics, and database concepts.
- Experience with data validation tools and techniques.
- Familiarity with data visualization tools, especially Power BI.
- Knowledge of big data technologies such as Spark and Databricks.
- An understanding of cloud computing platforms (Azure, AWS).
- Experience with version control systems (e.g., Git).
- Strong problem-solving and critical-thinking abilities.
- Excellent written and verbal communication skills, and the ability to work independently and as part of a team.
Required Skills:
- Advanced data quality assurance techniques.
- Experience validating ETL processes and data migrations.
- Ability to review and QA complex SQL queries, including query optimization.
- Quality control processes for both data and web applications.
- Web form validation and database mapping comprehension.
- Technical documentation and process improvement.
- Exceptional attention to detail and accuracy.
- Time management and prioritization in a fast-paced environment.
- Advanced Excel skills for data validation.
Preferred Skills:
- Experience in the legal services industry or a similar professional services environment.
- Familiarity with Power BI for validating data visualizations.
- Familiarity with Agile methodologies.
- Knowledge of data privacy and security best practices.
- Experience in a production operations environment.
- Understanding of statistical analysis methods for effective QA.
About Kroll: Kroll is the premier global valuation and corporate finance advisor with expertise in complex valuation, disputes and investigations, M&A, restructuring, and compliance and regulatory consulting. Our professionals balance analytical skills, deep market insight, and independence to help our clients make sound decisions. As an organization, we think globally and encourage our people to do the same. Kroll is committed to equal opportunity and diversity, and recruits people based on merit. If you are interested in this position, please formally apply via careers.kroll.com.
Posted 1 month ago
1.0 - 5.0 years
0 - 0 Lacs
hyderabad, telangana
On-site
As a BE graduate with 1 to 3 years of experience in databases, you will be responsible for day-to-day tasks related to database management, ETL processes, data analysis, and ensuring data integrity and security. This full-time on-site role is located in Hyderabad and offers the opportunity to work in the IT consulting services sector within the banking industry. Your key responsibilities will include managing databases, performing data analysis, and working with ETL processes. You should have a strong understanding of database systems and query languages, along with experience in ETL processes. Problem-solving skills and analytical abilities will be crucial in this role. In addition to technical skills, good communication and teamwork abilities are essential for effective collaboration with team members. A Bachelor's degree in Computer Science, Information Technology, or a related field is required, and prior experience in the banking sector will be advantageous. The salary for this role is in the range of 40-60K, along with a travel allowance covering 15 days of travel. If you are looking to further develop your career in IT consulting services within the banking sector and have the required qualifications and skills, we encourage you to apply for this exciting opportunity.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Database Developer and Designer, your primary responsibility will be building and maintaining Customer Data Platform (CDP) databases to ensure optimal performance and stability.
Key Responsibilities:
- Optimize SQL queries using various techniques to enhance query performance.
- Create visual representations of information systems, maintain database security, and facilitate accessibility, primarily within CDP frameworks.
- Troubleshoot and debug SQL code issues, identifying and resolving problems efficiently.
- Handle data integration tasks, including importing and exporting events, user profiles, and audience changes to platforms like BigQuery.
- Use BigQuery for auditing, querying, and reporting through data visualization tools.
- Set up and manage user and service account authorizations for data access.
- Manage the integration of Lytics with BigQuery and other data platforms, and handle data export and import processes between Lytics and BigQuery.
- Use data from various source systems to create and integrate with CDP data models and packaged products.
Preferred candidates will have experience with Lytics CDP and CDP certification, demonstrating hands-on experience with at least one Customer Data Platform technology and a comprehensive understanding of the digital marketing ecosystem.
Required Skills:
- Proficiency in SQL and database management, with strong analytical and problem-solving abilities.
- Experience in data modeling and database design, and the ability to optimize and troubleshoot SQL queries.
- Expertise in Google BigQuery and data warehousing.
- Knowledge of data integration and ETL processes, and familiarity with Google Cloud Platform and its services.
- A strong understanding of data security and access management, and proficiency in Lytics and its integration capabilities.
- Experience with data import/export processes and knowledge of authorization methods and security practices.
- Strong communication and project management skills, with the ability to adapt to new CDP technologies and deliver results in a fast-paced environment.
Overall, this role is vital for efficient data management, enabling informed decision-making through optimized database design and seamless integration within CDP frameworks.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
coimbatore, tamil nadu
On-site
About Mindcurv (Part of Accenture Song)
Mindcurv, a part of Accenture Song, helps customers reimagine their digital business, experiences, and technology to thrive in the new digital landscape. By crafting sustainable and accountable solutions for people living in a digital world, Mindcurv addresses the market's imperative to digitalize business processes and enhance customer experiences while capitalizing on cloud technologies through the adoption of DevOps and agile methodologies.
Your Role
As a Senior Data Engineer at Mindcurv, you will play a key role within our expanding Data and Analytics team. We are looking for a candidate with extensive proficiency in Databricks and cloud data warehousing, and a proven history of developing scalable data pipelines, optimizing data architectures, and enabling robust analytics capabilities. In this position, you will collaborate with diverse teams to ensure the organization harnesses data as a strategic asset. Your responsibilities will include:
- Designing, building, and managing scalable data pipelines and ETL processes using Databricks and other contemporary tools.
- Architecting, implementing, and overseeing cloud-based data warehousing solutions on Databricks, following the Lakehouse Architecture model.
- Creating and maintaining optimized data lake architectures to support advanced analytics and machine learning applications.
- Engaging with stakeholders to gather requirements, design solutions, and ensure the delivery of high-quality data.
- Optimizing data pipelines for performance and cost efficiency.
- Enforcing best practices for data governance, access control, security, and compliance in the cloud.
- Monitoring and troubleshooting data pipelines to ensure their reliability and accuracy.
- Mentoring junior engineers to cultivate a culture of continuous learning and innovation.
- Communicating clearly and collaborating effectively with clients primarily based in Western Europe.
Who You Are
To excel in this role, you should have the following qualifications:
- Bachelor's or Master's degree in Computer Science or a related field.
- At least 5 years of experience in data engineering roles focused on cloud platforms.
- Proficiency in Databricks, including Spark-based ETL, Delta Lake, and SQL.
- Solid experience with a major cloud platform (preferably AWS).
- Hands-on experience with Delta Lake, Unity Catalog, and Lakehouse architecture concepts.
- Strong programming skills in Python and SQL, with PySpark experience considered a plus.
- Sound understanding of data modeling principles, including star schema and dimensional modeling.
- Familiarity with CI/CD practices, version control systems like Git, and data governance and security standards such as GDPR and CCPA compliance.
Nice-to-Have Qualifications
- Experience with Airflow or similar workflow orchestration tools.
- Exposure to machine learning workflows and MLOps.
- Certifications in Databricks and AWS.
- Familiarity with data visualization tools like Power BI.
What Do We Offer You
At Mindcurv, we provide perks such as refreshments, team events, and attractive compensation packages. We also offer intellectually stimulating projects involving cutting-edge technologies, an agile and entrepreneurial atmosphere free of office politics, work-life balance, a transparent culture, and a management team that values feedback.
Our High Performers
Individuals who excel at Mindcurv are self-starters, team players, and continuous learners who thrive in ambiguous situations. We empower our employees with the resources they need to succeed, encourage exploration within their domain, and offer continuous growth opportunities to enrich their careers.
Ready for Change?
If you are prepared for the next phase in your career - a role that allows you to be authentic and bring out the best in yourself, your colleagues, and clients - do not hesitate to apply for this opportunity now. Join us at Mindcurv and embark on a journey of professional growth and fulfillment.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You are a Data Engineer with 5-8 years of experience in AWS data engineering. Your role involves implementing data lake and warehousing strategies to support analytics, AI, and machine learning initiatives. You will be responsible for developing and maintaining scalable data pipelines, managing data storage solutions using S3 buckets, and optimizing data retrieval and query performance. Your expertise in Python, proficiency in Snowflake and AWS services, and strong understanding of data warehousing, ETL processes, and cloud data storage will be crucial in this role.
You will collaborate with cross-functional teams to deliver solutions that align with business goals, ensure compliance with data governance and security policies, and define strategies to leverage existing large datasets. Additionally, you will identify relevant data sources for client business needs, mine big-data stores, clean and validate data, and build cloud-based solutions with AWS SageMaker and Snowflake. Your problem-solving skills, ability to work in a dynamic environment, and strong communication skills will be essential for effective collaboration and documentation.
In this role, you will play a key part in building scalable data solutions for AI and machine learning initiatives, leveraging AWS and Snowflake technologies to support data infrastructure needs in the fintech capital markets space. As part of the team at FIS, a leading fintech provider, you will have the opportunity to work on challenging and relevant issues in financial services and technology.
Posted 1 month ago
6.0 - 13.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
You are required to have over 13 years of experience in IT, with at least 6 years in roles such as Technical Product Manager, Technical Program Manager, or Delivery Lead. Your responsibilities will involve overseeing the end-to-end delivery of data platform, AI, BI, and analytics projects to ensure they align with business objectives and stakeholder expectations.
Key responsibilities include:
- Developing and maintaining project plans, roadmaps, and timelines for data ingestion, transformation, governance, AI/ML models, and analytics deliverables.
- Leading cross-functional teams, including data engineers, data scientists, BI analysts, architects, and business stakeholders, to deliver high-quality, scalable solutions within the set budget and timeframe.
- Defining, prioritizing, and managing product and project backlogs focused on data pipelines, data quality, governance, AI services, and BI dashboards.
- Collaborating with business units to gather requirements and translate them into actionable user stories and acceptance criteria.
- Overseeing BI and analytics workstreams, ensuring data quality, lineage, security, and compliance requirements are incorporated throughout the project lifecycle.
- Coordinating UAT, performance testing, and user training, and acting as the primary point of contact for project stakeholders.
- Facilitating agile ceremonies, driving post-deployment monitoring, and optimizing data and BI solutions to meet evolving business needs and performance standards.
Primary Skills: - 13+ years of IT experience with 6+ years in relevant roles - Hands-on experience in data engineering, data pipelines, ETL processes, and data integration workflows - Proven track record managing data engineering, analytics, or AI/ML projects end to end - Strong understanding of modern data architecture and cloud platforms (Azure, AWS, GCP) - Proficiency in Agile methodologies, sprint planning, and backlog grooming - Excellent communication and stakeholder management skills Secondary Skills: - Background in computer science, engineering, data science, or analytics - Experience with data engineering tools and services in AWS, Azure & GCP - Understanding of BI, Analytics, LLMs, RAG, prompt engineering, or agent-based AI systems - Experience leading cross-functional teams in matrixed environments - Certifications such as PMP, CSM, SAFe, or equivalent are a plus Role: Technical Project Manager (Data) Location: Trivandrum/Kochi Close Date: 08-08-2025
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
ahmedabad, gujarat
On-site
We are looking for a skilled Database Developer to become a part of our team. As a Database Developer with 3-6 years of experience in database development, you will need to have a strong grasp of database design principles, SQL programming, and data modeling. Your primary responsibilities will include designing, implementing, and maintaining database solutions to cater to our organization's data requirements. Key Responsibilities: - Design, develop, and maintain relational databases for various applications and systems within the organization. - Conduct database tuning and optimization to ensure top-notch performance and scalability. - Create SQL queries, stored procedures, and functions to aid application development and data retrieval. - Collaborate with diverse teams to collect requirements and devise database solutions aligning with business needs. - Perform data modeling to construct logical and physical database structures. - Establish and uphold database security measures to safeguard sensitive data. - Quickly troubleshoot and resolve any database-related issues. - Properly document database design, processes, and procedures. Qualifications: - Bachelor's degree in Computer Science, Information Technology, or a related field. - 3-6 years of hands-on experience in database development. - Proficiency in SQL programming and familiarity with relational database management systems such as MySQL, PostgreSQL, SQL Server. - Sound knowledge of database design principles and data modeling techniques. - Experience in database tuning, optimization, and performance monitoring. - Understanding of database security best practices. - Excellent problem-solving abilities with keen attention to detail. - Capability to work autonomously and collaboratively in a team setup. - Strong communication and interpersonal skills. Preferred Qualifications: - Exposure to NoSQL databases like MongoDB, Cassandra would be an added advantage. 
- Knowledge of ETL (Extract, Transform, Load) processes and associated tools.
- Familiarity with cloud-based database services such as AWS RDS and Azure SQL Database.
- Certification in database administration or development is desirable.
Posted 1 month ago
6.0 - 8.0 years
3 - 15 Lacs
Hyderabad, Telangana, India
On-site
Description
We are seeking an experienced Database Engineer to join our dynamic team in India. The ideal candidate will have a strong background in database design and administration, with a proven ability to optimize performance and ensure data integrity. You will play a crucial role in maintaining our database systems and supporting our application development teams.
Responsibilities
- Design, implement, and maintain database systems that support business applications.
- Optimize database performance and ensure high availability and reliability.
- Develop and implement data backup and recovery strategies.
- Monitor database performance and troubleshoot issues as they arise.
- Collaborate with software developers to create and optimize database queries and stored procedures.
- Ensure data security and integrity through appropriate measures and practices.
- Document database designs, processes, and procedures for future reference.
Skills and Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6-8 years of experience in database design, development, and administration.
- Strong knowledge of SQL and experience with database management systems such as MySQL, PostgreSQL, Oracle, or SQL Server.
- Experience in performance tuning and optimization of database queries.
- Familiarity with database backup and recovery techniques.
- Knowledge of data modeling and database architecture principles.
- Ability to write complex SQL queries and stored procedures.
- Understanding of data security principles and practices.
- Strong analytical and problem-solving skills.
Posted 1 month ago
6.0 - 8.0 years
3 - 15 Lacs
Pune, Maharashtra, India
On-site
Description
We are seeking an experienced Azure Databricks professional to join our team in India. The ideal candidate will have a strong background in data engineering and be proficient in leveraging Azure Databricks to build robust data solutions.
Responsibilities
- Design and implement data pipelines using Azure Databricks to process large datasets efficiently.
- Collaborate with data engineers and data scientists to build scalable and maintainable data solutions.
- Optimize existing Databricks workflows for performance and reliability.
- Monitor and troubleshoot Azure Databricks jobs to ensure data integrity and availability.
- Develop and maintain documentation for data processes and architectures.
Skills and Qualifications
- 6-8 years of experience in data engineering or a related field.
- Strong proficiency in Apache Spark and the Databricks platform.
- Hands-on experience with Azure services, particularly Azure Data Lake, Azure SQL Database, and Azure Data Factory.
- Proficiency in programming languages such as Python, Scala, or SQL.
- Experience with data modeling, ETL processes, and data warehousing concepts.
- Familiarity with DevOps practices and CI/CD pipelines in a cloud environment.
- Ability to analyze and visualize data using tools like Power BI or Tableau.
Posted 1 month ago