UserReady is a digital product design and development agency focused on building intuitive user experiences and effective digital solutions.
Not specified
INR 6.0 - 9.0 Lacs P.A.
Work from Office
Full Time
Role: Senior Tableau Consultant. Project: CTI Tableau Specialist.
Job Details: Tableau dashboard development, the art of storytelling, data literacy and enablement, advanced Server administration (server backup, license management, security in Tableau, Tableau content CI/CD process), and performance management.
Note: We need to map internal UR resources before we go out to the market.
Skills Required: Tableau dashboard development, art of storytelling, data literacy and enablement, server backup, license management, security in Tableau, Tableau content CI/CD process, performance management.
Desirable Skills: same as the required skills above.
Years of Experience: 5 to 8 years.
Not specified
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
The Snowflake Migration Specialist will play a key role in the migration process, ensuring the successful transition of data from legacy systems to Snowflake. This position requires a deep understanding of Snowflake, data warehousing concepts, and cloud-based data architectures. The ideal candidate will have experience in managing data migrations, troubleshooting, and optimizing the performance of data platforms.

Key Responsibilities:
- Lead the end-to-end migration of data and workloads from on-premises data platforms (e.g., Oracle, SQL Server, Teradata) or cloud-based systems to Snowflake.
- Migrate legacy/on-prem ETL/ELT workloads to modern tools that work well with Snowflake (Matillion, Fivetran, dbt, etc.).
- Analyze current data architectures, workloads, and business requirements to design optimal solutions for migration to Snowflake.
- Work closely with stakeholders, including data engineers, business analysts, and IT teams, to understand requirements and provide guidance on Snowflake best practices.
- Develop and execute migration strategies for moving large volumes of data to Snowflake while minimizing downtime and ensuring data integrity.
- Optimize Snowflake workloads, including query performance tuning, data model optimization, and managing storage costs.
- Troubleshoot issues during the migration process and provide timely resolutions.
- Automate migration tasks and processes using scripts, tools, and APIs to streamline the process and increase efficiency.
- Collaborate with security teams to ensure compliance with data security and governance requirements.
- Provide documentation and knowledge transfer for best practices, migration processes, and post-migration support.

Required Skills and Experience:
- Proven experience with the Snowflake data platform, including migration, implementation, and optimization.
- Strong knowledge of data warehousing concepts and experience with cloud data platforms (Azure, AWS).
- Hands-on experience migrating data from at least one traditional database (e.g., Oracle, SQL Server, Teradata) to Snowflake.
- Proficiency in SQL and experience with ETL processes, data integration, and data pipelines.
- Experience optimizing data processing workloads and query performance within Snowflake.
- Familiarity with Snowflake data security and governance features, including role-based access control and data encryption.
- Knowledge of cloud infrastructure (AWS, Azure) and associated services (e.g., S3, Azure Blob Storage) used with Snowflake.
- Familiarity with data pipeline orchestration tools such as Apache Airflow or similar.
- Strong troubleshooting and problem-solving skills.
- Excellent communication skills and the ability to work effectively with cross-functional teams.
- Ability to work independently, manage multiple priorities, and meet deadlines.

Preferred Qualifications:
- Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is highly desirable.
- Experience with at least one ETL/ELT tool such as SSIS, Talend, Informatica, Matillion, or the like.
- Knowledge of data modeling and schema design in Snowflake.
- Experience with Python or other programming languages for automation and migration tasks.
- Knowledge of cloud platform services (AWS, Azure) used with Snowflake.
- Familiarity with CI/CD pipelines for data engineering.
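One small, recurring migration task the responsibilities above describe is verifying data integrity after the move. As a minimal sketch (not from the posting, and with made-up table names and counts), comparing row counts captured from the legacy source against counts captured from Snowflake might look like:

```python
# Hypothetical sketch: flag tables whose post-migration row count in the
# target (e.g., Snowflake) differs from the legacy source. The counts
# would come from COUNT(*) queries on each side; here they are hardcoded
# illustrative values.

def find_count_mismatches(source_counts, target_counts):
    """Return {table: {"source": n, "target": m}} for every mismatch."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table)  # None if the table never arrived
        if tgt != src:
            mismatches[table] = {"source": src, "target": tgt}
    return mismatches

if __name__ == "__main__":
    source = {"ORDERS": 1_000_000, "CUSTOMERS": 52_340, "ITEMS": 88_102}
    target = {"ORDERS": 1_000_000, "CUSTOMERS": 52_339, "ITEMS": 88_102}
    print(find_count_mismatches(source, target))
```

In practice such a check would run per batch alongside checksums or sampled value comparisons, since matching counts alone do not prove matching content.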
Not specified
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Visier, an HR analytics software company, is looking for a BI Analyst to join their delivery team. The delivery team assists new customers with setting up the platform and with connecting the customers' data and data platforms to the Visier application. The position requires general BI Analyst skills; knowledge and general experience of data and data platforms; previous experience working directly with customers (not just as part of a team where someone else interacted with the customer); good communication skills; and the ability to work independently while also consulting with team members.
Not specified
INR 8.0 - 13.0 Lacs P.A.
Work from Office
Full Time
Job Summary: We are seeking a skilled Tableau Server Administrator with expertise in Linux environments. The role involves installing, configuring, and maintaining Tableau Server on Linux to ensure optimal performance, security, and availability. You will work closely with data teams to support business intelligence needs and ensure efficient server operation. The role requires operating from the client office in Bangalore.

Key Responsibilities:
- Install, configure, and manage Tableau Server on a Linux-based system.
- Perform troubleshooting, server optimisation, and performance tuning for Tableau Server.
- Ensure the security, backup, and recovery of Tableau Server.
- Collaborate with cross-functional teams to manage the server.
- Monitor and ensure the smooth functioning of Tableau Server, including user management, permissions, and upgrades.
- Automate tasks using the Tableau REST API, Tabcmd, and shell scripting.

Required Skills & Experience:
- Proven experience in managing Tableau Server on Linux is a must.
- In-depth knowledge of Linux environments, including installation, maintenance, and security.
- Strong troubleshooting and problem-solving skills in a server administration context.
- Experience with Tableau Server upgrades, backups, and configuration management.
- Familiarity with scripting (Bash, Python) for automation and maintenance tasks is a plus.
- Excellent communication and team collaboration skills.

Mandatory Skills: Tableau Administration, Bash Scripting, Linux
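The backup and automation duties above typically revolve around the Tableau Services Manager (`tsm`) CLI on Linux. As an illustrative sketch only (the backup file prefix is an assumption, and the command is built but deliberately not executed), a scheduled backup script might assemble the `tsm maintenance backup` invocation like this:

```python
# Minimal sketch, not an official Tableau script: construct the tsm
# command for a dated Tableau Server backup on Linux. "-f" names the
# backup file; "-d" tells tsm to append the current date. The prefix
# "ts_backup" is an illustrative assumption.

def build_backup_command(file_prefix="ts_backup", append_date=True):
    """Return the tsm argv list for a Tableau Server backup."""
    cmd = ["tsm", "maintenance", "backup", "-f", file_prefix]
    if append_date:
        cmd.append("-d")  # tsm appends today's date to the file name
    return cmd

if __name__ == "__main__":
    # In a real cron job this list would be passed to subprocess.run().
    print(" ".join(build_backup_command()))
```

Keeping the command as an argv list (rather than a shell string) avoids quoting pitfalls when the script later hands it to `subprocess.run()`.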
Not specified
INR 8.0 - 10.0 Lacs P.A.
Work from Office
Full Time
Candidates should have the following relevant experience: data modeller with a minimum of 6 years' experience; strong understanding of ETL and data models; good knowledge of PySpark; excellent communication and presentation skills.
Not specified
INR 12.0 - 22.0 Lacs P.A.
Hybrid
Full Time
Role & Responsibilities:
- Design, develop, and maintain scalable backend applications using Java and Spring Boot.
- Implement Microservices architecture, ensuring high availability and performance.
- Work with NoSQL databases (MongoDB) to manage and optimize data storage.
- Develop and deploy applications on AWS, utilizing cloud services effectively.
- Write unit and integration tests using JUnit to ensure code quality and reliability.
- Collaborate with front-end developers, product managers, and other stakeholders to deliver high-quality software solutions.
- Troubleshoot and optimize application performance and resolve production issues.
- Follow best practices for CI/CD, code reviews, and version control (Git).

Skills & Qualifications:
- Strong proficiency in Java (8 or above) with experience in backend development.
- Expertise in Spring Boot and its ecosystem for developing RESTful APIs and Microservices.
- Hands-on experience with Microservices architecture and service-oriented design.
- Experience with NoSQL databases (MongoDB preferred) for efficient data management.
- Proficiency in AWS services such as EC2, S3, Lambda, API Gateway, etc.
- Solid experience with JUnit and test-driven development (TDD) methodologies.
- Knowledge of containerization (Docker, Kubernetes) is a plus.
- Familiarity with CI/CD pipelines and DevOps practices is an advantage.
- Excellent problem-solving skills and the ability to work in an Agile environment.

Role: develop new features, fix bugs, perform dev testing, deploy services, and handle production fix activities.
Not specified
INR 9.0 - 13.0 Lacs P.A.
Work from Office
Full Time
Job Category: Information Technology
Job Title: Data Engineer II
Number of Positions: 1 (1 remaining)
Location: McKinsey Gurugram office, hybrid mode (not remote)

Duties: You will be a core member of the McKinsey analytics platform team, responsible for extracting large quantities of data from clients' IT systems, developing efficient ETL and data management processes, and building architectures for rapid ingestion and dissemination of key data.

Primary Responsibilities:
- Enhancements, new development, defect resolution, and production support of ETL development using AWS native services. This is a senior role, so 9+ years of total experience is mandatory.
- Experience with data modelling (working with different data sets and their integrations).
- Integrate data sets using AWS services such as Glue and Lambda functions.
- Use AWS SNS to send emails and alerts.
- Author ETL processes using Python and PySpark.
- Monitor ETL processes using CloudWatch events.
- Connect to different data sources like S3 and validate data using Athena.

Competencies / Experience:
- Deep technical skills in AWS Glue (Crawler, Data Catalog): 5 years.
- Hands-on experience with Python: 3 years.
- PL/SQL experience: 3 years.
- CloudFormation and Terraform: 2 years.
- CI/CD with GitHub Actions: 2 years.
- Good understanding of AWS services like S3, SNS, Secrets Manager, Athena, and Lambda: 1 year.
- Additionally, familiarity with any of the following is highly desirable: Jira, GitHub, Python, Snowflake.

Education: Bachelor's degree in a quantitative field like Computer Science, Engineering, Statistics, Mathematics, or a related field is required. An advanced degree is a strong plus.
Languages: English (read, write, speak)
Not specified
INR 20.0 - 25.0 Lacs P.A.
Work from Office
Full Time
In this role, you'll make an impact in the following ways:
• Design and development of features and components.
• Experience using a specific application development toolkit, with knowledge of front-end and back-end coding languages such as Java, SQL, HTML, CSS, JSON, Angular, and JavaScript.
• Knowledge of application frameworks and containerization, such as Spring Boot and Docker, is also required.
• Collaborate with other engineers on design and development.
• Help triage bugs, track software defects, and ensure their timely resolution.
• Follow technical and quality standards; interface with product and other functional teams and their leadership.
• Program well-designed, testable, efficient code.
• Analyze, design, and develop tests and test-automation suites, with thorough knowledge of the Software Development Life Cycle.
• Develop flowcharts, layouts, and documentation to satisfy requirements and solutions.
• Apply security and privacy principles.
• Troubleshoot, debug, and upgrade existing systems, and ensure software is updated with the latest features.
• Participate in the deployment process, following all change controls.
• Leverage existing products/functionality and promote reuse.
• Collaborate with business users, project managers, and engineers to achieve elegant solutions.
• Actively participate in code reviews and create test plans and test data.
• Ensure expected application performance levels are achieved by coordinating coding, testing, implementation, and documentation.
• Provide ongoing maintenance, support, and enhancements for existing systems and platforms, and provide recommendations for continuous improvement.
• Engage in active learning: complete all required mandatory training and policy awareness curricula on time, and use learning tools such as BK Live to complete both recommended and aspirational targets set in personal development plans.
• Demonstrate teamwork by working alongside other engineers to elevate technology, consistently apply best practices, and take shared responsibility for the efforts the team has committed to.
• Utilize local meetups to gain and share knowledge.
• Act as a mentor to junior-level engineers.

To be successful in this role, we're seeking the following:
• Bachelor's degree in computer science, engineering, or a related discipline, or equivalent work experience, required.
• 5-7 years of experience in software development.
• Experience in the securities or financial services industry is a plus.
• Understanding of interdependencies and the business impact of future IT plans.
• Prior lead experience selecting and implementing vendor-specific methodologies, and prior consulting experience with structured methodologies.
• Extensive experience developing and supporting front-end and back-end development is required, along with broad experience in multi-platform development tools and toolkits.
Not specified
INR 25.0 - 30.0 Lacs P.A.
Work from Office
Full Time
We are seeking a highly motivated and experienced Senior Data Engineer to join our growing data team. In this role, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines to support our data-driven initiatives. You will leverage your strong Python programming skills and deep understanding of data engineering principles to build efficient and reliable data solutions.

Responsibilities:
- Design, develop, and maintain data pipelines for data extraction, transformation, and loading (ETL/ELT) processes.
- Write clean, efficient, and well-documented Python code for data processing and automation.
- Build and optimize data models and schemas for efficient data storage and retrieval.
- Collaborate with data scientists and analysts to understand their data requirements and provide data solutions.
- Monitor and troubleshoot data pipeline performance and ensure data quality.
- Implement data governance and security best practices.
- Contribute to the development of data engineering standards and best practices.
- Stay up to date with the latest data engineering technologies and trends.
- Optimize existing data pipelines for performance and scalability.
- Participate in code reviews and contribute to a collaborative development environment.

Mandatory Skills:
- 5-8 years of experience in data engineering.
- Expert-level proficiency in Python programming.
- Strong understanding of data engineering principles and best practices.
- Experience with data modelling and schema design.
- Experience with data extraction, transformation, and loading (ETL/ELT) processes.
- Proven ability to write clean, efficient, and maintainable code.
- Ability to work independently and as part of a team.
- Strong problem-solving and analytical skills.

Desirable Skills:
- Experience building and managing data pipelines (NumPy, Pandas).
- Some experience with data warehousing solutions (e.g., Snowflake).
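As a flavour of the "clean, well-documented Python code for data processing" the role asks for, here is a deliberately tiny, self-contained extract-transform step. The field names, sample data, and cleaning rules are invented for illustration; stdlib `csv` and `io` keep it runnable anywhere:

```python
# Illustrative sketch: parse raw CSV, drop records failing a basic
# data-quality rule (missing amount), and normalize types and casing.
import csv
import io

RAW = """id,amount,currency
1, 100.50 ,usd
2,42,USD
3,,usd
"""

def transform(raw_csv):
    """Return cleaned rows: typed id/amount, upper-cased currency."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        amount = rec["amount"].strip()
        if not amount:
            continue  # data-quality rule: reject rows with no amount
        rows.append({"id": int(rec["id"]),
                     "amount": float(amount),
                     "currency": rec["currency"].strip().upper()})
    return rows

if __name__ == "__main__":
    print(transform(RAW))
```

A production pipeline would read from files or object storage and log rejected rows rather than silently skipping them, but the shape of the step (parse, validate, normalize) is the same.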
Not specified
INR 14.0 - 19.0 Lacs P.A.
Work from Office
Full Time
The candidate will:
1) Be at the forefront of creating visually impactful Tableau dashboards.
2) Have expertise in translating business requirements into meaningful insights.
3) Have experience telling a story using a Tableau dashboard.
4) Be able to lead others by example and through coaching.
5) Have end-to-end expertise in the Tableau platform, including how to use its different features, best practices around administration, and familiarity with how Tableau can integrate with other platforms.
Not required, but it would be helpful if the candidate understands Tableau architecture principles. The person needs to have excellent communication skills. Tableau certification(s) are expected. The candidate will need to showcase their Tableau dashboards as part of the consideration process.
Not specified
INR 30.0 - 35.0 Lacs P.A.
Work from Office
Full Time
Job Description: We seek a highly skilled Data Engineer with 3-5 years of experience to join our team. The ideal candidate should have expertise in Snowflake, dbt, Python, and Airflow, along with strong SQL knowledge. Experience with cloud platforms (AWS, Azure, or GCP) is a plus.

Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion into Snowflake using Python, dbt, and Airflow.
- Write optimized and efficient SQL queries for data transformation, modeling, and analysis.
- Develop and maintain data models and data pipelines to ensure data integrity and performance.
- Collaborate with data analysts, data scientists, and business teams to understand data requirements and optimize data workflows.
- Automate and monitor data pipelines using Airflow for scheduling and orchestration.
- Implement data quality checks, validation rules, and performance optimizations.
- Work with any one of the cloud platforms (AWS, Azure, GCP) to deploy and manage data infrastructure.
- Ensure security, governance, and compliance of data systems.

Required Skills:
- Strong experience with Snowflake (schema design, performance tuning, Snowflake-specific features).
- Proficiency in SQL (writing complex queries, indexing, performance tuning).
- Experience with dbt (Data Build Tool) for data transformation and modeling.
- Python programming for automation and data processing.
- Experience with Airflow for workflow orchestration.
- Good understanding of data warehousing concepts and best practices.

Nice-to-Have Skills:
- Experience with any one of the cloud platforms (AWS, Azure, or GCP).
- Familiarity with CI/CD for data pipelines and version control (Git, GitHub, Bitbucket).
- Experience with data governance, security, and compliance best practices.
- Exposure to other data tools like Kafka, Spark, or Terraform is a plus.
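The "data quality checks, validation rules" responsibility above can be sketched in plain Python: a rule is just a name plus a predicate applied row by row, and a pipeline task fails or alerts when any rule reports offenders. The rules and sample rows here are made up for illustration:

```python
# Hedged sketch of row-level data-quality validation. In a real stack
# this logic would live in a dbt test or an Airflow task; here it is a
# plain function so the idea stands alone.

def run_checks(rows, rules):
    """Return {rule_name: [offending row indexes]} for failed rules."""
    failures = {}
    for name, predicate in rules.items():
        bad = [i for i, row in enumerate(rows) if not predicate(row)]
        if bad:
            failures[name] = bad
    return failures

if __name__ == "__main__":
    rows = [{"id": 1, "qty": 5}, {"id": 2, "qty": -1}, {"id": None, "qty": 3}]
    rules = {
        "qty_non_negative": lambda r: r["qty"] >= 0,
        "id_not_null": lambda r: r["id"] is not None,
    }
    print(run_checks(rows, rules))
```

Returning indexes rather than a bare pass/fail makes the failure report actionable: the orchestrator can quarantine exactly the offending rows instead of rejecting the whole batch.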
Not specified
INR 45.0 - 50.0 Lacs P.A.
Work from Office
Full Time
Job Summary: We are looking for an experienced Lead Data Engineer to drive data engineering initiatives, architect scalable data solutions, and lead a team of engineers. The ideal candidate will have 10+ years of experience, specializing in Snowflake, Matillion, and cloud platforms (Azure/AWS), with expertise in BI tools like Power BI and Tableau. This role requires strong SQL skills, analytical capabilities, excellent communication, and leadership abilities. The candidate should be able to scope projects, interact with clients, and deliver efficient data solutions.

Key Responsibilities:
- Data Architecture & Engineering: Design and develop scalable, high-performance data solutions using Snowflake, Matillion, and cloud environments (Azure/AWS).
- ETL/ELT Development: Build and optimize ETL/ELT pipelines leveraging Matillion, Snowflake native capabilities, dbt, and other data integration tools.
- BI & Data Visualization: Use Power BI, Tableau, and DOMO to develop insightful reports and dashboards, ensuring seamless data flow from the backend to visualization layers.
- SQL Development & Optimization: Write and optimize complex SQL queries for performance tuning, data transformation, and analytics.
- Cloud Data Management: Implement data lakes, data warehouses, and data marts on Azure/AWS, ensuring security, scalability, and cost efficiency.
- Data Modeling & Data Warehousing: Design and implement dimensional models, star/snowflake schemas, and data warehouse best practices.
- Project Scoping & Client Engagement: Collaborate with stakeholders to understand business requirements, define project scope, and deliver tailored data solutions.
- Team Leadership: Lead a team of data engineers, conduct code reviews, mentor junior engineers, and establish best practices.
- Automation & Performance Optimization: Automate workflows, monitor system performance, and optimize data processes for efficiency.
- Data Governance & Security: Ensure compliance with industry best practices, data privacy laws, and security protocols.
- Workflow Orchestration: Implement and manage orchestration tools like Apache Airflow, Prefect, or Dagster for automated data workflows.

Required Skills & Qualifications:
- 10+ years of experience in data engineering and cloud-based data solutions.
- Expertise in Snowflake, including performance tuning, data sharing, and security management.
- Strong experience with Matillion for ETL/ELT development.
- Proficiency in Azure or AWS cloud services (e.g., AWS Redshift, S3, Glue, Azure Synapse, Data Factory).
- Hands-on experience with BI tools such as Power BI and Tableau.
- Exceptional SQL skills for query optimization and data modeling.
- Strong analytical and problem-solving skills to translate business needs into technical solutions.
- Proven ability to lead teams, mentor engineers, and manage projects effectively.
- Excellent communication skills to collaborate with clients, stakeholders, and cross-functional teams.
- Experience with Python or other scripting languages for data processing is a plus.
- Experience with dbt for data transformation and modeling.
- Knowledge of workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
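The dimensional-modeling responsibility above (star/snowflake schemas) ultimately comes down to assigning surrogate keys when loading a dimension table. As a minimal sketch under made-up assumptions (an in-memory customer dimension keyed by natural key, with no slowly-changing-dimension handling):

```python
# Illustrative sketch: the surrogate-key lookup at the heart of a star
# schema dimension load. Real warehouses use sequences or IDENTITY
# columns; a plain dict stands in for the dimension table here.

def load_dimension(dim, natural_keys):
    """Assign a stable surrogate key to each new natural key; repeats
    keep their existing key, so the load is idempotent."""
    for nk in natural_keys:
        if nk not in dim:
            dim[nk] = len(dim) + 1  # next surrogate key
    return dim

if __name__ == "__main__":
    dim = {}
    load_dimension(dim, ["CUST-001", "CUST-002"])
    load_dimension(dim, ["CUST-002", "CUST-003"])  # CUST-002 keeps key 2
    print(dim)
```

Fact rows then store the small integer surrogate key instead of the natural key, which keeps the fact table narrow and insulates it from source-system key changes.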
Not specified
INR 12.0 - 16.5 Lacs P.A.
Work from Office
Full Time
Not specified
INR 13.0 - 18.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 13.0 - 18.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 10.0 - 15.0 Lacs P.A.
Work from Office
Full Time
Not specified
0.0 - 0.0 Lacs P.A.
On-site
Not specified
Not specified
INR 8.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 9.0 - 13.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 20.0 - 25.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 14.0 - 16.0 Lacs P.A.
Work from Office
Full Time