
126 Data Warehousing Jobs in Kolkata

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 11.0 years

6 - 7 Lacs

Noida, Kolkata, Pune

Work from Office

8 to 11 years (ATA). Notice period: Immediate to 30 days. Mandatory skills: the candidate should have a Data Engineering background with Snowflake, SQL, and Databricks (all mandatory). Good to have: Python, Terraform. Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. Strong hands-on programming skills in core technologies are a must. Hands-on experience in designing scalable solutions. Data engineering, integration, and data modeling experience. Can write scalable, performant pipelines, queries, and summaries of data (a short query sketch follows this listing). Has worked with various data systems and tools. Understands analytics and data science workflows and the common use cases that leverage their work. Snowflake experience (must). Python (good to have). SQL (must). Data warehouse experience. AWS experience (good to have). Data QA/validation skills (to check their work). Matillion, DBT, or other data tech experience (good to have).
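
The listing above centers on Snowflake and SQL pipeline work. As a rough illustration of the kind of summary query such a role involves, here is a minimal sketch using the snowflake-connector-python package; the account credentials and the sales table (with order_date and amount columns) are hypothetical placeholders, not details from the posting.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical credentials and objects; replace with real values.
conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="etl_user",           # placeholder
    password="***",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

# A daily revenue summary of the kind pipelines typically materialize.
summary_sql = """
    SELECT DATE_TRUNC('day', order_date) AS order_day,
           COUNT(*)                      AS orders,
           SUM(amount)                   AS revenue
    FROM sales
    GROUP BY order_day
    ORDER BY order_day
"""

cur = conn.cursor()
try:
    for order_day, orders, revenue in cur.execute(summary_sql):
        print(order_day, orders, revenue)
finally:
    cur.close()
    conn.close()
```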

Posted 16 hours ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Kolkata

Work from Office

The ability to be a team player. The ability and skill to train other people in procedural and technical topics. Strong communication and collaboration skills. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: able to write complex SQL queries; experience with Azure Databricks. Preferred technical and professional experience: excellent communication and stakeholder management skills.

Posted 1 day ago

Apply

6.0 - 8.0 years

8 - 12 Lacs

Kolkata

Work from Office

Job Summary: We are seeking an experienced Data Engineer with strong expertise in Databricks, Python, PySpark, and Power BI, along with a solid background in data integration and the modern Azure ecosystem. The ideal candidate will play a critical role in designing, developing, and implementing scalable data engineering solutions and pipelines.

Key Responsibilities:
- Design, develop, and implement robust data solutions using Azure Data Factory, Databricks, and related data engineering tools.
- Build and maintain scalable ETL/ELT pipelines with a focus on performance and reliability (see the PySpark sketch after this listing).
- Write efficient and reusable code using Python and PySpark.
- Perform data cleansing, transformation, and migration across various platforms.
- Bring at least 1.5 to 2 years of hands-on experience with Azure Data Factory (ADF).
- Develop and optimize SQL queries and stored procedures, and manage large data sets using SQL Server, T-SQL, PL/SQL, etc.
- Collaborate with cross-functional teams to understand business requirements and provide data-driven solutions.
- Engage directly with clients and business stakeholders to gather requirements, suggest optimal solutions, and ensure successful delivery.
- Work with Power BI for basic reporting and data visualization tasks.
- Apply strong knowledge of data warehousing concepts, modern data platforms, and cloud-based analytics.
- Adhere to coding standards and best practices, including thorough documentation and testing (unit, integration, performance).
- Support the operations, maintenance, and enhancement of existing data pipelines and architecture.
- Estimate tasks and plan release cycles effectively.

Required Technical Skills:
- Languages & Frameworks: Python, PySpark
- Cloud & Tools: Azure Data Factory, Databricks, Azure ecosystem
- Databases: SQL Server, T-SQL, PL/SQL
- Reporting & BI Tools: Power BI
- Data Concepts: Data Warehousing, ETL/ELT, Data Cleansing, Data Migration
- Other: Version control, Agile methodologies, good problem-solving skills

Preferred Qualifications:
- Experience with coding in Pysense within Databricks (added advantage)
- Solid understanding of cloud data architecture and analytics processes
- Ability to independently initiate and lead conversations with business stakeholders
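
Since the role above emphasizes data cleansing and transformation with PySpark on Databricks, here is a minimal, hedged sketch of a cleansing step; the input path, column names, and output location are illustrative assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleanse_orders").getOrCreate()

# Hypothetical raw input; on Databricks this might be a mounted ADLS path.
raw = spark.read.option("header", True).csv("/mnt/raw/orders.csv")

clean = (
    raw.dropDuplicates(["order_id"])                          # remove duplicate orders
       .withColumn("amount", F.col("amount").cast("double"))  # enforce numeric type
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .filter(F.col("amount").isNotNull())                   # drop rows that failed the cast
)

# Write as Parquet (Delta is the usual choice on Databricks).
clean.write.mode("overwrite").parquet("/mnt/curated/orders")
```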

Posted 2 days ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

This is a remote position.

Overview: 7-12 years of experience with the AWS data landscape and data ingestion pipelines. Able to understand and explain data ingestion from different sources such as files, databases, and applications. Build and enhance the Python/PySpark-based ingestion framework. Data engineering experience in AWS data services such as Glue, EMR, Airflow, CloudWatch, Lambda, Step Functions, and event triggers. Able to work as a senior engineer and act as the sole interaction point with different business functional teams.

Requirements:
- 7-12 years of experience in ETL and Data Engineering roles.
- AWS Glue, PySpark, and Amazon Redshift.
- Strong command of SQL and procedural programming in cloud or enterprise databases.
- Deep understanding of data warehousing concepts and data modeling.
- Proven ability to deliver efficient, well-documented, and scalable data pipelines on AWS (a minimal Glue job sketch follows this listing).
- Familiarity with Airflow, AWS Lambda, and other orchestration tools is a plus.
- AWS Certification (e.g., AWS Data Analytics Specialty) is an advantage.

Benefits: At Exavalu, we are committed to building a diverse and inclusive workforce. We welcome applications for employment from all qualified candidates, regardless of race, color, gender, national or ethnic origin, age, disability, religion, sexual orientation, gender identity, or any other status protected by applicable law. We nurture a culture that embraces all individuals and promotes diverse perspectives, where you can make an impact and grow your career. Exavalu also promotes flexibility depending on the needs of employees, customers, and the business: it might be part-time work, working outside normal 9-5 business hours, or working remotely. We also have a welcome-back program to help people return to the mainstream after a long break due to health or family reasons.

Job Type: Full time
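
For the AWS Glue/PySpark ingestion work described above, the following is a minimal Glue job skeleton of the standard shape; the catalog database, table, and S3 target are hypothetical placeholders.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job boilerplate.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from a hypothetical Glue Data Catalog table.
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="orders"
)

# Drop obviously bad records before landing the data.
valid = source.toDF().filter("order_id IS NOT NULL")

# Write Parquet to a hypothetical curated bucket.
valid.write.mode("overwrite").parquet("s3://my-curated-bucket/orders/")

job.commit()
```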

Posted 3 days ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

BigQuery ML: Develop, train, evaluate, and deploy machine learning models using BigQuery ML. Write complex SQL queries to prepare, transform, and analyze large datasets. Build classification, regression, time-series forecasting, and clustering models within BigQuery. Work with business stakeholders to understand requirements and translate them into analytical solutions. Automate ML workflows using scheduled queries, Cloud Composer, or Dataform. Visualize and communicate results to both technical and non-technical stakeholders using Looker or other BI tools. Optimize the performance and cost-efficiency of ML models and queries within BigQuery (a minimal training sketch follows this listing). Proficiency in SQL and working with large-scale data warehousing solutions. Familiarity with ML evaluation metrics, feature engineering, and data preprocessing. Knowledge of Google Cloud Platform (GCP) services such as Cloud Storage, Dataflow, and Cloud Functions. Strong communication and problem-solving skills.
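
To make the BigQuery ML workflow above concrete, here is a minimal sketch that trains and evaluates a logistic regression model from Python; the project, dataset, table, and column names are invented for illustration.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses default GCP credentials

# Train a churn classifier inside BigQuery (hypothetical dataset/table).
train_sql = """
CREATE OR REPLACE MODEL `my_project.analytics.churn_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
SELECT tenure_months, monthly_charges, support_tickets, churned
FROM `my_project.analytics.customers`
"""
client.query(train_sql).result()  # blocks until training finishes

# Evaluate the model with ML.EVALUATE.
eval_sql = "SELECT * FROM ML.EVALUATE(MODEL `my_project.analytics.churn_model`)"
for row in client.query(eval_sql).result():
    print(dict(row))
```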

Posted 3 days ago

Apply

12.0 - 17.0 years

25 - 30 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Role Summary: Aurigo is leading the charge in transforming enterprise SaaS for infrastructure, and AI is at the core of this revolution. With Aurigo Lumina, our AI copilot, we are redefining how infrastructure owners interact with Masterworks, driving smarter, faster, and more efficient decision-making. We have set up AI Labs to build AI-powered software and transform Aurigo into an AI-first company, with AI initiatives across our products and platforms. We are seeking a highly skilled Principal Data Engineer (Agentic Architectures) to lead the strategic design, development, and scalable implementation of autonomous AI systems within our organization. This role demands an individual with deep expertise in cutting-edge AI architecture, a strong commitment to responsible AI practices, and a proven ability to drive innovation. The ideal candidate will architect and engineer autonomous decision-making systems at scale that integrate seamlessly with enterprise workflows. You will help build autonomous systems that reason, plan, and act, delivering scalable, intelligent solutions for real-world impact.

Key Responsibilities:
- AI System Design and Architecture: Create autonomous, scalable, and reliable AI systems that function well with cloud computing, enterprise workflows, and sophisticated LLM frameworks. Specify blueprints for agents, pipelines, and APIs to facilitate dynamic, context-aware AI decision-making.
- Customization and Optimization: Create plans for optimizing autonomous AI models for tasks unique to the infrastructure sector, such as capital planning, construction, and operation. Establish procedures for fine-tuning LLMs and multi-agent frameworks to conform to general architectural principles and business objectives.
- Knowledge of Frameworks and Platforms: Evaluate, recommend, and put into practice cutting-edge AI frameworks and tools, with an emphasis on autonomous AI systems, multi-agent frameworks, and LLM-driven decision engines. Encourage the use of cloud platforms for scalable AI implementations, such as AWS Bedrock and Amazon SageMaker.
- Innovation and Research Integration: Lead the charge to advance agentic AI capabilities by integrating R&D projects into production infrastructures. Assess and test self-improving AI systems and new frameworks for architectural viability. Push proofs of concept to production-grade systems within practical limitations such as performance metrics, latency, and scale. Put in place guardrails, human-in-the-loop procedures, and validation to guarantee safety, dependability, and compliance.
- Strategic AI Leadership: Act as technical lead for interdisciplinary groups of data scientists, developers, and AI engineers in the adoption and use of cutting-edge cloud and AI systems. Construct and enhance essential architectural elements such as the tool/function integration layer, agent orchestration layer, data storage and retrieval layer, and reasoning layer.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field.
- 12+ years of progressive experience in AI/ML, with a strong track record as a Data Engineer, AI/ML Architect, or Solutions Lead.
- 2+ years specifically focused on designing and implementing Gen AI solutions.
- Experience building autonomous/agentic AI systems (e.g., multi-agent frameworks, self-optimizing systems, or LLM-driven decision engines).
- Extensive hands-on experience with autonomous AI tools and frameworks: LangChain, AutoGen, CrewAI, or architecting custom agentic frameworks.
- Expertise with cloud platforms for AI architecture, including AWS Bedrock and Amazon SageMaker, with thorough knowledge of their AI service offerings.
- Proven experience with LLMOps/MLOps pipelines (e.g., Kubeflow, MLflow) and designing scalable deployment strategies for AI agents in production environments.
- Data warehouses/lakes: BigQuery, Redshift, GCS, S3.
- Orchestration & automation: Kubernetes, Helm, Airflow, Terraform.
- Deep understanding of planning algorithms, decision-making models, and agentic architectures.
- Ability to collaborate with engineers and product teams to translate requirements into robust, scalable AI solutions.

About Aurigo: Aurigo is revolutionizing how the world plans, builds, and manages infrastructure projects with Masterworks, our industry-leading enterprise SaaS platform. Trusted by over 300 customers managing $300 billion in capital programs, Masterworks is setting new standards for project delivery and asset management. Recognized as one of the Top 25 AI Companies of 2024 and a Great Place to Work for three consecutive years, we are leveraging artificial intelligence to create a smarter, more connected future for customers in transportation, water and utilities, healthcare, higher education, and government, with over 40,000 projects across North America. At Aurigo, we don't just develop software, we shape the future. If you're excited to join a fast-growing company and collaborate with some of the brightest minds in the industry to solve real-world challenges, let's connect.

Posted 4 days ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Stack Digital is looking for a Quality Engineer - Data (ETL Tester) to join our dynamic team and embark on a rewarding career journey. Responsibilities include: developing and implementing quality standards; developing and implementing quality control systems; monitoring and analyzing quality performance; inspecting and testing materials, equipment, processes, and products to ensure quality specifications are met; collaborating with operations managers to develop and implement controls and improvements; ensuring that workflows, processes, and products comply with safety regulations; investigating and troubleshooting product or production issues; developing corrective actions, solutions, and improvements; and reviewing codes, specifications, and processes.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 7 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Qualifications: 6-8 years of overall data warehouse development/administration experience, with the most recent 3 years in Snowflake. Strong SQL, Python, JavaScript, Terraform, and cloud (Azure/AWS) expertise. Experience with Azure DevOps CI/CD pipelines or similar technology. Strong understanding of Snowflake architecture, including RBAC, cost optimization, and security best practices (an RBAC sketch follows this listing).
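
The Snowflake administration role above calls out RBAC; as a rough sketch, the statements below show the usual role-and-grant pattern, run here through snowflake-connector-python. The warehouse, database, role, and user names are placeholders, not details from the posting.

```python
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="***",  # placeholders
    role="SECURITYADMIN",
)

# Typical RBAC pattern: a role per access level, granted to users.
statements = [
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.PUBLIC TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.PUBLIC TO ROLE ANALYST_RO",
    "GRANT ROLE ANALYST_RO TO USER some_analyst",  # hypothetical user
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```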

Posted 1 week ago

Apply

10.0 - 15.0 years

15 - 25 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Experience: 10+ years.

Role Overview: We are seeking an experienced AWS Data & Analytics Architect with a strong background in delivery and excellent communication skills. The ideal candidate will have over 10 years of experience and a proven track record in managing teams and client relationships. You will be responsible for leading data modernization and transformation projects using AWS services.

Key Responsibilities:
- Lead and architect data modernization/transformation projects using AWS services.
- Manage and mentor a team of data engineers and analysts.
- Build and maintain strong client relationships, ensuring successful project delivery.
- Design and implement scalable data architectures and solutions.
- Oversee the migration of large datasets to AWS, ensuring data integrity and security.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Ensure best practices in data management and governance are followed.

Required Skills and Experience:
- 10+ years of experience in data architecture and analytics.
- Hands-on experience with AWS services such as Redshift, S3, Glue, Lambda, RDS, and others.
- Proven experience in delivering 1-2 large data migration/modernization projects using AWS.
- Strong leadership and team management skills.
- Excellent communication and interpersonal skills.
- Deep understanding of data modeling, ETL processes, and data warehousing.
- Experience with data governance and security best practices.
- Ability to work in a fast-paced, dynamic environment.

Preferred Qualifications:
- AWS Certified Solutions Architect - Professional or AWS Certified Big Data - Specialty.
- Experience with other cloud platforms (e.g., Azure, GCP) is a plus.
- Familiarity with machine learning and AI technologies.

Posted 1 week ago

Apply

4.0 - 7.0 years

3 - 7 Lacs

Kolkata

Work from Office

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Responsibilities: Perform root cause analysis on data quality issues. Develop Power BI dashboards. Define and implement data quality requirements, improving reporting accuracy. Resolve discrepancies. Form comprehensive test plans to validate source-system changes. Primary Skills: Avoid candidates with only report-creation experience; look for somebody with good experience analysing data and working on large datasets. Candidates should come from Tier-1 or larger companies with 60,000+ employees, for example: Accenture, Tech Mahindra, Infosys, Tata Consultancy Services (TCS), Wipro, HCL, Cognizant, KPMG, EY, PwC, Deloitte, GCI, McKinsey & Company, NTT Data, CitiusTech, Atos, IBM, Ericsson, Tata Communications, Nokia, DXC Technology, LTI, Fujitsu, Orange Business Services, Huawei Technologies, Dell Technologies, Juniper Networks, Virtusa, Comarch, Amdocs, ZTE Corporation, NEC Corporation, Samsung, Telstra, Infinera. Location: Bangalore, Pune & Mumbai. Experience: 4 to 7 years.

Posted 1 week ago

Apply

5.0 - 9.0 years

8 - 12 Lacs

Kolkata

Work from Office

Educational qualification: Bachelor of Engineering, Bachelor of Technology (Integrated), Bachelor of Business Adm., MBA, Bachelor of Technology, Master of Comp. Applications, Master of Science (Technology), Master of Technology, Master of Engineering. Service Line: Enterprise Package Application Services.

Responsibilities: We are looking for candidates with a minimum of 5 years of hands-on experience in development and implementation using the latest versions of OBIEE, ODI, and various modules of Oracle BI Apps. Candidates should be able to understand user requirements in the BI domain and convert them into design specs and architected solutions. We also require candidates with experience in data modeling for a BI data warehouse, with an understanding of the implications for ETL as well as reporting needs. Candidates should also demonstrate strong analytical and problem-solving/debugging skills. The location of posting is driven by business needs. Experience in, and a desire to work in, a management consulting environment that requires regular travel is expected.

Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability. Good knowledge of software configuration management systems. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management.

Preferred Skills: Technology - Business Intelligence - Reporting - Oracle Business Intelligence Enterprise Edition 12c; Technology - Oracle Fusion; Technology - ODI (Oracle Data Integrator).

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Kolkata

Work from Office

Project Role: Application Developer. Project Role Description: Design, build, and configure applications to meet business process and application requirements. Must-have skills: Ab Initio. Good-to-have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Architect, you will provide functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. You will also assist in facilitating impact-assessment efforts and in producing and reviewing estimates for client work requests.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Lead and oversee the design and implementation of application solutions.
- Collaborate with stakeholders to understand requirements and translate them into technical solutions.
- Provide guidance and mentorship to junior team members.
- Stay updated on industry trends and best practices.
- Conduct regular code reviews and ensure adherence to coding standards.

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes and data integration.
- Experience with data modeling and database design.
- Hands-on experience in performance tuning and optimization.
- Good-to-have skills: Experience with data warehousing technologies.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Kolkata

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: Informatica MDM. Good-to-have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication among team members and stakeholders.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Ensure effective communication among team members and stakeholders.
- Implement best practices for application design and configuration.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica MDM.
- Strong understanding of data integration and master data management.
- Experience in designing and implementing MDM solutions.
- Knowledge of data governance principles.
- Hands-on experience with Informatica MDM tools.
- Experience in data modeling and data quality management.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Informatica MDM.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Kolkata

Work from Office

Use Talend Open Studio to design, implement, and manage data integration solutions. Develop ETL processes to ensure data is accurately extracted, transformed, and loaded into various systems for analysis.

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Kolkata

Work from Office

Design, implement, and maintain Oracle SQL queries and database objects. Optimize database performance, conduct data analysis, and ensure data integrity in Oracle-based systems.

Posted 1 week ago

Apply

5.0 - 8.0 years

16 - 30 Lacs

Kolkata

Hybrid

Data Modeler (Hybrid Data Environments)

Job Summary: We are in search of an experienced Data Modeler who possesses a deep understanding of traditional data stores such as SQL Server and Oracle DB, as well as proficiency in Azure/Databricks cloud environments. The ideal candidate will be adept at comprehending business processes and deriving methods to define analytical data models that support enterprise-level analytics, insights generation, and operational reporting.

Key Responsibilities:
- Collaborate with business analysts and stakeholders to understand business processes and requirements, translating them into data modeling solutions.
- Design and develop logical and physical data models that effectively capture the granularity of data necessary for analytical and reporting purposes (a star-schema sketch follows this listing).
- Migrate and optimize existing data models from traditional on-premises data stores to Azure/Databricks cloud environments, ensuring scalability and performance.
- Establish data modeling standards and best practices to maintain the integrity and consistency of the data architecture.
- Work closely with data engineers and BI developers to ensure that the data models support the needs of analytical and operational reporting.
- Conduct data profiling and analysis to understand data sources, relationships, and quality, informing the data modeling process.
- Continuously evaluate and refine data models to accommodate evolving business needs and to leverage new data modeling techniques and cloud capabilities.
- Document data models, including entity-relationship diagrams, data dictionaries, and metadata, to provide clear guidance for development and maintenance.
- Provide expertise in data modeling and data architecture to support the development of data governance policies and procedures.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- Minimum of 5 years of experience in data modeling, with a strong background in both traditional RDBMS and modern cloud-based data platforms.
- Proficiency in SQL and experience with data modelling tools (e.g., ER/Studio, ERwin, PowerDesigner).
- Familiarity with Azure cloud services, Databricks, and other big data technologies.
- Understanding of data warehousing concepts, including dimensional modeling, star schemas, and snowflake schemas.
- Ability to translate complex business requirements into effective data models that support analytical and reporting functions.
- Strong analytical skills and attention to detail.
- Excellent communication and collaboration abilities, with the capacity to engage with both technical and non-technical stakeholders.
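
As a small illustration of the dimensional-modeling work this role describes, the sketch below creates a one-dimension star schema via a generic ODBC connection; the DSN and table design are invented for illustration, and real models would of course be far richer.

```python
import pyodbc  # pip install pyodbc

# Hypothetical DSN; could point at SQL Server, Oracle, or another RDBMS.
conn = pyodbc.connect("DSN=warehouse;UID=modeler;PWD=***")
cur = conn.cursor()

# A minimal star schema: one dimension table and one fact table.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key INT PRIMARY KEY,      -- surrogate key
        customer_id  VARCHAR(20),          -- natural/business key
        segment      VARCHAR(50),
        city         VARCHAR(100)
    )
""")
cur.execute("""
    CREATE TABLE fact_sales (
        sale_key     INT PRIMARY KEY,
        customer_key INT REFERENCES dim_customer (customer_key),
        sale_date    DATE,
        quantity     INT,
        amount       DECIMAL(12, 2)        -- grain: one row per sale line
    )
""")
conn.commit()
conn.close()
```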

Posted 1 week ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Kolkata

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: SAP BusinessObjects Data Services. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will require you to balance technical oversight with team management, fostering an environment of innovation and collaboration.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BusinessObjects Data Services.
- Good-to-have skills: Experience with data integration and ETL processes.
- Strong understanding of data warehousing concepts and best practices.
- Familiarity with SQL and database management systems.
- Experience in project management methodologies and tools.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.
- This position is based in Kolkata.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

1.0 - 4.0 years

5 - 9 Lacs

Kolkata

Work from Office

Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. We are seeking skilled Data Analysts with a minimum of 1 year of experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation.
- Debug and resolve technical issues.
- Evaluate and review code to ensure quality and compliance.

Required Qualifications:
- 1+ year of data analysis experience.
- Strong knowledge of Python, R, or SQL.
- Proficiency in data visualization tools (e.g., Tableau, Power BI).
- Statistical analysis expertise.

Why Join Us:
- Competitive pay (₹1000/hour).
- Flexible hours.
- Remote opportunity.

Note: Pay will vary by project and typically is up to Rs. . Shape the future of AI with Soul AI!

Posted 1 week ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Kolkata

Work from Office

Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like YOU for a unique job opportunity to work with industry leaders.

What's in it for you:
- Pay above market standards.
- The role is contract-based, with project timelines from 2 to 6 months, or freelancing.
- Be part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote; onsite at a client location (US, UAE, UK, India, etc.); or Deccan AI's office (Hyderabad or Bangalore).

Responsibilities:
- Architect and implement enterprise-level BI solutions to support strategic decision-making, along with data democratization by enabling self-service analytics for non-technical users.
- Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python.
- Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports.
- Mentor junior analysts and establish best practices for data visualization.

Required Skills:
- Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker).
- Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake.
- Experience with data governance, lineage tracking, and big data tools (Spark, Kafka).
- Exposure to machine learning and AI-powered analytics.

Nice to Have:
- Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly).
- Hands-on experience with BI automation and AI-driven analytics.

Who can be a part of the community? We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus. If you have experience in this field, this is your chance to collaborate with industry leaders.

What are the next steps?
- Register on the Soul AI website.
- Our team will review your profile.
- Clear the screening rounds: complete the assessments once you are shortlisted.
- Profile matching: be patient while we align your skills and preferences with available projects.
- Project allocation: you'll be deployed on your preferred project!

Skip the noise. Focus on opportunities built for you!

Posted 1 week ago

Apply

4.0 - 8.0 years

5 - 15 Lacs

Kolkata

Work from Office

DBX - L3 Required Skills

Posted 1 week ago

Apply

5.0 - 7.0 years

19 Lacs

Kolkata, Mumbai, Hyderabad

Work from Office

Reporting to: Global Head of Data Operations.

Role purpose: As a Data Engineer, you will be a driving force towards data engineering excellence. Working with other data engineers, analysts, and the architecture function, you'll be involved in building out a modern data platform using a number of cutting-edge technologies, in a multi-cloud environment. You'll get the opportunity to spread your knowledge and skills across multiple areas, with involvement in a range of different functional areas. As the business grows, we want our staff to grow with us, so there'll be plenty of opportunity to learn and upskill in areas such as data pipelines, data integrations, data preparation, data models, and analytical and reporting marts. Also, whilst work often follows business requirements and design concepts, you'll play a huge part in the continuous development and maturing of design patterns and automation processes for others to follow.

Accountabilities and main responsibilities: In this role, you will deliver solutions and patterns through Agile methodologies as part of a squad. You'll collaborate with customers, partners, and peers, and will help to identify data requirements. We'd also rely on you to:
- Help break down large problems into smaller iterative steps.
- Contribute to defining the prioritisation of your squad's backlog.
- Build out the modern data platform (data pipelines, data integrations, data preparation, data models, analytical and reporting marts) based on business requirements, using agreed design patterns (a minimal orchestration sketch follows this listing).
- Help determine the most appropriate tool, method, and design pattern to satisfy the requirement.
- Proactively suggest improvements where you see issues.
- Learn how to prepare our data in order to surface it for use within APIs.
- Learn how to document, support, manage, and maintain the modern data platform built within your squad.
- Learn how to provide guidance and training to downstream consumers of data on how best to use the data in our platform.
- Learn how to support and build new data APIs.
- Contribute to evangelising and educating within Sanne about the better use and value of data.
- Comply with all Sanne policies.
- Any other duties within the scope of the role that the company requires.

Qualifications and skills. Technical skills:
- Data warehousing and data modelling
- Data lakes (AWS Lake Formation, Azure Data Lake)
- Cloud data warehouses (AWS Redshift, Azure Synapse, Snowflake)
- ETL/ELT/pipeline tools (AWS Glue, Azure Data Factory, FiveTran, Stitch)
- Data message bus / pub-sub systems (AWS SNS & SQS, Azure ASQ, Kafka, RabbitMQ)
- Data programming languages (SQL, Python, Scala, Java)
- Cloud workflow services (AWS Step Functions, Azure Logic Apps, Camunda)
- Interactive query services (AWS Athena, Azure DL Analytics)
- Event and schedule management (AWS Lambda Functions, Azure Functions)
- Traditional Microsoft BI stack (SQL Server, SSIS, SSAS, SSRS)
- Reporting and visualisation tools (Power BI, QuickSight, Mode)
- NoSQL & graph DBs (AWS Neptune, Azure Cosmos, Neo4j) (desirable)
- API management (desirable)

Core skills:
- Excellent communication and interpersonal skills
- Critical thinking and research capabilities
- Strong problem-solving skills
- Ability to plan and manage your own workload
- Work well on your own initiative as well as part of a bigger team
- Working knowledge of Agile software development lifecycles
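
Pipeline orchestration features heavily in the platform stack above; here is a minimal Airflow DAG sketch with placeholder extract/load callables. The DAG name and task bodies are illustrative assumptions only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull data from a source system.
    print("extracting...")


def load():
    # Placeholder: load prepared data into the warehouse.
    print("loading...")


with DAG(
    dag_id="daily_platform_load",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # load runs only after extract succeeds
```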

Posted 1 week ago

Apply

10.0 - 12.0 years

30 - 35 Lacs

Kolkata

Work from Office

Key Responsibilities: As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including relational, dimensional, columnar, and big data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use-case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation and ETL on large data sets.
- Experience with designing customer-centric datasets (i.e., CRM, Call Center, Marketing, Offline, Point of Sale, etc.).
- 5+ years of data modeling experience (relational, dimensional, columnar, big data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience in advanced data warehouse concepts.
- Proven experience with industry ETL tools (i.e., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use-case documentation.
- Experience with reporting technologies (i.e., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI

Posted 1 week ago

Apply

7.0 - 10.0 years

9 - 12 Lacs

Kolkata

Work from Office

Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation (a window-function sketch follows this listing).
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
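
The role above highlights SQL window functions for transformation and validation. As an illustration, this Snowflake query keeps only the latest record per customer using ROW_NUMBER() with QUALIFY; the connection details, table, and columns are hypothetical.

```python
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="ANALYTICS_WH", database="ANALYTICS", schema="PUBLIC",
)

# Deduplicate: keep the most recent row per customer_id.
dedup_sql = """
    SELECT *
    FROM customer_updates
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY customer_id
        ORDER BY updated_at DESC
    ) = 1
"""

cur = conn.cursor()
try:
    for row in cur.execute(dedup_sql):
        print(row)
finally:
    cur.close()
    conn.close()
```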

Posted 1 week ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Kolkata, Hyderabad, Pune

Work from Office

BI Publisher Developer: We are seeking a skilled BI Publisher Developer with expertise in multilingual template development to join our team. The ideal candidate will be responsible for designing, developing, and maintaining Oracle BI Publisher reports and templates that support multiple languages. This role requires a strong understanding of data integration, report development, and localization techniques. Location: Pune, Hyderabad, Kolkata, Chandigarh.

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Kolkata, Hyderabad, Pune

Work from Office

ETL QA Tester. Job Summary: We are looking for an experienced ETL tester to ensure the quality and integrity of our data processing and reporting systems. The ideal candidate will have a strong background in ETL processes and data warehousing, and experience with Snowflake and Tableau. This role involves designing and executing test plans, identifying and resolving data quality issues, and collaborating with development teams to enhance data processing systems.

Key Responsibilities:
- Design, develop, and execute comprehensive test plans and test cases for ETL processes (a minimal validation sketch follows this listing).
- Validate data transformation, extraction, and loading processes to ensure accuracy and integrity.
- Perform data validation and data quality checks using Snowflake and Tableau.
- Identify, document, and track defects and data quality issues.
- Collaborate with developers, business analysts, and stakeholders to understand requirements and provide feedback on data-related issues.
- Create and maintain test data, test scripts, and test environments.
- Generate and analyze reports using Tableau to validate data accuracy and completeness.
- Conduct performance testing and optimization of ETL processes.
- Develop and maintain automated testing scripts and frameworks for ETL testing.
- Ensure compliance with data governance and security standards.

Location: Pune, Hyderabad, Kolkata, Chandigarh.
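
A common ETL check a tester in this role would automate is reconciling row counts between a source staging table and its warehouse target. Below is a minimal sketch against Snowflake, runnable under a test harness such as pytest; the table names and credentials are placeholders, and a real suite would add checksums and column-level rules.

```python
import snowflake.connector  # pip install snowflake-connector-python


def row_count(cursor, table: str) -> int:
    """Return the row count of a table (identifier assumed trusted)."""
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]


def test_orders_row_counts_match() -> None:
    conn = snowflake.connector.connect(
        account="my_account", user="qa_user", password="***",  # placeholders
        warehouse="QA_WH", database="ANALYTICS", schema="PUBLIC",
    )
    cur = conn.cursor()
    try:
        source = row_count(cur, "staging.orders_raw")   # hypothetical tables
        target = row_count(cur, "marts.fact_orders")
        assert source == target, f"row count mismatch: {source} != {target}"
    finally:
        cur.close()
        conn.close()
```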

Posted 1 week ago

Apply

Exploring Data Warehousing Jobs in Kolkata

Are you a job seeker looking to dive into the world of data warehousing in Kolkata? Look no further! Kolkata, the cultural capital of India, is also a hub for data warehousing jobs with a growing demand for skilled professionals in this field. With numerous opportunities available, job seekers can find a rewarding career in data warehousing in this vibrant city.

Overview of Data Warehousing Job Market in Kolkata

  • Major Hiring Companies: Some of the major companies in Kolkata hiring for data warehousing roles include TCS, Cognizant, IBM, Capgemini, and Infosys.
  • Expected Salary Ranges: Data warehousing professionals in Kolkata can expect to earn between INR 4-10 lakhs per annum, depending on their experience and expertise.
  • Job Prospects: The job prospects for data warehousing professionals in Kolkata are excellent, with a steady growth in demand for skilled individuals in this field.

Key Industries in Kolkata for Data Warehousing Jobs

  • IT & Technology: Kolkata's IT sector is booming, creating a high demand for data warehousing professionals.
  • Financial Services: Banks and financial institutions in Kolkata are increasingly relying on data warehousing for analytics and insights.
  • Healthcare: The healthcare industry in Kolkata is also embracing data warehousing for better patient care and operational efficiency.

Cost of Living in Kolkata

  • Kolkata offers a relatively affordable cost of living compared to other major cities in India, making it an attractive destination for job seekers.

Remote Work Opportunities and Transportation Options

  • Data warehousing professionals in Kolkata may also explore remote work opportunities, allowing for flexibility in their work arrangements.
  • For those commuting to work, Kolkata offers a variety of transportation options including buses, trams, and the metro.

Emerging Trends in Data Warehousing Technology

With advancements in technology, data warehousing is evolving rapidly. Professionals in Kolkata can stay ahead of the curve by upskilling in areas such as cloud-based data warehousing and big data analytics.

Future Job Market Prospects

The future looks bright for data warehousing jobs in Kolkata, with an increasing need for data-driven insights across industries. Job seekers can expect a growing number of opportunities in this field in the coming years.

If you are passionate about data and analytics, consider exploring data warehousing jobs in Kolkata. Take the first step towards a rewarding career by applying for roles in this dynamic field or upskilling to enhance your expertise. Don't miss out on the exciting opportunities awaiting you in the data warehousing industry in Kolkata!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
