
5951 Data Warehousing Jobs - Page 40

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 - 9.0 years

5 - 9 Lacs

Noida

Work from Office

5-9 years of experience in data engineering and software development (ELT/ETL, data extraction and manipulation) in a Data Lake/Data Warehouse environment. Expert-level hands-on experience with the following:
- Python, SQL, PySpark
- DBT and Apache Airflow
- Postgres/other RDBMS
- DevOps, Jenkins, CI/CD
- Data governance and data quality frameworks
- Data lakes and data warehouses
- AWS services including S3, SNS, SQS, Lambda, EMR, Glue, Athena, EC2, VPC, etc.
- Source code control: GitHub, VSTS, etc.

Mandatory Competencies: Data on Cloud - Azure Data Lake (ADL); Python; Big Data - PySpark; DevOps - CI/CD; DevOps - Jenkins; Behavioral - Communication; Data on Cloud - AWS S3; Database - PostgreSQL; DevOps - GitHub
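
The stack this listing names (Python, PySpark, Apache Airflow, S3) composes naturally. As a minimal, hedged sketch only, with hypothetical bucket names, a made-up order_id key, and a toy transformation, a daily Airflow DAG wrapping a small PySpark step might look like this:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_orders():
    """Toy PySpark step: read raw JSON, dedupe, land Parquet in the lake."""
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()
    raw = spark.read.json("s3a://raw-bucket/orders/")            # hypothetical bucket
    cleaned = (raw.dropDuplicates(["order_id"])                  # hypothetical key column
                  .withColumn("ingested_at", F.current_timestamp()))
    cleaned.write.mode("overwrite").parquet("s3a://lake-bucket/orders/")
    spark.stop()


with DAG(
    dag_id="orders_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",       # Airflow 2.4+ "schedule" argument
    catchup=False,
) as dag:
    PythonOperator(task_id="transform_orders", python_callable=transform_orders)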

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Vacancy Name: Power BI Developer | Software Engineer
Location: Koramangala, Bangalore, India

Description: Maintain static set-ups and rate maintenance to facilitate reconciliation of invoices, meet Service Level Agreement targets, and work on SmartStream's Transactions, Fees, Cost and Invoice Management solution, ensuring all service level agreements are met.

Key Responsibilities:
- Design and develop interactive dashboards and reports using Power BI.
- Connect to various data sources (SQL Server, Excel, SharePoint, etc.) and transform data using Power Query and DAX.
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
- Optimize data models for performance and scalability.
- Implement row-level security and data governance best practices.
- Maintain and troubleshoot existing Power BI reports and dashboards.
- Integrate Power BI reports into other applications using embedded analytics.
- Stay updated with the latest Power BI features and best practices.

Key Skills:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 3 to 5 years of hands-on experience with Power BI.
- Strong proficiency in DAX, Power Query (M), and data modelling.
- Experience with SQL and relational databases.
- Familiarity with data warehousing concepts and ETL processes.
- Understanding of business processes and KPIs.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills.

Qualifications:
- Microsoft Certified: Data Analyst Associate (Power BI).
- Experience with Azure Data Services (e.g., Azure Data Factory, Synapse).
- Knowledge of Python or R for data analysis.
- Experience with Agile/Scrum methodologies.

Employment Type: Permanent

Equality Statement: SmartStream is an equal opportunities employer. We are committed to promoting equality of opportunity and following practices which are free from unfair and unlawful discrimination.
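
The production tooling named here is Power Query (M) and DAX, but the shaping logic behind an invoice-reconciliation dashboard can be illustrated in Python. A hedged pandas sketch, in which the connection string, table, and rate-card file are all hypothetical:

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical SQL Server source; password elided.
engine = create_engine(
    "mssql+pyodbc://report_user:...@fin-db/fees?driver=ODBC+Driver+17+for+SQL+Server"
)
fees = pd.read_sql("SELECT invoice_id, broker_id, fee_amount FROM dbo.fees", engine)
rates = pd.read_excel("rate_card.xlsx")   # hypothetical columns: broker_id, agreed_rate

# Reconcile invoiced fees against the agreed rate card and surface exceptions.
merged = fees.merge(rates, on="broker_id", how="left")
merged["variance"] = merged["fee_amount"] - merged["agreed_rate"]
exceptions = merged[merged["variance"].abs() > 0.01]
print(exceptions.head())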

Posted 2 weeks ago

Apply

4.0 - 8.0 years

3 - 7 Lacs

Jharkhand

Remote

Job Summary: We are looking for a skilled Data Engineer with hands-on experience in SAP Data Services (SAP DS) and Snowflake to join our growing data engineering team. In this role, you will be responsible for designing, building, and maintaining data integration pipelines and ETL processes that move and transform data from SAP and other source systems into our Snowflake data warehouse.

Key Responsibilities:
- Design, develop, and manage ETL workflows and jobs using SAP Data Services to extract, transform, and load data from various source systems (especially SAP ERP/SAP BW) into Snowflake.
- Implement data ingestion, transformation, and load strategies into Snowflake, ensuring high performance and scalability.
- Create and maintain Snowflake objects (e.g., tables, views, stages, file formats, procedures).
- Monitor and optimize ETL job performance and troubleshoot data pipeline issues.
- Ensure data quality, consistency, and reliability throughout the pipeline.
- Collaborate with business and analytics teams to understand data needs and deliver solutions.
- Maintain documentation related to data mappings, workflows, job designs, and data dictionaries.
- Support data governance, compliance, and security initiatives.

Required Skills and Qualifications:
- 3+ years of experience working with SAP Data Services (BODS) for ETL development.
- 2+ years of hands-on experience with Snowflake SQL development, performance tuning, and architecture.
- Strong experience with data modeling, especially in a cloud data warehouse environment.
- Solid understanding of ETL best practices, error handling, and performance optimization.
- Experience in integrating data from SAP ECC, SAP BW, or other enterprise systems.
- Strong SQL skills and experience working with structured and semi-structured data (e.g., JSON, XML).
- Knowledge of data warehousing principles and methodologies.
- Strong analytical and problem-solving skills.
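
As a hedged illustration of the Snowflake side of such a pipeline, assuming SAP Data Services has already landed extracts in a stage, and with all account, stage, and table names hypothetical, a load step using the snowflake-connector-python package might look like:

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical credentials throughout
    user="etl_user",
    password="...",
    warehouse="LOAD_WH",
    database="EDW",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Stage -> table load with an explicit file format, then a quick row check.
    cur.execute("""
        COPY INTO staging.sap_billing
        FROM @etl_stage/sap/billing/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    cur.execute("SELECT COUNT(*) FROM staging.sap_billing")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()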

Posted 2 weeks ago

Apply

6.0 - 8.0 years

8 - 12 Lacs

Noida

Work from Office

Core Technical Skills:
- Oracle PL/SQL development on version 12+
- Experience working on Unix/Linux
- Willingness to learn new technologies

Requirements:
- 6 to 8 years of experience, with hands-on experience in Oracle PL/SQL
- Willingness to learn and understand the business domain
- Ability to meet client needs without sacrificing deadlines and quality
- Ability to work effectively within a global team
- Excellent communication and teamwork skills
- Great attention to detail and an analytical mind
- Degree in Computer Science, Statistics, or a relevant field

Mandatory Competencies: Behavioral - Communication and collaboration; Database - Oracle - PL/SQL Packages
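
A minimal sketch of day-to-day PL/SQL work driven from Python with the python-oracledb driver; the connection details, package, and table are hypothetical:

import oracledb

conn = oracledb.connect(user="app", password="...", dsn="dbhost/ORCLPDB1")
with conn.cursor() as cur:
    # Call a (hypothetical) packaged procedure...
    cur.callproc("billing_pkg.recalculate", ["2024-06"])
    # ...then read back a result set using a bind variable.
    cur.execute(
        "SELECT invoice_id, status FROM invoices WHERE period = :p",
        p="2024-06",
    )
    for invoice_id, status in cur:
        print(invoice_id, status)
conn.commit()
conn.close()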

Posted 2 weeks ago

Apply

8.0 - 10.0 years

10 - 15 Lacs

Noida

Work from Office

Key Responsibilities:
- Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions.
- SQL-Driven Validation: Utilize advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources.
- ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems.
- Data Quality Assurance: Implement and monitor data quality checks; identify data discrepancies, anomalies, and inconsistencies; and work with development and business teams to resolve issues.
- Performance Testing (Data Focus): Contribute to performance testing efforts for data pipelines and database operations, ensuring optimal query and data load performance.
- Test Data Management: Create and manage robust test data sets for various testing phases, including positive, negative, and edge case scenarios.
- Defect Management: Identify, document, track, and re-test defects in data, collaborating closely with development and data engineering teams for timely resolution.
- Documentation & Reporting: Maintain clear and concise documentation of test plans, test cases, test results, and data quality reports. Provide regular status updates to stakeholders.
- Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows and business requirements, and ensure data quality standards are met.
- Process Improvement: Proactively identify opportunities for process improvements in data testing methodologies and tools.
- Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams.

Required Skills & Experience:
- Experience: 8-10 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles.
- SQL Expertise:
  - Advanced proficiency in SQL: ability to write complex queries, subqueries, analytical (window) functions, CTEs, and stored procedures for data validation, reconciliation, and analysis.
  - Experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery).
  - Strong understanding of database concepts: normalization, indexing, primary/foreign keys, and data types.
- Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy).
- Domain Expertise:
  - Strong understanding and proven experience in the Risk and Finance IT domain: familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting).
  - Knowledge of financial products, regulations, and risk management concepts.
- Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions.
- Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams.

Mandatory Competencies: ETL - Tester; QA/QE - QA Automation - ETL Testing; Database - PostgreSQL; Behavioral - Communication; Database - SQL Server - SQL Packages
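
As a hedged sketch of the SQL-driven source-to-target reconciliation this role centers on, comparing row counts and a column checksum between a source system and the warehouse (engine URLs and table names are hypothetical):

from sqlalchemy import create_engine, text

src = create_engine("postgresql://qa@src-host/trades")
tgt = create_engine("postgresql://qa@dwh-host/edw")

# Same aggregate run on both sides; any drift flags a pipeline defect.
CHECK = "SELECT COUNT(*) AS n, SUM(CAST(notional AS NUMERIC)) AS total FROM {t}"

with src.connect() as s, tgt.connect() as t:
    n_src, total_src = s.execute(text(CHECK.format(t="public.trades"))).one()
    n_tgt, total_tgt = t.execute(text(CHECK.format(t="mart.fact_trades"))).one()

assert n_src == n_tgt, f"row count mismatch: {n_src} vs {n_tgt}"
assert total_src == total_tgt, f"checksum mismatch: {total_src} vs {total_tgt}"
print("reconciliation passed:", n_src, "rows")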

Posted 2 weeks ago

Apply

5.0 - 9.0 years

11 - 15 Lacs

Noida

Work from Office

Extensive knowledge of SQL, PL/SQL, Oracle, Oracle Forms, Oracle Reports, MS SQL, and PowerBuilder.
- Create and optimize packages, procedures, functions, triggers, views, and cursors to develop applications; experience creating and modifying these objects.
- Create new custom reports and forms and modify existing reports according to requirements.
- Involved in all phases of the SDLC (Software Development Life Cycle), from analysis, design, development, testing, implementation, and maintenance through reporting to the client.
- Experience in evaluation, implementation, and support with an Oracle/PL-SQL skill set.
- Involved in requirement gathering and coordination with users.
- Hands-on experience with Oracle 19c and 21c.
- Strong coding knowledge in PL/SQL: bulk collect, recursion, loops (FOR, WHILE), nested blocks, exception handling.
- Strong coding knowledge in SQL: analytical functions, views and materialized views, SYS tables and privileges.
- Good understanding of performance tuning concepts such as access paths, join methods, and partitioning.
- Awareness of the various Oracle data dictionary views and their significance.
- Complex queries: nested queries, inline views, correlated subqueries; awareness of hierarchical queries.

Mandatory Competencies: Behavioral - Communication and collaboration; Database - Oracle - Oracle 12c; Database - Database Programming - PL/SQL; Database - Database Programming - SQL
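
A small anonymous PL/SQL block in the style the listing names (bulk collect, FORALL, exception handling), submitted from Python for illustration; the table and connection details are hypothetical:

import oracledb

PLSQL = """
DECLARE
    TYPE t_ids IS TABLE OF orders.order_id%TYPE;
    v_ids t_ids;
BEGIN
    -- Collect the keys once, then update them in a single bulk bind.
    SELECT order_id BULK COLLECT INTO v_ids
    FROM orders WHERE status = 'STALE';
    FORALL i IN 1 .. v_ids.COUNT
        UPDATE orders SET status = 'CLOSED' WHERE order_id = v_ids(i);
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
END;
"""

conn = oracledb.connect(user="app", password="...", dsn="dbhost/ORCLPDB1")
with conn.cursor() as cur:
    cur.execute(PLSQL)
conn.commit()
conn.close()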

Posted 2 weeks ago

Apply

2.0 - 7.0 years

7 - 11 Lacs

Noida

Work from Office

- Critical Thinking
- Testing Concepts
- ETL Testing
- Python Experience

Nice to Have:
- API understanding and testing (manual/automation)
- UI automation: able to identify UI elements programmatically (for Selenium)

Detailed Description:
- Critical Thinking - 5/5: High in logical reasoning and proactiveness; should come up with diverse test cases against requirements.
- Testing Concepts - 5/5: Practices various test design techniques; clarity on priority vs. severity; testing life cycle and defect management; understands regression vs. functional testing.
- SQL/ETL/Batch - 4/5: Able to write SQL statements with aggregate functions and joins; understands data transformation; familiar with data loads and related validations.
- Automation - 3/5: Should be able to solve a given problem programmatically; familiar with coding standards, version control, and pipelines; able to identify UI elements programmatically.
- API - 2/5: Understands how APIs work, various authorization mechanisms, and validation of responses.
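
A minimal Selenium sketch of "identifying UI elements programmatically", as the nice-to-have asks; the URL and selectors are hypothetical:

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")
    # Locate elements by stable attributes rather than brittle absolute XPaths.
    user_box = driver.find_element(By.ID, "username")
    pass_box = driver.find_element(By.CSS_SELECTOR, "input[name='password']")
    submit = driver.find_element(By.XPATH, "//button[@type='submit']")
    user_box.send_keys("qa_user")
    pass_box.send_keys("...")
    submit.click()
    assert "Dashboard" in driver.title
finally:
    driver.quit()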

Posted 2 weeks ago

Apply

2.0 - 7.0 years

7 - 12 Lacs

Chennai

Work from Office

Responsibilities:
- Collaborate with business stakeholders and other technical team members to acquire data sources that are most relevant to business needs and goals.
- Demonstrate deep technical and domain knowledge of relational and non-relational databases, data warehouses, and data lakes, among other structured and unstructured storage options.
- Determine solutions that are best suited to develop a pipeline for a particular data source.
- Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development.
- Write custom scripts to extract data from unstructured/semi-structured sources.
- Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders.
- Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability).
- Stay current with and adopt new tools and applications to ensure high-quality and efficient solutions.
- Build a cross-platform data strategy to aggregate multiple sources and process development datasets.

Nice-to-have skills:
- 2+ years of experience with Big Data Management (BDM) for relational and non-relational data (formats such as JSON, XML, Avro, Parquet, copybook, etc.).
- Knowledge of DevOps processes (CI/CD) and infrastructure as code.
- Knowledge of Master Data Management (MDM) and Data Quality tools.
- Experience developing REST APIs.
- Knowledge of key machine learning concepts and MLOps.

Qualifications:
- Bachelor's degree in Computer Engineering.
- 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional data warehousing environment.
- 3+ years of experience setting up and operating data pipelines using Python or SQL.
- 3+ years of advanced SQL programming: PL/SQL, T-SQL.
- 3+ years of strong, extensive hands-on experience in Azure, preferably on data-heavy/analytics applications leveraging relational and NoSQL databases, data warehouses, and big data.
- 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions.
- 2+ years of experience defining and enabling data quality standards for auditing and monitoring.
- Strong analytical abilities and intellectual curiosity.
- In-depth knowledge of relational database design, data warehousing, and dimensional data modeling concepts.
- Deep understanding of REST and good API design.
- Strong collaboration and teamwork skills; excellent written and verbal communication skills.
- Self-starter, motivated, and able to work in a fast-paced development environment; Agile experience highly desirable.
- Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.
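
As a hedged example of the "custom scripts to extract data from semi-structured sources" duty: page through a hypothetical REST API, flatten one level of nested JSON, and write rows ready for a downstream ADF/Synapse load:

import csv
import requests

rows, url = [], "https://api.example.com/v1/shipments?page=1"  # hypothetical API
while url:
    payload = requests.get(url, timeout=30).json()
    for item in payload["items"]:
        rows.append({
            "shipment_id": item["id"],
            "status": item["status"],
            # Flatten one level of nesting; tolerate missing keys.
            "carrier": item.get("carrier", {}).get("name"),
        })
    url = payload.get("next_page")  # None ends the loop

with open("shipments.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["shipment_id", "status", "carrier"])
    writer.writeheader()
    writer.writerows(rows)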

Posted 2 weeks ago

Apply

8.0 - 10.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

The candidate should have 8 to 10 years of total experience in the Storage & Backup technology domain and be able to provide consultancy and recommendations on storage in the following areas:
- Recommend definition and assignment of tier profiles based on their performance, availability, recoverability, and serviceability characteristics.
- Recommend application data placement on storage tiers per profiles.
- Recommend tiering and archival approaches based on aging, I/O, access, and usage.
- Recommend a thin provisioning approach.
- Recommend best practices for backup and restore.
- Recommend file system capacity standards, replication systems, and archiving.
- Recommend storage compaction and de-duplication capabilities to reduce the storage footprint.
- Recommend file system folder management.
- Conduct periodic tests to validate the integrity of data replication solutions, such as a failover test to the replicated system, and validate functionality.
- Update the asset inventory database in the CMDB (the provisioned asset management tool) in case of hardware part replacement, following the approved change management process.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

40 - 45 Lacs

Hyderabad

Remote

Job Title: Data Architect
Location: Hyderabad/Bangalore/Remote
Experience Level: 10-15 years preferred
Industry: IT/Software Services/SaaS
Company Profile/Website: Pragma Edge | Powering Your Connected Enterprise
Bangalore Office: 1st Floor, IndiQube Platina, 15 Commissariat Road, Ashok Nagar, Bengaluru, Karnataka 560025
Hyderabad Office: Pragma Towers, Plot No. 07, Image Gardens Road, Silicon Valley, Madhapur, Hyderabad, Telangana 500081
Employment Type: Full-time

Key Responsibilities:
- Design and implement scalable, secure, and high-performance data architecture solutions tailored for logistics operations.
- Define data standards, models, and governance policies across heterogeneous data sources (e.g., EDI, ERP, TMS, WMS).
- Architect and optimize data pipelines to enable real-time analytics and reporting for warehouse management, freight, and inventory systems.
- Collaborate with business stakeholders to translate operational logistics needs into actionable data strategies.
- Ensure system reliability, data security, and compliance with relevant regulations.
- Evaluate and recommend tools and platforms, including cloud-based data services (Azure, AWS, GCP).
- Lead data integration efforts, including legacy systems migration and EDI transformations.

Required Skills & Qualifications:
- Proven experience as a Data Architect in logistics, transportation, or supply chain domains.
- Strong understanding of EDI formats, warehouse operations, fleet data, and logistics KPIs.
- Hands-on experience with data modeling, ETL, ELT, and data warehousing.
- Expertise in cloud platforms (Azure preferred), relational and NoSQL databases, and BI tools.
- Knowledge of data governance, security, and data lifecycle management.
- Familiarity with tools like Informatica, Talend, SQL Server, Snowflake, or BigQuery is a plus.
- Excellent analytical thinking and stakeholder communication skills.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

2 - 6 Lacs

Chennai

Work from Office

Financial Tracking & Analysis:
- Create and maintain a comprehensive revenue tracker for the year, categorizing top revenue contributors across Developer, Investor/Fund, and Client/Corporate segments.
- Closely monitor collection of invoices to maximize revenue and enable write-backs, directly impacting PGOI.
- Generate and distribute regular debtor reports to teams for collection monitoring.
- Identify critical debtors and develop strategies to resolve long-outstanding debts.
- Track legal cases related to collections, ensuring alignment among all stakeholders.

Deal Progress Monitoring:
- Provide real-time visibility on deal progression to the City MD through daily follow-ups with agents across business lines.
- Identify and troubleshoot obstacles in deal execution.
- Verify and support the invoicing process to ensure timely completion.
- Create deal status reports that offer forecasting insights for the City MD.

Cross-Team Collaboration:
- Facilitate "One JLL" opportunities by identifying and supporting cross-business-line collaboration.
- Monitor and support smooth operational integration between Qdesq and JLL.
- Maintain regular communication with agents to understand operational challenges and identify solutions.
- Coordinate with data controllers across transaction business lines regarding invoicing and data management.

Industry Knowledge Development:
- Actively expand industry expertise through understanding complex deal structures.
- Participate in knowledge sessions and voluntarily research industry trends.
- Review relevant articles and reports to stay current with market developments.
- Observe and support proposal submissions and report preparation.

Operational Support:
- Track execution of strategic initiatives and provide data for course corrections as needed.
- Support the City MD with additional Key Result Areas related to operational efficiency.
- Ensure consistent follow-ups on work in progress, invoicing, and collections.
- Execute special projects as assigned by the City MD.

Qualifications:
- Bachelor's degree in Business Administration, Finance, Real Estate, or a related field.
- 3+ years of experience in business analysis, preferably in real estate or professional services.
- Strong analytical skills with proficiency in Excel and data visualization tools.
- Excellent communication and interpersonal abilities.
- Detail-oriented with exceptional organizational skills.
- Ability to work independently while supporting multiple stakeholders.
- Understanding of commercial real estate business operations preferred.

This position reports directly to the City Managing Director.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

9 - 14 Lacs

Gurugram

Remote

Healthcare experience is mandatory.

Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management.
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment).
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets.
- Establish data modeling standards and best practices across the organization.

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica.
- Architect scalable data solutions that handle large volumes of healthcare transactional data.
- Collaborate with data engineers to optimize data pipelines and ensure data quality.

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI).
- Design data models that support analytical, reporting, and AI/ML needs.
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations.
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions.

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements.
- Establish data quality monitoring and validation processes for critical health plan metrics.
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources.

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data.
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches.
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing.
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks).
- Proficiency with data modeling tools (Hackolade, ERwin, or similar).

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data.
- Experience with healthcare data standards and medical coding systems.
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment).
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI).

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments.
- Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements.
- Excellent communication skills, with the ability to explain technical concepts to business stakeholders.
- Experience mentoring team members and establishing technical standards.

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations.
- Cloud platform certifications (AWS, Azure, or GCP).
- Experience with real-time data streaming and modern data lake architectures.
- Knowledge of machine learning applications in healthcare analytics.
- Previous experience in a lead or architect role within a healthcare organization.

Posted 2 weeks ago

Apply

10.0 - 20.0 years

20 - 35 Lacs

Chennai, Bengaluru

Hybrid

Experience:
- 6+ years of experience in project management, with a focus on data engineering, analytics, or machine learning projects.
- 2+ years of hands-on experience with DataWorks and familiarity with its core components (MaxCompute, Operation and Maintenance Center, Data/DataService Studio, etc.).
- Strong understanding of Apache Spark, cloud environments (AWS, Azure, or GCP), and data pipeline architectures.

Technical Skills:
- Proficiency in DataWorks, Spark, and Python or Scala.
- Experience in managing cloud-based data solutions (Azure/AWS/GCP).
- Familiarity with ETL processes, data warehousing, and big data tools.

Project Management Skills:
- Proven experience managing large-scale projects using Agile or Scrum methodologies.
- Strong budgeting, resource management, and timeline management skills.
- Excellent communication skills and ability to engage with both technical teams and business stakeholders.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

5 - 9 Lacs

Gurugram

Work from Office

Role Description: As a Senior Data Reporting Services Specialist at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.

Roles & Responsibilities:
- Design and develop reports and dashboards to help businesses make data-driven decisions.
- Develop data models and perform data analysis to identify trends and insights.
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs.
- Proficiency in data visualization tools like Tableau, Power BI, and QlikView.

Skills Requirements:
- Strong knowledge of SQL and data visualization tools such as Tableau, Power BI, or QlikView.
- Experience in designing and developing data reports and dashboards.
- Familiarity with data integration and ETL tools such as Talend or Informatica.
- Understanding of data governance and data quality concepts.
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders clearly and concisely.
- Understanding of, and alignment with, the company's long-term vision.
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team.

Qualifications:
- 4-6 years of work experience in a relevant field.
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Chennai

Work from Office

Technical Skills:
- Experience building data transformation pipelines using DBT and SSIS.
- Moderate programming experience with Python.
- Moderate experience with AWS Glue.
- Strong experience with SQL, with the ability to write efficient code and manage it through Git repositories.

Nice-to-have skills:
- Experience working with SSIS.
- Experience working in the wealth management industry.
- Experience in agile development methodologies.
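
A hedged sketch of a minimal AWS Glue PySpark job of the kind this listing names; the catalog database, table, and bucket are hypothetical, and the script runs inside a Glue job rather than locally:

import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue = GlueContext(SparkContext.getOrCreate())
job = Job(glue)
job.init(args["JOB_NAME"], args)

# Read a cataloged source table, trim to the columns the mart needs,
# and write partitioned Parquet back to S3.
dyf = glue.create_dynamic_frame.from_catalog(
    database="wealth_raw", table_name="positions")
trimmed = dyf.select_fields(["account_id", "symbol", "quantity", "as_of_date"])
glue.write_dynamic_frame.from_options(
    frame=trimmed,
    connection_type="s3",
    connection_options={"path": "s3://mart-bucket/positions/",
                        "partitionKeys": ["as_of_date"]},
    format="parquet",
)
job.commit()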

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Role Description: As a Senior Database Administrator at Incedo, you will be responsible for managing and optimizing database management systems (DBMS) such as Oracle or SQL Server. You will work with database architects and developers to ensure that databases are designed and configured correctly, and you will be responsible for the performance, scalability, and security of the DBMS, troubleshooting any issues that arise. You will also work with the security team to ensure that the DBMS is secure and complies with relevant regulations.

Roles & Responsibilities:
- Lead or support the design, development, and deployment of Oracle HFM solutions.
- Work closely with Finance and IT stakeholders to gather and analyze business requirements.
- Configure metadata, rules, data forms, and security in HFM.
- Build and maintain integrations between HFM and source systems (e.g., ERPs, data warehouses).
- Design and generate complex financial reports using Hyperion Financial Reporting Studio and Smart View.
- Troubleshoot and resolve HFM application issues, including performance and consolidation problems.
- Support monthly, quarterly, and annual close cycles and reporting processes.

Skills Requirements:
- Experience in legacy Oracle Hyperion Financial Management (HFM).
- Proficiency in SQL programming for relational databases such as Oracle.
- Understanding of relational database concepts such as ACID properties, transactions, and normalization.
- Familiarity with database design and optimization techniques such as indexing, partitioning, and query optimization.
- Experience with database administration tasks such as backup and recovery, performance tuning, and security management.
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders clearly and concisely.
- Understanding of, and alignment with, the company's long-term vision.
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team.

Qualifications:
- 4-6 years of work experience in a relevant field.
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

25 - 40 Lacs

Hyderabad

Work from Office

Key Responsibilities:
- Design conformed star and snowflake schemas; implement SCD2 dimensions and fact tables.
- Lead Spark (PySpark/Scala) or AWS Glue ELT pipelines from RDS Zero-ETL/S3 into Redshift.
- Tune RA3 clusters (sort/dist keys, WLM queues, Spectrum partitions) for sub-second BI queries.
- Establish data quality, lineage, and cost governance dashboards using CloudWatch and Terraform/CDK.
- Collaborate with Product & Analytics to translate HR KPIs into self-service data marts.
- Mentor junior engineers; drive documentation and coding standards.

Must-Have Skills:
- Amazon Redshift (sort and dist keys, RA3, Spectrum)
- Spark on EMR/Glue (PySpark or Scala)
- Dimensional modelling (Kimball), star schema, SCD2
- Advanced SQL plus Python/Scala scripting
- AWS IAM, KMS, CloudWatch, Terraform/CDK, CI/CD (GitHub Actions or CodePipeline)

Nice-to-Have:
- dbt, Airflow, Kinesis/Kafka, Lake Formation row-level ACLs
- GDPR/SOC 2 compliance exposure
- AWS Data Analytics or Solutions Architect certification

Education: B.E./B.Tech in Computer Science, IT, or a related field (Master's preferred but not mandatory).

Compensation & Benefits:
- Competitive CTC of 25-40 LPA
- Health insurance for self and dependents

Why Join Us?
- Own a greenfield HR analytics platform with executive sponsorship.
- Modern AWS stack (Redshift RA3, Lake Formation, EMR on EKS).
- Culture of autonomy, fast decision-making, and continuous learning.

Application Process:
- 30-minute technical screen
- 4-hour take-home Spark/SQL challenge
- 90-minute architecture deep dive
- Panel interview (leadership and stakeholder communication)
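
As a hedged illustration of the Redshift-side work described, a fact table with explicit DISTKEY/SORTKEY choices and a COPY load from S3; all names, credentials, and the IAM role are hypothetical:

import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS hr.fact_attrition (
    employee_key BIGINT,
    dept_key     INTEGER,
    event_date   DATE,
    headcount    INTEGER
)
DISTSTYLE KEY
DISTKEY (employee_key)      -- co-locate joins to the employee dimension
SORTKEY (event_date);       -- range-restricted scans for date filters
"""

COPY = """
COPY hr.fact_attrition
FROM 's3://hr-lake/attrition/2024/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
FORMAT AS PARQUET;
"""

conn = psycopg2.connect(host="cluster.example.redshift.amazonaws.com",
                        port=5439, dbname="analytics", user="etl", password="...")
with conn, conn.cursor() as cur:   # context manager commits the transaction
    cur.execute(DDL)
    cur.execute(COPY)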

Posted 2 weeks ago

Apply

10.0 - 12.0 years

25 - 35 Lacs

Chennai, Bengaluru

Work from Office

Role: AWS/Snowflake Data Architect

What awaits you / Job Profile:
- Design and develop scalable data pipelines using AWS services.
- Integrate diverse data sources and ensure data consistency and reliability.
- Collaborate with data scientists and other stakeholders to understand data requirements.
- Implement data security measures and maintain data integrity.
- Monitor and troubleshoot data pipelines to ensure optimal performance.
- Optimize and maintain data warehouse and data lake architectures.
- Create and maintain comprehensive documentation for data engineering processes.

What you should bring along:
- Proven experience with Snowflake and SQL.
- Expert-level proficiency in building and managing data pipelines with Python.
- Strong experience in AWS cloud services, including Lambda, S3, Glue, and other data-focused services.
- Exposure to Terraform for provisioning and managing infrastructure-as-code on AWS.
- Proficiency in SQL for querying and modeling large-scale datasets.
- Hands-on experience with Git for version control and managing collaborative workflows.
- Familiarity with ETL/ELT processes and tools for data transformation.
- Strong understanding of data architecture, data modeling, and data lifecycle management.
- Excellent problem-solving and debugging skills.
- Strong communication and collaboration skills to work effectively in a global, distributed team environment.

Must-have technical skills:
- Good understanding of architecting data and solutions.
- Data quality, governance, data security, and data modelling concepts.
- Data modelling, mapping, and compliance to BaDM.
- Solutioning on cloud (AWS) with cloud tools, with a good understanding of Snowflake.
- Defining CI/CD configuration for GitHub and AWS Terraform deployment configuration.
- Ensuring architecture evolution with the latest technology.
- Guiding and mentoring the team, reviewing code, and ensuring development to standards.

Good-to-have technical skills:
- Strong understanding of ETL concepts, design patterns, and industry best practices.
- Experience with ETL testing.
- Snowflake certification.
- AWS certification.
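
A minimal sketch of an event-driven ingestion step on the AWS services this role names: a Lambda handler, triggered by S3 object creation, that hands new files to a downstream SQS queue. The queue URL and event wiring are hypothetical:

import json
import urllib.parse

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/ingest-queue"  # hypothetical

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        # Hand the new object off to the pipeline's ingest queue.
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"status": "queued", "count": len(event["Records"])}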

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Business Analyst (Data Steward) at infoAnalytica Consulting, you will play a crucial role in managing data as a corporate asset and contributing to the data services business unit that serves global customers. Your responsibilities will include identifying and leveraging new, existing, and legacy data assets to increase efficiency and re-use; overseeing data strategy, data warehousing, and database management; and ensuring data quality and value creation for customers.

You will be expected to act as a thought leader in defining data acquisition strategies and roadmaps, collaborate with tech teams to organize data access and knowledge repositories, and work closely with internal and external stakeholders to address data needs. Additionally, you will be responsible for building effective MIS systems for data optimization and mentoring key personnel within the data department to drive growth and efficiency.

To excel in this role, you must possess exceptional communication skills to understand and convey customer requirements effectively, have at least 3 years of experience in data-heavy roles or demand generation, and demonstrate strong technology insight to identify and leverage the right tools for data projects. An understanding of self-service data preparation mechanisms and strong interpersonal skills are also essential in this collaborative and strategic position.

Overall, as a Business Analyst (Data Steward) at infoAnalytica, you will contribute to a dynamic and inclusive work environment that values ownership, innovation, transparency, and a results-driven mindset. Join us in our mission to deliver exceptional data services and make a meaningful impact in the world of marketing research and demand generation.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Engineering Lead at Kanerika Inc., your primary responsibility will be to design, develop, and implement interactive dashboards and reports using data engineering tools. You will collaborate closely with stakeholders to gather requirements and transform them into impactful data visualizations that offer valuable insights. Your role will also involve extracting, transforming, and loading data from multiple sources into Power BI, ensuring its accuracy and integrity. Your proficiency in Power BI and data analytics will facilitate informed decision-making and support the organization in driving data-centric strategies and initiatives.

The ideal candidate is a team player with a proactive mindset and a commitment to getting things done, whose curiosity and customer-centric approach drive continual improvement. You excel under pressure, maintain a positive outlook, and treat career growth as a continuous journey. Along with exceptional written and verbal communication skills, you have a proven ability to create visually compelling designs using tools like Power BI and Tableau, a background in building high-performing, scalable enterprise applications and teams, and an innovative, proactive attitude that delivers top-quality results and customer satisfaction.

With over eight years of experience in data engineering, you are self-motivated, take ownership of your tasks, and work well independently with minimal supervision. You are methodical, process-oriented, and uphold a quality-first approach. Having led mid-to-large-sized teams and accounts, you use constructive feedback mechanisms to improve productivity, accountability, and team performance, with a results-oriented track record of successful project deliveries backed by customer case studies on public platforms.

In this role, your responsibilities will include:
- Analyzing business requirements and performing gap analysis between the data model and business requirements.
- Designing and modeling the Power BI schema.
- Transforming data in Power BI, SQL, and ETL tools.
- Creating DAX formulas, reports, and dashboards.
- Writing SQL queries and stored procedures.
- Designing effective Power BI solutions aligned with business needs.
- Overseeing a team of Power BI developers.
- Integrating data from diverse sources into Power BI for analysis.
- Optimizing report and dashboard performance.
- Collaborating with stakeholders to align Power BI projects with organizational goals.

Knowledge of data warehousing is essential; data engineering experience is a plus. To be considered for this role, you should have a B.Tech in Computer Science or an equivalent qualification with a minimum of 5 years of relevant experience. Join Kanerika Inc. and be part of a dynamic, diverse community where your skills are appreciated, your growth is supported, and your contributions have a meaningful impact.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As an Odoo Developer at CGI, you will be responsible for developing and customizing solutions on the Odoo platform (v15+). With a minimum of 4 years of experience, you will have a deep understanding of Odoo modules, architecture, and APIs, and the ability to integrate Odoo with other systems and data sources. You will work on Odoo deployments with more than 1,000 logged-in users, ensuring scalability for a large number of users and transactions. Proficiency in Python is essential, and experience with other programming languages such as Java or Scala is a plus.

In this role, you will have the opportunity to analyze and interpret complex data sets, utilize data visualization tools like Superset, and work with technologies such as Cassandra and Presto for data analysis and reporting. Your experience with SQL, relational databases like PostgreSQL or MySQL, ETL tools, and data warehousing concepts will be crucial for success. Familiarity with big data technologies like Hadoop and Spark is advantageous.

DevSecOps practices are integral to the role, requiring experience with containerization, Docker, Kubernetes clusters, and CI/CD using GitLab. Knowledge of Scrum and Agile methodologies is essential, as is proficiency in Linux/Windows operating systems and tools like Jira, GitLab, and Confluence.

As a successful candidate, you will demonstrate strong problem-solving and analytical skills, effective communication and collaboration abilities, attention to detail, and a commitment to data quality, and you will thrive in a fast-paced, dynamic environment while turning meaningful insights into action.

At CGI, you will be part of a team that values ownership, teamwork, respect, and belonging. You will have the opportunity to shape your career, develop innovative solutions, and access global capabilities, supported by leaders who care about your well-being and growth. Join CGI as a partner and contribute to one of the largest IT and business consulting services firms in the world.
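
Since the listing centers on Odoo 15+ module development, a hedged sketch of a minimal custom Odoo model with a computed field; the model and field names are illustrative only, and the code runs inside an Odoo addon rather than standalone:

from odoo import api, fields, models


class WarehouseShipment(models.Model):
    _name = "custom.warehouse.shipment"       # hypothetical model name
    _description = "Inbound Warehouse Shipment"

    name = fields.Char(required=True)
    partner_id = fields.Many2one("res.partner", string="Supplier")
    weight_kg = fields.Float()
    volume_m3 = fields.Float()
    # Stored computed field recalculated whenever its dependencies change.
    density = fields.Float(compute="_compute_density", store=True)

    @api.depends("weight_kg", "volume_m3")
    def _compute_density(self):
        for rec in self:
            rec.density = rec.weight_kg / rec.volume_m3 if rec.volume_m3 else 0.0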

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a Senior Frontend Data Visualization Engineer at Bidgely, you will play a crucial role in creating exceptional UI experiences for energy analytics applications. Leveraging your expertise in React.js and Looker, you will develop, optimize, and maintain interactive dashboards and web applications, ensuring seamless production support and deployment while turning data into actionable insights. If you are a problem-solver who thrives in a collaborative environment, we are looking for someone like you.

Your key responsibilities will span frontend development and optimization, post-release monitoring and performance analysis, collaboration and communication, and documentation and release management. You will develop and maintain high-performance React.js applications, design and optimize Looker dashboards, implement advanced filtering and drill-down capabilities, and ensure cross-browser compatibility and responsiveness. Additionally, you will monitor the performance and stability of deployed applications, troubleshoot production issues, collaborate with product teams, and provide technical solutions to stakeholders.

To excel in this role, you should have at least 2 years of experience in BI development and data analytics on cloud platforms. Proficiency in React.js and Looker, strong SQL skills, experience with REST APIs, and familiarity with CI/CD pipelines are essential, along with excellent collaboration, communication, and problem-solving skills and a strong understanding of non-functional requirements related to security, performance, and scale. Experience with Git, Confluence, and Notion for version control and documentation is preferred.

In return, Bidgely offers growth potential with a startup, a collaborative environment, unique tools for your role, group health insurance, internet/telephone reimbursement, a professional development allowance, gratuity, mentorship programs with industry experts, and flexible work arrangements. Bidgely is an equal-opportunity employer that values diversity; hiring is based on your skills, talent, and passion, without bias toward your background, gender, race, or age. Join us in building a better future and a better workforce at Bidgely, an E-Verify employer.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

Games24x7 is seeking an experienced and detail-oriented Senior Data Analyst (DA-2) to join the team. The successful candidate will be responsible for analyzing and interpreting complex datasets to provide valuable insights that support strategic decision-making.

Responsibilities include gathering requirements from stakeholders, managing and prioritizing those requirements, and working on A/B experimentation to draw conclusions using statistical techniques. The candidate will convert actionable insights into well-defined projects, understand player behavior, and collaborate with Product, Marketing, and Technology teams to execute projects and measure their success. Additionally, the Senior Data Analyst will develop and maintain comprehensive dashboards and reports; debug, monitor, and troubleshoot various solutions; and implement data quality standards to guarantee the accuracy, reliability, and reusability of data and solutions.

The ideal candidate has a BE/B.Tech degree in Computer Science/IT from a top college and a minimum of 4 years of experience in analytics. Strong proficiency in statistical concepts and techniques such as hypothesis testing, regression analysis, and time series analysis is required, as is experience developing and implementing advanced statistical models and algorithms such as logistic regression, decision trees, random forests, and gradient boosting. Hands-on experience with Large Language Models (LLMs) and natural language processing (NLP) techniques is a plus.

The candidate must possess exceptional logical and analytical skills, attention to detail, and excellent technical and communication abilities to convey complex technical concepts to non-technical stakeholders effectively. Proficiency in data analysis tools and programming languages such as Python, R, and SQL, as well as experience with data visualization tools (e.g., Tableau, Power BI), is necessary. Knowledge of data warehousing concepts and experience working with large datasets is advantageous.

The Senior Data Analyst will be based in Bangalore and will play a crucial role in providing data-driven insights and solutions to drive business decisions.
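
As a hedged sketch of the A/B-experimentation skill the listing names, a two-proportion z-test on made-up conversion counts using statsmodels:

from statsmodels.stats.proportion import proportions_ztest

# Hypothetical results: control vs. variant conversions out of users exposed.
conversions = [412, 480]
exposures = [10_000, 10_000]

z_stat, p_value = proportions_ztest(conversions, exposures)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: conversion rates differ at the 5% level.")
else:
    print("No significant difference detected.")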

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Haryana

On-site

The Client Marketing Analytics team at Citi focuses on transforming data into actionable insights to optimize marketing strategies across various channels and lines of business. The primary goal is to enhance client engagement and ROI through data-driven decisions. This role sits within Analytics & Information Management (AIM), Citi's internal analytics center.

Key responsibilities include managing marketing partnerships, evaluating campaign performance, and optimizing ROI. You will analyze event data to assess its impact on client engagement and ROI, providing recommendations for future events, and analyze client engagement across digital platforms and social media to optimize content placement for acquisition and engagement. Developing and maintaining dashboards to monitor marketing performance, delivering real-time insights, and enhancing reporting capabilities using advanced platforms are also crucial responsibilities. You will identify and recommend future marketing strategies, continuously improve processes using generative AI tools, and translate data into actionable insights about consumer behavior to drive targeting and segmentation strategies. Communicating findings to business partners and senior leaders is essential.

Furthermore, you will ensure that data analysis informs marketing investment decisions and planned activities; analyze marketing program performance and business initiatives, including forecasting key indicators; and collaborate with internal and external stakeholders to build, implement, track, and improve decision strategies. You will develop and implement data-driven recommendations using analytical and segmentation methodologies, integrate profitability drivers to make pricing and offer recommendations, and perform ad-hoc analytical requests. Team management involves managing, evaluating, and mentoring a team of analysts, including performance reviews, compensation, and staffing decisions. Appropriately assessing risk in business decisions and ensuring compliance with applicable laws and regulations is also part of the role.

Qualifications include a Master's or Bachelor's degree in a quantitative field (Engineering, Technology, Computer Science) with over 10 years of experience in analytical solutions delivery and team leadership; an MBA with a specialization in Analytics or Marketing is also welcome. The ideal candidate has a proven ability to lead and mentor analysts; strong analytical and problem-solving skills; experience with large datasets, data warehousing, and data extraction techniques; excellent communication and interpersonal skills; and familiarity with marketing analytics, marketing effectiveness, and marketing measurement and optimization, ideally with financial services experience. Experience with marketing automation and web analytics platforms, as well as data visualization tools, is also desired.

Please note that this description provides a high-level overview of the work performed; other job-related duties may be assigned as required.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a Software/Senior Software Engineer at our Bangalore location, you will join our team with 2 to 5 years of experience in Java or .NET and strong object-oriented fundamentals. Expertise in Oracle/MySQL/MS-SQL with a solid foundation in databases, exposure to web application development, a clear grasp of client-server architecture, and familiarity with analytics and data warehousing will be valuable assets.

Your primary responsibility will be to offer hands-on technical leadership for turnkey projects, focusing on large enterprise solutions for clients in the US public sector and healthcare industries. This role demands deep involvement in engineering design and collaboration with teams to ensure timely delivery of work products.

To qualify for this position, you should hold an engineering degree with over 65% aggregate (or its CGPA equivalent) and a consistent academic track record above 65% across 10th and 12th grades. Your skill set should include exposure to the complete Software Development Life Cycle (SDLC). Candidates with 2 to 5 years of experience will be considered, and the salary package offered is competitive in the industry. Preference will be given to candidates from top companies specializing in product/application development with a strong career trajectory.

Required skills and experience include proficiency in Java, .NET, OOAD, design patterns, object-oriented programming, MySQL, SQL, and Oracle. If you are an enthusiastic software engineer with a B.E/B.Tech degree, we encourage you to apply by sending your resume to jobs@augustainfotech.com.

Posted 3 weeks ago

Apply