8.0 years
0 Lacs
Greater Kolkata Area
On-site
Location: PAN India
Duration: 6 Months
Experience Required: 7–8 years

Job Summary
We are looking for an experienced SSAS Developer with strong expertise in developing both OLAP and Tabular models using SQL Server Analysis Services (SSAS), alongside advanced ETL development skills using tools like SSIS, Informatica, or Azure Data Factory. The ideal candidate will be well-versed in T-SQL, dimensional modeling, and building high-performance, scalable data solutions.

Key Responsibilities
- Design, build, and maintain SSAS OLAP cubes and Tabular models
- Create complex DAX and MDX queries for analytical use cases
- Develop robust ETL workflows and pipelines using SSIS, Informatica, or ADF
- Collaborate with cross-functional teams to translate business requirements into BI solutions
- Optimize SSAS models for scalability and performance
- Implement best practices in data modeling, version control, and deployment automation
- Support dashboarding and reporting needs via Power BI, Excel, or Tableau
- Maintain and troubleshoot data quality, performance, and integration issues

Must-Have Skills
- Hands-on experience with SSAS (Tabular & Multidimensional)
- Proficient in DAX, MDX, and T-SQL
- Advanced ETL skills using SSIS / Informatica / Azure Data Factory
- Knowledge of dimensional modeling (star & snowflake schema)
- Experience with Azure SQL / MS SQL Server
- Familiarity with Git and CI/CD pipelines

Nice to Have
- Exposure to cloud data platforms (Azure Synapse, Snowflake, AWS Redshift)
- Working knowledge of Power BI or similar BI tools
- Understanding of Agile/Scrum methodology
- Bachelor's degree in Computer Science, Information Systems, or equivalent
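As a quick illustration of the dimensional modeling skill this posting calls for, here is a minimal Python/pandas sketch that splits a flat extract into a star schema with surrogate-keyed dimensions and a fact table. It is an editorial sketch, not part of the posting; all table and column names are hypothetical.

```python
# Illustrative sketch only: build a tiny star schema (dimensions + fact table)
# from a flat sales extract using pandas. Table/column names are hypothetical.
import pandas as pd

# Flat extract as it might arrive from a source system
sales = pd.DataFrame({
    "order_date": ["2024-01-05", "2024-01-05", "2024-01-06"],
    "product":    ["Widget A", "Widget B", "Widget A"],
    "region":     ["East", "West", "East"],
    "amount":     [120.0, 80.0, 95.5],
})

# Dimension tables: one row per distinct member, with a surrogate key
dim_product = (sales[["product"]].drop_duplicates()
               .reset_index(drop=True)
               .rename_axis("product_key").reset_index())
dim_region = (sales[["region"]].drop_duplicates()
              .reset_index(drop=True)
              .rename_axis("region_key").reset_index())

# Fact table: measures plus foreign keys to the dimensions (star schema)
fact_sales = (sales
              .merge(dim_product, on="product")
              .merge(dim_region, on="region")
              [["order_date", "product_key", "region_key", "amount"]])

# Simple aggregation check over the fact table
print(fact_sales.groupby("region_key")["amount"].sum())
```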
Posted 6 days ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Us
Lemma Technologies is a software start-up company based in Baner, Pune. We are unleashing the power of programmatic AdTech to the DOOH (Digital Out Of Home) world. Our mission is to transform Digital Out Of Home media to connect brands with their consumers by establishing authentic and transparent standards. Innovation is our DNA and transparency is our RNA. We are revolutionising the DOOH industry. As an organisation, we successfully deliver brand stories seamlessly across all large-format digital screens, from DOOH to CTV and even on mobile and desktop devices. We are focused on connecting DOOH media to mainstream digital, enabling brands to deploy omni-digital strategies through our platform.

Roles & Responsibilities
- Chief Data Scientist / Architect of Lemma Technologies. This role is responsible for defining and executing the technical strategy for adopting modern AI/ML practices to acquire and process data and provide actionable insights to Lemma customers.
- Good understanding of the entire journey of data acquisition, data warehousing, information architecture, dashboards, reports, predictive insights, and adoption of AI/ML and NLP, providing innovative data-oriented insights for Lemma customers.
- Deep understanding of data science and technology, able to recommend adoption of the right technical tools and strategies.
- Expected to be a hands-on technical expert who will build and guide a technical data team.
- Build, design, and implement our highly scalable, fault-tolerant, highly available big data platform to process terabytes of data and provide customers with in-depth analytics.
- Deep data science and AI/ML hands-on experience to give actionable insights to advertisers/customers of Lemma.
- Good overview of the modern technology stack such as Spark, Hadoop, Kafka, HBase, Hive, Presto, etc.
- Automate high-volume data collection and processing to provide real-time data analytics.
- Customize Lemma's reporting and analytics platform based on customer requirements and deliver scalable, production-ready solutions.
- Lead multiple projects to develop features for the data processing and reporting platform; collaborate with product managers, cross-functional teams, and other stakeholders to ensure successful delivery of projects.
- Leverage a broad range of Lemma's data architecture strategies, proposing both data flows and storage solutions.
- Manage Hadoop MapReduce and Spark jobs and resolve any ongoing issues with operating the cluster.
- Work closely with cross-functional teams on improving availability and scalability of the large data platform and the functionality of Lemma software.
- Participate in Agile/Scrum processes such as sprint planning, sprint retrospectives, backlog grooming, user story management, work item prioritization, etc.

Skills Required
- 10 to 12+ years of proven experience in designing, implementing, and delivering complex, scalable, and resilient platforms and services.
- Experience in building AI, machine learning, and data analytics solutions.
- Experience in OLAP (Snowflake, Vertica or similar) would be an added advantage.
- Ability to understand vague business problems and convert them into working solutions.
- Excellent spoken and written communication skills with a collaborative approach.
- Dedication to developing high-quality software and products.
- Curiosity to explore and understand data is a strong plus.
- Deep understanding of Big Data and distributed systems (MapReduce, Spark, Hive, Kafka, Oozie, Airflow).

(ref:hirist.tech)
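To illustrate the kind of large-scale processing mentioned above (Spark over high-volume event data), here is a minimal PySpark sketch that rolls up ad-impression events into daily metrics. It is an editorial sketch, not Lemma's actual pipeline; file paths, column names, and the schema are hypothetical.

```python
# Illustrative sketch only: aggregate ad-impression events into per-screen daily
# metrics with PySpark. Paths, columns, and schema are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dooh-impressions-rollup").getOrCreate()

# Raw events landed by an upstream collector (e.g. Kafka -> object storage)
events = spark.read.parquet("s3://example-bucket/dooh/impressions/")

daily = (events
         .withColumn("day", F.to_date("event_ts"))
         .groupBy("screen_id", "day")
         .agg(F.count("*").alias("impressions"),
              F.sum("spend").alias("spend")))

# Write an analytics-friendly table for dashboards and reporting
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3://example-bucket/dooh/daily_rollup/")
```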
Posted 6 days ago
2.0 - 5.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Req ID: 310007
We are currently seeking a Digital Engineering Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Data Modeler – Position Overview:
The Data Modeler will be responsible for designing and implementing data models that support the organization's data management and analytics needs. This role involves collaborating with various stakeholders to understand data sources, relationships, and business requirements, and translating them into effective data structures.

Key Responsibilities:
- Collaborate with Business Analysts: Understand different data sources and their relationships.
- Prepare Conformed Dimension Matrix: Identify different grains of facts, finalize dimensions, and harmonize data across sources.
- Create Data Models: Develop Source to Target Mapping (STM) documentation and custom mappings (both technical and non-technical).
- Include Transformation Rules: Ensure STMs include pseudo SQL queries for transformation rules.
- Coordinate Reviews: Work with Data Architects, Product Owners, and Enablement teams to review and approve models, STMs, and custom mappings.
- Engage with Data Engineers: Clarify any questions related to STMs and custom mappings.

Required Technical Skills:
- Proficiency in SQL: Strong understanding of SQL and database management systems.
- Data Modeling Tools: Familiarity with tools such as ERwin, IBM InfoSphere Data Architect, or similar.
- Data Warehousing Concepts: Solid knowledge of data warehousing principles, ETL processes, and OLAP.
- Data Governance and Compliance: Understanding of data governance frameworks and compliance requirements.

Key Competencies:
- Analytical Skills: Ability to analyze complex data sets and derive meaningful insights.
- Attention to Detail: Ensure accuracy and consistency in data models.
- Communication Skills: Effectively collaborate with stakeholders and articulate technical concepts to non-technical team members.
- Project Management Skills: Ability to prioritize tasks, manage timelines, and coordinate with cross-functional teams.
- Continuous Learning and Adaptability: Commitment to ongoing professional development and adaptability to changing business needs and technologies.

Additional:
- Problem-Solving Abilities: Innovative solutions to data integration, quality, and performance challenges.
- Knowledge of Data Modeling Methodologies: Entity-relationship modeling, dimensional modeling, normalization techniques.
- Familiarity with Business Intelligence Tools: Enhances the ability to design data structures that facilitate data analysis and visualization.

Preferred Qualifications:
- Experience in SDLC: Understanding of all phases of the Software Development Life Cycle.
- Certifications: Relevant certifications in data modeling, data warehousing, or related fields.
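To make the Source to Target Mapping (STM) idea above concrete, here is a minimal Python/pandas sketch that expresses a mapping as data (target column, source column, transformation rule) and applies it. It is an editorial sketch under assumed, hypothetical source and target names.

```python
# Illustrative sketch only: a tiny source-to-target mapping (STM) applied with
# pandas. Source/target column names and rules are hypothetical placeholders.
import pandas as pd

# One entry per target column: where it comes from and how it is transformed
stm = [
    {"target": "customer_key",  "source": "cust_id",    "rule": lambda s: s.astype(int)},
    {"target": "customer_name", "source": "cust_nm",    "rule": lambda s: s.str.strip().str.title()},
    {"target": "signup_date",   "source": "created_dt", "rule": lambda s: pd.to_datetime(s).dt.date},
]

source = pd.DataFrame({
    "cust_id": ["101", "102"],
    "cust_nm": ["  alice smith", "BOB JONES "],
    "created_dt": ["2024-03-01", "2024-03-02"],
})

# Apply each rule to its source column to produce the target layout
target = pd.DataFrame({m["target"]: m["rule"](source[m["source"]]) for m in stm})
print(target)
```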
Posted 6 days ago
0 years
0 Lacs
Warangal Rural, Telangana, India
On-site
GETEC is one of the leading energy providers and contracting specialists for industry and the real estate sector in Europe. Our value promise, "We have the energy for more," is the guiding principle for more than 2,400 employees who navigate our customers through an increasingly complex energy world with excellent engineering know-how, outstanding regulatory expertise, proven speed of action, and comprehensive sustainability expertise.

Your future role
In your role as Reporting & Analytics Specialist, you will play a central part in advancing our data-driven management. You will improve existing reporting structures, drive the automation of data processes, and lay the foundation for fact-based decisions in the finance area.
- You develop and maintain interactive reports and dashboards for executive bodies as well as for finance and controlling requirements – using Power BI and other BI tools.
- You design reliable data models and ensure that the relevant data for reporting, planning, and forecasting is always available, consistent, and traceable.
- You work with financial and transaction data from various source systems (e.g. SAP, CRM, ERP) and support the quality and transparency of our data foundation.
- You contribute to the further development of our data infrastructure – including master data logic, data quality processes, and system adjustments in the controlling context.
- You work closely with Controlling, Accounting, and other departments to tailor data-driven solutions precisely to the company's requirements.

You as a person and what you bring
- A good sense for financial relationships, analytical thinking, and enjoyment in deriving management-relevant insights from data
- Experience in a finance or controlling environment – ideally in reporting, planning, or working with BI tools
- A strong interest in data structures and data models, and the willingness to learn tools such as SQL, Power BI, or relational databases and use them to optimize financial processes
- Motivation to engage with ETL processes and data flows in order to develop a better understanding of where financial data comes from and how it is processed
- An affinity for technological developments in the BI environment, ideally first exposure to Microsoft Fabric or OLAP technologies (e.g. Jedox, Lucanet) – or the openness to work your way into these topics
- A structured, solution-oriented way of working and enjoyment of cross-functional collaboration

We offer MORE
- With us, you have the opportunity to actively help shape the energy transition.
- From day one, you are part of the team and have the chance to take on responsibility right away.
- We enable a good work-life balance through flexible working time planning with flextime as well as 30 days of vacation per year.
- We offer our employees the Jobrad bike-leasing concept: stay fit 365 days a year and protect the environment at the same time.
- Look forward to numerous fitness and wellness offers through a membership with our cooperation partner Hansefit.
- Want to develop further? Use your free access to LinkedIn Learning and get access to countless training and learning videos. We also offer a wide range of internal and external training and coaching programs.
- Is the working atmosphere important to you? Agile working and modern office concepts are a matter of course for us.
- Through our Corporate Benefit Portal you get access to various discounts from more than 800 well-known product and event providers.

We warmly invite everyone to become part of the GETEC family and experience our open and appreciative corporate culture. We are convinced that every one of us can contribute to the energy transition. It does not matter where you come from, what gender or sexual orientation you have, what faith you belong to, or whether you have disabilities. The GETEC recruiting team looks forward to hearing from you and receiving your application: MAKING A DIFFERENCE FOR GENERATIONS TO COME.
Posted 6 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
By clicking the “Apply” button, I understand that my employment application process with Takeda will commence and that the information I provide in my application will be processed in line with Takeda’s Privacy Notice and Terms of Use. I further attest that all information I submit in my employment application is true to the best of my knowledge.

Job Description:

The Future Begins Here: At Takeda, we are leading digital evolution and global transformation. By building innovative solutions and future-ready capabilities, we are meeting the needs of patients, our people, and the planet. Bengaluru, the city which is India’s epicenter of innovation, has been selected to be home to Takeda’s recently launched Innovation Capability Center. We invite you to join our digital transformation journey. In this role, you will have the opportunity to boost your skills and become the heart of an innovative engine that is contributing to global impact and improvement.

At Takeda’s ICC we Unite in Diversity: Takeda is committed to creating an inclusive and collaborative workplace, where individuals are recognized for their backgrounds and the abilities they bring to our company. We are continuously improving our collaborators’ journey at Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team.

The Opportunity: As a Data Scientist, you will have the opportunity to apply your analytical skills and expertise to extract meaningful insights from vast amounts of data. We are currently seeking a talented and experienced individual to join our team and contribute to our data-driven decision-making process.

Objectives:
- Collaborate with different business users, mainly Supply Chain/Manufacturing, to understand the current state and identify opportunities to transform the business into a data-driven organization.
- Translate processes and requirements into analytics solutions and metrics with an effective data strategy, data quality, and data accessibility for decision making.
- Operationalize decision support solutions and drive user adoption, as well as gather feedback and metrics on Voice of Customer in order to improve analytics services.
- Understand the analytics drivers and data to be modeled, apply the appropriate quantitative techniques to provide the business with actionable insights, and ensure analytics models and data are accessible to end users to evaluate “what-if” scenarios and support decision making.
- Evaluate the data, analytical models, and experiments periodically to validate hypotheses, ensuring they continue to provide business value as requirements and objectives evolve.

Accountabilities:
- Collaborates with business partners in identifying analytical opportunities and developing BI-related goals and projects that will create strategically relevant insights.
- Works with internal and external partners to develop the analytics vision and programs to advance BI solutions and practices.
- Understands data and sources of data. Strategizes with the IT development team and develops a process to collect, ingest, and deliver data along with proper data models for analytical needs.
- Interacts with business users to define pain points, problem statements, scope, and the analytics business case.
- Develops solutions with recommended data models and business intelligence technologies including data warehouses, data marts, OLAP modeling, dashboards/reporting, and data queries.
- Works with DevOps and database teams to ensure proper design of system databases and appropriate integration with other enterprise applications.
- Collaborates with the Enterprise Data and Analytics Team to design data model and visualization solutions that synthesize complex data for data mining and discovery.
- Assists in defining requirements and facilitates workshops and prototyping sessions.
- Develops and applies technologies such as machine-learning and deep-learning algorithms to enable advanced analytics product functionality.

EDUCATION, BEHAVIOURAL COMPETENCIES AND SKILLS:
- Bachelor’s Degree from an accredited institution in Data Science, Statistics, Computer Science, or a related field.
- 3+ years of experience with statistical modeling such as clustering, segmentation, multivariate analysis, regression, etc., and analytics tools such as R, Python, Databricks, etc. required.
- Experience in developing and applying predictive and prescriptive modeling, deep learning, or other machine learning techniques a plus.
- Hands-on development of AI solutions that comply with industry standards and government regulations.
- Great numerical and analytical skills, as well as basic knowledge of Python analytics packages (Pandas, scikit-learn, statsmodels).
- Ability to build and maintain scalable and reliable data pipelines that collect, transform, manipulate, and load data from internal and external sources.
- Ability to use statistical tools to conduct data analysis and identify data quality issues throughout the data pipeline.
- Experience with BI and visualization tools (e.g., Qlik, Power BI), ETL, NoSQL, and proven design skills a plus.
- Excellent written and verbal communication skills, including the ability to interact effectively with multifunctional teams.
- Experience with working with agile teams.

WHAT TAKEDA CAN OFFER YOU:
Takeda is certified as a Top Employer, not only in India, but also globally. No investment we make pays greater dividends than taking good care of our people. At Takeda, you take the lead on building and shaping your own career. Joining the ICC in Bengaluru will give you access to high-end technology, continuous training, and a diverse and inclusive network of colleagues who will support your career growth.

BENEFITS:
It is our priority to provide competitive compensation and a benefits package that bridges your personal life with your professional career. Amongst our benefits are:
- Competitive Salary + Performance Annual Bonus
- Flexible work environment, including hybrid working
- Comprehensive Healthcare Insurance Plans for self, spouse, and children
- Group Term Life Insurance and Group Accident Insurance programs
- Health & Wellness programs including annual health screening and weekly health sessions for employees
- Employee Assistance Program
- 3 days of leave every year for Voluntary Service, in addition to Humanitarian Leave
- Broad variety of learning platforms
- Diversity, Equity, and Inclusion Programs
- Reimbursements – Home Internet & Mobile Phone
- Employee Referral Program
- Leaves – Paternity Leave (4 weeks), Maternity Leave (up to 26 weeks), Bereavement Leave (5 calendar days)

ABOUT ICC IN TAKEDA:
Takeda is leading a digital revolution. We’re not just transforming our company; we’re improving the lives of millions of patients who rely on our medicines every day. As an organization, we are committed to our cloud-driven business transformation and believe the ICCs are the catalysts of change for our global organization.
Li-Hybrid
Locations: IND - Bengaluru
Worker Type: Employee
Worker Sub-Type: Regular
Time Type: Full time
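As an illustration of the clustering and segmentation work the posting above mentions (with scikit-learn among the named packages), here is a minimal editorial sketch; the feature names and values are hypothetical, and this is not Takeda's code.

```python
# Illustrative sketch only: simple segmentation with k-means clustering.
# Feature names and values are hypothetical placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Toy feature matrix: [annual_order_volume, average_order_value]
X = np.array([
    [120, 45.0], [115, 50.0], [30, 400.0],
    [28, 380.0], [60, 150.0], [65, 140.0],
])

# Scale features so neither dominates the distance metric, then cluster
X_scaled = StandardScaler().fit_transform(X)
model = KMeans(n_clusters=3, n_init=10, random_state=42).fit(X_scaled)

print(model.labels_)           # cluster assignment per record
print(model.cluster_centers_)  # centroids in scaled feature space
```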
Posted 1 week ago
8.0 - 10.0 years
15 - 22 Lacs
Nagpur, Pune
Work from Office
Design, develop, and deploy Power BI solutions; build tabular and multidimensional models; create KPIs; and optimize ETL processes. Write complex DAX/MDX queries. Collaborate with teams, analyze KPIs, and present insights.
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Career Area: Technology, Digital and Data

Job Description:
Your Work Shapes the World at Caterpillar Inc. When you join Caterpillar, you're joining a global team who cares not just about the work we do – but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here – we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it.

As a Software Engineer, you will contribute to the development and deployment of Caterpillar’s state-of-the-art digital platform. Competent to perform all programming and development assignments without close supervision; normally assigned the more complex aspects of systems work. Works directly on complex application/technical problem identification and resolution, including responding to off-shift and weekend support calls. Works independently on complex systems or infrastructure components that may be used by one or more applications or systems. Drives application development focused around delivering business-valuable features. Maintains high standards of software quality within the team by establishing good practices and habits. Identifies and encourages areas for growth and improvement within the team. Mentors junior developers. Communicates with end users and internal customers to help direct development, debugging, and testing of application software for accuracy, integrity, interoperability, and completeness. Performs integrated testing and customer acceptance testing of components that requires careful planning and execution to ensure timely, quality results.

The position manages the completion of its own work assignments and coordinates work with others. Based on past experience and knowledge, the incumbent normally works independently with minimal management input and review of end results. Typical customers include Caterpillar customers, dealers, other external companies who purchase services offered by Caterpillar, as well as internal business unit and/or service center groups. The position is challenged to quickly and correctly identify problems that may not be obvious. The incumbent solves problems by determining the best course of action, within departmental guidelines, from many existing solutions. The incumbent sets priorities and establishes a work plan in order to complete broadly defined assignments and achieve desired results. The position participates in brainstorming sessions focused on developing new approaches to meeting quality goals in the measure(s) stated.

Role Requirements:
- Candidate should have at least 5 years of experience as a Snowflake SQL developer.
- Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for Business Intelligence reporting.
- Deliver robust solutions through query optimization, ensuring data quality.
- Should have experience in writing functions and stored procedures.
- Analyze and translate functional specifications / user stories into technical specifications.
- Good to have experience in design/development in an ETL tool like DataStage or SnapLogic.
- Problem-solving skills.
- Should communicate with business partners/clients.
- Experience in developing both parallel and sequence jobs.
- Strong experience in design and implementation of data warehousing application processes using an ETL tool.
- Experience in SQL and UNIX scripting.
- Experience with data warehousing concepts.
- Team player with proven abilities in guiding team members and enabling knowledge sharing among the team.
- Strong problem-solving and technical skills coupled with confident decision making, enabling effective solutions leading to high customer satisfaction.
- Strong understanding of the principles of data warehousing using fact tables, dimension tables, and star and snowflake schema modeling.
- Experience creating processes using various operational sources like Snowflake, Oracle, SQL Server, flat files, and Excel files into a staging area.
- Expertise in using DataStage Designer to develop processes for extracting, transforming, integrating, and loading data into a data warehouse system (OLAP).
- Experience in integration of various data sources (DB2-UDB, SQL Server, Oracle, and flat files) into the data staging area.
- Strong experience creating tables and databases in Snowflake and SQL Server.
- Experience in developing DataStage mappings using transformations like Transformer, Lookup, Join, Merge, Filter, Funnel, Aggregator, Sort, Oracle Connector, etc., and Sequence jobs.
- Prepare Technical Design Documents based on data model understanding and S2T mapping requirements.
- Performance, defect, and dependency analysis, and performance tuning of DataStage jobs and SQL queries.
- Excellent problem-solving and troubleshooting capabilities; quick learner, highly motivated, result-oriented, and a good team player.

Posting Dates: June 10, 2025 - June 16, 2025
Caterpillar is an Equal Opportunity Employer.
Not ready to apply? Join our Talent Community.
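To illustrate the Snowflake SQL/ETL work described above, here is a minimal Python sketch that runs an ETL-style MERGE through the snowflake-connector-python package. It is an editorial sketch, not Caterpillar's setup; all connection parameters, table names, and the query itself are hypothetical placeholders.

```python
# Illustrative sketch only: execute a simple ETL-style MERGE in Snowflake from
# Python. All connection parameters and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="STAGING",
)

merge_sql = """
MERGE INTO analytics.dim_customer AS tgt
USING staging.customer_extract AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET tgt.customer_name = src.customer_name
WHEN NOT MATCHED THEN INSERT (customer_id, customer_name)
  VALUES (src.customer_id, src.customer_name)
"""

cur = conn.cursor()
try:
    cur.execute(merge_sql)               # run the upsert into the dimension table
    print(f"Rows affected: {cur.rowcount}")
finally:
    cur.close()
    conn.close()
```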
Posted 1 week ago
2.0 years
4 Lacs
India
On-site
Responsibilities
As a Fullstack (React and Python) Developer, you will be part of a team consisting of AI/ML Engineers, UI/UX Engineers and GIS Engineers to build end-to-end AI-based analytics software. You will be responsible for:
- Designing, developing, testing, deploying, managing and maintaining the backend and frontend for various modules of the project.
- Working closely with the machine learning, image processing and GIS teams to integrate the algorithmic output from the backend REST APIs.
- Participating in UAT, and diagnosing and troubleshooting bugs and application integration issues.
- Participating in the entire software development lifecycle, from concept to delivery.
- Writing clean, well-documented, and efficient code following best practices and coding standards.
- Performing code reviews and providing constructive feedback to team members.
- Creating and maintaining documentation related to the developed processes and applications.

Qualification & Experience
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 2-5 years of demonstrable experience designing, building, and working as a Fullstack Engineer for enterprise web applications. Ideally, this would include the following:
- Expert-level proficiency with Python (3.4+), Django (2.1+).
- Expert-level proficiency with JavaScript (ES6), HTML5 & CSS.
- Expert-level proficiency with ReactJS.
- Familiarity with common databases (NoSQL such as MongoDB) and data warehousing concepts (OLAP, OLTP).
- Understanding of REST concepts and building/interacting with REST APIs.
- Deep understanding of a few UI concepts: cross-browser compatibility and implementing responsive web design.
- Hands-on experience with test-driven development, using testing libraries like Jest, PyTest and Nose.
- Familiarity with common JS visualization libraries built using D3, Chart.js, Highcharts, etc.
- Deep understanding of core backend concepts: develop and design RESTful services and APIs; develop functional databases, applications, and servers to support websites on the back end; performance optimization and multithreading concepts.
- Experience with deploying and maintaining high-traffic infrastructure (performance testing is a plus).
- Experience with containerization tools (e.g., Docker, Kubernetes) is a plus.
- Understanding of DevOps practices and continuous integration/continuous deployment (CI/CD) pipelines is a plus.
- Familiarity with Agile/Scrum methodologies is a plus.
In addition, the ideal candidate would have great problem-solving skills, and familiarity with code versioning tools such as GitHub.

Job Type: Full-time
Pay: From ₹411,871.11 per year
Benefits: Health insurance, Paid sick time, Paid time off, Provident Fund
Schedule: Day shift, Monday to Friday
Ability to commute/relocate: Madhapur, Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s):
- Do you have experience with containerization tools (e.g., Docker, Kubernetes)?
- Do you have understanding of DevOps practices and continuous integration/continuous deployment (CI/CD) pipelines?
- Do you have familiarity with Agile/Scrum methodologies?
Experience: 5years: 2 years (Required); Full-stack development: 2 years (Required)
Location: Madhapur, Hyderabad, Telangana (Required)
Work Location: In person
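As an illustration of the backend REST API work the posting above describes (Python/Django serving algorithmic output to a React frontend), here is a minimal Django REST Framework sketch. It is editorial and hypothetical: the model name, fields, route, and sample data are assumptions, not the project's actual code.

```python
# Illustrative sketch only: a small Django REST Framework endpoint that a React
# frontend could call. Names, fields, and the route are hypothetical.
from django.urls import path
from rest_framework import serializers
from rest_framework.response import Response
from rest_framework.views import APIView


class DetectionResultSerializer(serializers.Serializer):
    """Shape of one analytics result returned to the frontend."""
    tile_id = serializers.CharField()
    score = serializers.FloatField()


class DetectionResultList(APIView):
    """GET /api/detections/ -> JSON list of model outputs."""

    def get(self, request):
        # In a real app these would come from a database or an ML pipeline.
        results = [{"tile_id": "T-001", "score": 0.92},
                   {"tile_id": "T-002", "score": 0.47}]
        serializer = DetectionResultSerializer(results, many=True)
        return Response(serializer.data)


urlpatterns = [
    path("api/detections/", DetectionResultList.as_view()),
]
```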
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Noida
On-site
Noida, Uttar Pradesh, India; Indore, Madhya Pradesh, India; Bangalore, Karnataka, India; Hyderabad, Telangana, India; Gurgaon, Haryana, India

Qualification:
Required
- Proven hands-on experience in designing, developing and supporting database projects for analysis in a demanding environment.
- Proficient in database design techniques – relational and dimensional designs.
- Experience and a strong understanding of business analysis techniques used.
- High proficiency in the use of SQL or MDX queries.
- Ability to manage multiple maintenance, enhancement and project-related tasks.
- Ability to work independently on multiple assignments and to work collaboratively within a team is required.
- Strong communication skills with both internal team members and external business stakeholders.

Added Advantage
- Hadoop ecosystem or AWS, Azure or GCP cluster and processing.
- Experience working on Hive, Spark SQL, Redshift or Snowflake will be an added advantage.
- Experience of working on Linux systems.
- Experience of Tableau, MicroStrategy, Power BI or any BI tool will be an added advantage.
- Expertise in programming in Python, Java or shell script would be a plus.

Roles & Responsibilities
- Be the front-facing person of the world’s most scalable OLAP product company – Kyvos Insights.
- Interact with the senior-most technical and business people of large enterprises to understand their big data strategy and their problem statements in that area.
- Create, present, align customers with, and implement solutions around Kyvos products for the most challenging enterprise BI/DW problems.
- Be the go-to person for customers regarding technical issues during the project.
- Be instrumental in reading the pulse of the big data market and defining the roadmap of the product.
- Lead a few small but highly efficient teams of big data engineers.
- Efficient task status reporting to stakeholders and customers.
- Good verbal & written communication skills.
- Be willing to work off hours to meet timelines.
- Be willing to travel or relocate as per project requirements.

Experience: 5 to 10 years
Job Reference Number: 11078
Posted 1 week ago
3.0 - 6.0 years
6 - 10 Lacs
Noida
On-site
Noida / Indore / Bangalore; Bangalore, Karnataka, India; Indore, Madhya Pradesh, India; Gurugram, Haryana, India

Qualification:
- OLAP, data engineering, data warehousing, ETL
- Hadoop ecosystem or AWS, Azure or GCP cluster and processing
- Experience working on Hive, Spark SQL, Redshift or Snowflake
- Experience in writing and troubleshooting SQL programming or MDX queries
- Experience of working on Linux
- Experience in Microsoft Analysis Services (SSAS) or OLAP tools
- Tableau, MicroStrategy or any BI tool
- Expertise in programming in Python, Java or shell script would be a plus

Skills Required: OLAP, MDX, SQL

Role:
- Be the front-facing person of the world’s most scalable OLAP product company – Kyvos Insights.
- Interact with the senior-most technical and business people of large enterprises to understand their big data strategy and their problem statements in that area.
- Create, present, align customers with, and implement solutions around Kyvos products for the most challenging enterprise BI/DW problems.
- Be the go-to person for prospects regarding technical issues during the POV stage.
- Be instrumental in reading the pulse of the big data market and defining the roadmap of the product.
- Lead a few small but highly efficient teams of big data engineers.
- Efficient task status reporting to stakeholders and customers.
- Good verbal & written communication skills.
- Be willing to work off hours to meet timelines.
- Be willing to travel or relocate as per project requirements.

Experience: 3 to 6 years
Job Reference Number: 10350
Posted 1 week ago
3.0 - 6.0 years
0 Lacs
Indore
On-site
Indore, Madhya Pradesh, India; Bangalore, Karnataka, India; Noida, Uttar Pradesh, India

Qualification: Pre-Sales Solution Engineer - India
Experience areas or skills:
- Pre-sales experience with software or analytics products
- Excellent verbal & written communication skills
- OLAP tools or Microsoft Analysis Services (MSAS)
- Data engineering, data warehousing or ETL
- Hadoop ecosystem or AWS, Azure or GCP cluster and processing
- Tableau, MicroStrategy or any BI tool
- HiveQL, Spark SQL, PL/SQL or T-SQL
- Writing and troubleshooting SQL programming or MDX queries
- Working on Linux
- Programming in Python, Java or JavaScript would be a plus
- Filling RFPs or questionnaires from customers
- NDA, success criteria, project closure and other documentation
- Be willing to travel or relocate as per requirement

Role:
- Acts as the main point of contact for customer contacts involved in the evaluation process
- Product demonstrations to qualified leads
- Product demonstrations in support of marketing activity such as events or webinars
- Own RFP, NDA, PoC success criteria document, PoC closure and other documents
- Secures alignment on process and documents with the customer / prospect
- Owns the technical win phases of all active opportunities
- Understand customer domain and database schema
- Provide OLAP and reporting solutions
- Work closely with customers to understand and resolve environment, OLAP cube or reporting related issues
- Coordinate with the solutioning team for execution of PoC as per the success plan
- Create enhancement requests or identify requests for new features on behalf of customers or hot prospects

Experience: 3 to 6 years
Job Reference Number: 10771
Posted 1 week ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About AutoZone:
AutoZone is the nation's leading retailer and a leading distributor of automotive replacement parts and accessories, with more than 6,000 stores in the US, Puerto Rico, Mexico, and Brazil. Each store carries an extensive line for cars, sport utility vehicles, vans and light trucks, including new and remanufactured hard parts, maintenance items and accessories. We also sell automotive diagnostic and repair software through ALLDATA, diagnostic and repair information through ALLDATAdiy.com, automotive accessories through AutoAnything.com, and auto and light truck parts and accessories through AutoZone.com. Since opening its first store in Forrest City, Ark. on July 4, 1979, the company has joined the New York Stock Exchange (NYSE: AZO) and earned a spot in the Fortune 500. AutoZone has been committed to providing the best parts, prices, and customer service in the automotive aftermarket industry. We have a rich culture and history of going the Extra Mile for our customers and our community. At AutoZone you’re not just doing a job; you’re playing a crucial role in creating a better experience for our customers, while creating opportunities to DRIVE YOUR CAREER almost anywhere! We are looking for talented people who are customer focused, enjoy helping others and have the DRIVE to excel in a fast-paced environment!

Position Summary
The Systems Engineer will design data model solutions and ensure alignment between business and IT strategies, operating models, guiding principles, and software development with a focus on the information layer. The Systems Engineer works across business lines and IT domains to ensure that information is viewed as a corporate asset. This includes its proper data definition, creation, usage, archival, and governance. The Systems Engineer works with other engineers and Data Architects to design overall solutions in accordance with industry best practices, principles and standards. The Systems Engineer strives to create and improve the quality of systems, provide more flexible solutions, and reduce time-to-market.

Key Responsibilities
- Enhance and maintain the AutoZone information strategy.
- Ensure alignment of programs and projects with the strategic AZ Information Roadmap and related strategies.
- Perform gap analysis between current data structures and target data structures.
- Enhance and maintain the Enterprise Information Model.
- Work with service architects and application architects to assist with the creation of proper data access and utilization methods.
- Gather complex business requirements and translate product and project needs into data models supporting long-term solutions.
- Serve as a technical data strategy expert and lead the creation of technical requirements and design deliverables.
- Define and communicate data standards, industry best practices, technologies, and architectures.
- Check conformance to standards and resolve any conflicts by explaining and justifying architectural decisions.
- Recommend and evaluate new tools and methodologies as needed.
- Manage, communicate, and improve the data governance framework.

Requirements:
- A systems thinker, able to move fluidly between high-level abstract thinking and detail-oriented implementation, open minded to new ideas, approaches, and technologies.
- A data and fact-driven decision maker, with an ability to make quick decisions under uncertainty when necessary; able to quickly learn new technologies, tools, and organizational structures/strategies.
- Understanding of current industry standard best practices regarding integration, architecture, tools, and processes.
- A self-starter that is naturally inquisitive, requiring only small pieces to the puzzle, across many technologies – new and legacy.
- Excellent written and verbal communication, presentation, and analytical skills, including the ability to effectively communicate complex technical concepts and designs to a broad range of people.

Education and/or Experience
- Bachelor's degree in MIS, Computer Science or similar degree, or equivalent experience required.
- Minimum 3+ years of experience and knowledge of database systems such as Oracle, Postgres, UDB/DB2, BigQuery, Spanner, JSON, and Couchbase.
- Minimum 2 years of experience with data requirements gathering, acquisition of data from different business systems, ingestion of data in GCP using managed services, namely BigQuery, DataFlow, Composer, Pub/Sub and other ingestion technologies, curation of the data using DBT or other similar technologies, and creating data marts/wide tables for analysis and reporting consumption.
- Assembling large, complex sets of data that meet non-functional and functional business requirements.
- Identifying, designing, and implementing internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Building required infrastructure for optimal extraction, transformation and loading of data from various data sources using GCP and SQL technologies.
- Building analytical tools to utilize the data pipeline, providing actionable insight into key business performance metrics including operational efficiency and customer acquisition.
- Working with stakeholders, including the Executive, Product, Data and Design teams, to support their data infrastructure needs while assisting with data-related technical issues.
- Relational & NoSQL database design capability across OLTP & OLAP.
- Excellent analytical and problem-solving skills.
- Excellent verbal and written communication skills.
- Ability to facilitate modeling sessions and communicate appropriately with IT and business customers.
- Experience with Agile software development methodologies.
- Experience with large, replicated databases across distributed and cloud data centers.

Our Values: An AutoZoner Always...
PUTS CUSTOMERS FIRST
CARES ABOUT PEOPLE
STRIVES FOR EXCEPTIONAL PERFORMANCE
ENERGIZES OTHERS
EMBRACES DIVERSITY
HELPS TEAMS SUCCEED
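To illustrate the GCP ingestion and BigQuery work described above, here is a minimal Python sketch that loads a CSV from Cloud Storage into BigQuery and runs a query with the google-cloud-bigquery client. It is an editorial sketch, not AutoZone's pipeline; the project, dataset, table, and bucket names are hypothetical, and it assumes default application credentials are configured.

```python
# Illustrative sketch only: load a CSV from GCS into BigQuery, then query it.
# Project/dataset/table/bucket names are hypothetical; assumes default credentials.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

table_id = "example-project.analytics.store_sales"
load_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the file
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/store_sales.csv", table_id, job_config=load_config
)
load_job.result()  # wait for the load job to finish

query = """
    SELECT store_id, SUM(sales_amount) AS total_sales
    FROM `example-project.analytics.store_sales`
    GROUP BY store_id
    ORDER BY total_sales DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.store_id, row.total_sales)
```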
Posted 1 week ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description
SSIS, ADF, ETL

Requirements
- Bachelor’s degree in computer science or a related field, or equivalent work experience, with 10+ years working with SQL Server and relational databases
- Strong expertise in data engineering, including querying and optimizing complex data sets, and experience with ETL tools such as SSIS and ADF
- Proficiency in Azure cloud technologies and in data visualization and reporting tools (e.g., SSRS, Power BI), with a solid understanding of OLAP structures and relational database design
- Excellent communication skills and the ability to work independently or collaboratively in a team; experience with data warehousing, .NET, Azure, AWS, and agile environments is preferred

Job responsibilities
- Bachelor’s degree in computer science or a related field, or equivalent work experience, with 6 years working with SQL Server and relational databases
- Strong expertise in data engineering, including querying and optimizing complex data sets, and experience with ETL tools such as SSIS and ADF
- Proficiency in Azure cloud technologies and in data visualization and reporting tools (e.g., SSRS, Power BI), with a solid understanding of OLAP structures and relational database design
- Excellent communication skills and the ability to work independently or collaboratively in a team; experience with data warehousing, .NET, Azure, AWS, and agile environments is preferred

What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.
Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.
Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.
Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!
High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
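The posting itself contains no code; purely as an editorial illustration of the load-and-reconcile pattern this kind of SSIS/ADF ETL role describes, here is a minimal, self-contained Python sketch using the standard-library sqlite3 module. The staging and fact tables, column names, and cleansing rule are hypothetical and not taken from the posting; an SSIS data flow or an ADF mapping data flow would express the same steps declaratively.

```python
import sqlite3

# Hypothetical staging and warehouse tables; names and columns are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount TEXT, order_date TEXT);
    CREATE TABLE fact_orders (order_id INTEGER PRIMARY KEY, amount REAL, order_date TEXT);
    INSERT INTO stg_orders VALUES
        (1, ' 120.50', '2024-01-03'),
        (2, '99',      '2024-01-04'),
        (1, ' 120.50', '2024-01-03');   -- duplicate staged row
""")

# Transform: trim/cast the amount and de-duplicate on the business key before loading,
# the kind of cleansing step an ETL data flow typically performs.
conn.execute("""
    INSERT INTO fact_orders (order_id, amount, order_date)
    SELECT order_id, CAST(TRIM(amount) AS REAL), order_date
    FROM stg_orders
    GROUP BY order_id, TRIM(amount), order_date
""")

# Post-load validation: distinct source keys should reconcile with loaded rows.
src = conn.execute("SELECT COUNT(DISTINCT order_id) FROM stg_orders").fetchone()[0]
tgt = conn.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]
assert src == tgt, f"Row count mismatch: {src} staged vs {tgt} loaded"
print(f"Loaded {tgt} rows; reconciliation passed.")
```

The final assertion mirrors the row-count reconciliation check that is commonly wired into a pipeline's post-load validation step.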
Posted 1 week ago
7.0 - 12.0 years
25 - 35 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role: Data Modeler / Senior Data Modeler
Experience: 5 to 12 years
Locations: Hyderabad, Pune, Bengaluru
Position: Permanent

Must-have skills:
- Strong SQL
- Strong data warehousing skills
- ER/relational/dimensional data modeling
- Data Vault modeling
- OLAP, OLTP
- Schemas and data marts

Good-to-have skills:
- Data Vault
- ERwin / ER Studio
- Cloud platforms (AWS or Azure)
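As a hedged illustration of the dimensional (star-schema) modeling this role asks for, the following Python sketch creates a hypothetical fact table with two dimension tables in an in-memory SQLite database and runs a typical roll-up query. Every table and column name is invented for the example.

```python
import sqlite3

# A minimal star schema: one fact table keyed to two dimensions (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key     INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240103
        full_date    TEXT NOT NULL,
        month        INTEGER NOT NULL,
        year         INTEGER NOT NULL
    );
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,   -- surrogate key
        customer_id  TEXT NOT NULL,         -- natural/business key
        segment      TEXT
    );
    CREATE TABLE fact_sales (
        date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
        customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        quantity     INTEGER NOT NULL,
        net_amount   REAL NOT NULL
    );
    INSERT INTO dim_date VALUES (20240103, '2024-01-03', 1, 2024);
    INSERT INTO dim_customer VALUES (1, 'C-1001', 'Retail');
    INSERT INTO fact_sales VALUES (20240103, 1, 2, 240.0);
""")

# Typical OLAP-style query: aggregate the fact table across dimension attributes.
rollup = """
    SELECT d.year, d.month, SUM(f.net_amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year, d.month
    ORDER BY d.year, d.month
"""
print(conn.execute(rollup).fetchall())   # [(2024, 1, 240.0)]
```

Keeping surrogate keys on the dimensions and only foreign keys plus measures on the fact table is the design choice that makes such roll-ups cheap to express and optimize.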
Posted 1 week ago
2.0 - 5.0 years
12 - 13 Lacs
Noida
Work from Office
In India, we operate as Next Gear India Private Limited, a wholly owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's Product Development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights.

QA Automation Engineer
As a QA Automation Engineer specializing in data warehousing, you will play a critical role in ensuring that our data solutions are of the highest quality. You will work closely with data engineers and analysts to develop, implement, and maintain automated testing frameworks for data validation, ETL processes, data quality, and integration. Your work will ensure that data is accurate, consistent, and performs optimally across our data warehouse systems.

Responsibilities
- Develop and implement automation frameworks: Design, build, and maintain scalable test automation frameworks tailored for data warehousing environments.
- Test strategy and execution: Define and execute automated test strategies for ETL processes, data pipelines, and database integration across a variety of data sources.
- Data validation: Implement automated tests to validate data consistency, accuracy, completeness, and transformation logic.
- Performance testing: Ensure that the data warehouse systems meet performance benchmarks through automation tools and load-testing strategies.
- Collaborate with teams: Work closely with data engineers, software developers, and data analysts to understand business requirements and design tests accordingly.
- Continuous integration: Integrate automated tests into CI/CD pipelines, ensuring that testing is part of the deployment process.
- Defect tracking and reporting: Use defect-tracking tools (e.g., JIRA) to log and track issues found during automated testing, ensuring that defects are resolved in a timely manner.
- Test data management: Develop strategies for handling large volumes of test data while maintaining data security and privacy.
- Tool and technology evaluation: Stay current with emerging trends in automation testing for data warehousing and recommend tools, frameworks, and best practices.

Job Qualifications: Requirements and Skills
- 4+ years of experience
- Solid understanding of data warehousing concepts (ETL, OLAP, data marts, data vault, star/snowflake schemas, etc.)
- Proven experience in building and maintaining automation frameworks using tools like Python, Java, or similar, with a focus on database and ETL testing
- Strong knowledge of SQL for writing complex queries to validate data, test data pipelines, and check transformations
- Experience with ETL tools (e.g., Matillion, Qlik Replicate) and their testing processes
- Performance testing experience
- Experience with version control systems like Git
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues
- Strong communication and collaboration skills
- Attention to detail and a passion for delivering high-quality solutions
- Ability to work in a fast-paced environment and manage multiple priorities
- Enthusiasm for learning new technologies and frameworks

Experience with the following tools and technologies is desired: Qlik Replicate, Matillion ETL, Snowflake, Data Vault warehouse design, Power BI, and Azure Cloud (including Logic Apps, Azure Functions, and ADF).
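To make the data-validation responsibilities above concrete, here is a small pytest-style sketch of the kind of automated check the posting describes. The source/target tables and the lower-casing transformation rule are hypothetical, and an in-memory SQLite database stands in for the warehouse; run it with pytest.

```python
"""Illustrative only: pytest-style data-validation tests against hypothetical tables."""
import sqlite3
import pytest


@pytest.fixture
def warehouse():
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_customers (id INTEGER, email TEXT);
        CREATE TABLE tgt_customers (id INTEGER, email TEXT);
        INSERT INTO src_customers VALUES (1, 'A@X.COM'), (2, 'b@y.com');
        -- The target is expected to hold lower-cased emails.
        INSERT INTO tgt_customers VALUES (1, 'a@x.com'), (2, 'b@y.com');
    """)
    yield conn
    conn.close()


def test_row_counts_match(warehouse):
    # Completeness check: nothing dropped or duplicated between source and target.
    src = warehouse.execute("SELECT COUNT(*) FROM src_customers").fetchone()[0]
    tgt = warehouse.execute("SELECT COUNT(*) FROM tgt_customers").fetchone()[0]
    assert src == tgt


def test_transformation_rule_applied(warehouse):
    # Transformation check: every target email equals the lower-cased source email.
    mismatches = warehouse.execute("""
        SELECT COUNT(*)
        FROM src_customers s
        JOIN tgt_customers t ON t.id = s.id
        WHERE t.email <> LOWER(s.email)
    """).fetchone()[0]
    assert mismatches == 0
```

Tests of this shape slot directly into a CI/CD pipeline, which is how the continuous-integration responsibility above is usually met.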
Posted 1 week ago
1.0 - 3.0 years
3 - 6 Lacs
Chennai
Work from Office
Skill set required: GCP, data modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, BigQuery

Data Modeller
- Hands-on data modelling for OLTP and OLAP systems
- In-depth knowledge of conceptual, logical, and physical data modelling
- Strong understanding of indexing, partitioning, and data sharding, with practical experience applying each
- Strong understanding of the variables that impact database performance for near-real-time reporting and application interaction
- Working experience with at least one data modelling tool, preferably DBSchema
- Functional knowledge of the mutual fund industry is a plus
- Good understanding of GCP databases such as AlloyDB, CloudSQL, and BigQuery
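As an illustrative aside on the indexing and query-performance points above, the short Python sketch below uses SQLite's EXPLAIN QUERY PLAN to show a full table scan turning into an index search once a covering index exists. The transactions table and folio_no column are invented for the example (SQLite stands in here; the same access-path reasoning applies to CloudSQL or AlloyDB).

```python
import sqlite3

# Hypothetical OLTP table with a lookup-heavy access pattern.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (txn_id INTEGER, folio_no TEXT, amount REAL)")

query = "SELECT amount FROM transactions WHERE folio_no = 'F001'"

# Without an index, the planner must scan the whole table.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# A covering index on the filter column turns the scan into an index search.
conn.execute("CREATE INDEX ix_transactions_folio ON transactions (folio_no, amount)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())
```

The second plan line reports a search using the covering index instead of a scan, which is the behaviour an indexing or partitioning decision is meant to buy for near-real-time reporting.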
Posted 1 week ago