6.0 - 11.0 years
25 - 40 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities:
- 10+ years of experience in the data space.
- Solid SQL knowledge; able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server).
- Hands-on experience with OLAP and OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling, plus exposure to any ETL tool, data governance, and data quality.
- An eye for analyzing data and comfort working in an agile methodology.
- Working knowledge of at least one cloud service (Azure, AWS, or GCP) is preferred.
- Enthusiasm for coaching team members, collaborating with stakeholders across the organization, and taking complete ownership of deliverables.
- Experience contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Good communication skills and experience leading a team.
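Several of these postings ask for a working grasp of Star schema design and OLAP-style querying. As a hedged illustration only (not taken from any client's actual models; all table and column names are hypothetical), a minimal star schema and a typical rollup query can be sketched with Python's sqlite3:

```python
import sqlite3

# Hypothetical minimal star schema: one fact table surrounded by
# denormalized dimension tables, each joined on a surrogate key.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales  (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        units       INTEGER,
        revenue     REAL
    );
""")

cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, "2024-01-01", 2024), (20240102, "2024-01-02", 2024)])
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(20240101, 1, 10, 100.0), (20240102, 1, 5, 50.0), (20240102, 2, 3, 90.0)])

# A typical OLAP-style rollup: aggregate the fact table by a dimension attribute.
rows = cur.execute("""
    SELECT p.name, SUM(f.units) AS total_units, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
print(rows)  # [('Gadget', 3, 90.0), ('Widget', 15, 150.0)]
```

A Snowflake schema differs only in that the dimensions themselves are further normalized (e.g. `dim_product` referencing a separate `dim_category`).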
Posted 3 days ago
7.0 - 11.0 years
5 - 15 Lacs
Bengaluru
Hybrid
Role & Responsibilities:
Data Modeler with strong conceptual, logical, and physical data modeling skills, with experience in requirements gathering, creating data mapping documents, writing functional specifications, and writing queries for Data Warehouse, Data Lake, Lakehouse, Data Mart, OLTP, and OLAP applications.
- Extensive experience in Entity-Relationship, Data Vault, and dimensional data modeling, creating ER diagrams using the Erwin, ER/Studio, Enterprise Architect, and PowerDesigner data modeling tools.
- Integral part of the Data Management team, performing data extraction, data analysis, data cleansing, data mapping, and data profiling exercises using Informatica Analyst and other profiling tools.
- Experience with industry-standard models such as ACORD for insurance; involved in database design for an Operational Data Store, working with DBAs and ETL architects using the Data Vault modeling approach.
- Design, develop, and implement highly scalable data capture and transformation processes.
- Suggest the best modeling approach to the client based on their requirements and target architecture.
- Analyze and understand the datasets and guide the team in creating source-to-target mappings and data dictionaries, capturing all relevant details.
- Profile the datasets to generate relevant insights.
- Define data modelling best practices and ensure they are implemented across projects.
- Optimize the data models and work with the data engineers to define ingestion logic, ingestion frequency, and data consumption patterns.
- Work with the QE team to define the right testing strategy.
- Drive automation in modeling activities.
- Collaborate with business stakeholders, data owners, business analysts, and architects to design and develop a next-generation data platform.
- Closely monitor project progress and provide regular updates to leadership on milestones, impediments, etc.
- Create effective ETLs/ELTs to move large volumes of data from various operational systems into dimensional data models for analytics consumption.
- Act as principal designer and reviewer for new data models; make data architectural decisions.
- Work with database engineering and DBAs to create optimal physical data models (transactional, normalized, and dimensional models, etc.).
- Manage the data modeling repository and data modeling process, and support model-driven development.
- Define and govern data modeling/design standards, tools, best practices, and related development methodologies.
- Champion data lineage, metadata management, and data quality analysis processes.
- Collaborate with business data analysts to design and model business intelligence semantic layers for optimized information access.
- Expand and grow existing data platform capabilities to solve new data problems and challenges.
- Ensure all automated processes preserve data integrity by managing the alignment of data availability and integration processes.
- Identify opportunities for new data acquisition and new uses for existing data resources.
- Research and make recommendations for new data management technologies and software engineering practices; collaborate on decisions around the use of new tools and practices.
- Define data retention policy, establish data governance best practices, and create automated anomaly detection services.
- Document and update business continuity and disaster recovery procedures.
- Engage in ongoing collaboration with data architects, modelers, and other team members to achieve common goals.
- Provide guidance to development teams on best practices and design patterns for analytics solutions; coach and guide junior team members.
- Produce and maintain support documentation for ongoing operations.
- Act as support to troubleshoot and resolve technical issues with production data models and services.
- Excellent communication, influencing, and facilitation skills, in particular for problem-solving and troubleshooting activities.
- Experience in client-facing roles; assertiveness in dealing with people at all levels within and outside the delivery organization, including partner and client organizations.

Required Skills:
- 10+ years of experience with hybrid data environments that leverage both distributed and relational database technologies to support analytics services (Oracle, IBM DB2, GCP).
- Solid understanding of data warehousing principles and architecture, and their implementation in complex environments.
- Good experience with OLTP and OLAP systems; excellent data analysis skills.
- Good understanding of one or more ETL tools and data ingestion frameworks.
- Experience designing complex dimensional data models for analytics services.
- Experience with various testing methodologies and user acceptance testing.
- Experience on one or more cloud platforms (e.g., AWS, Azure, GCP).
- Understanding of data quality, data governance, and industry data models.
- Experience leading large teams and processing large datasets from multiple sources.
- Ability to operate effectively and independently in a dynamic, fluid environment; good understanding of agile methodology.
- Strong verbal and written communication skills, with experience relating complex concepts to non-technical users; demonstrated ability to exchange ideas and convey complex information clearly and concisely.
- Proven ability to lead and drive projects and assignments to completion.
- Exposure to data modeling tools: ERwin, PowerDesigner, ER/Studio, Enterprise Architect, MagicDraw, and business glossary tooling; data modeller with any cloud.
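For readers unfamiliar with the Data Vault modeling this posting calls for, here is a deliberately minimal, hypothetical sketch of two of its core constructs (a hub holding business keys and a satellite holding historized descriptive attributes), expressed with Python's sqlite3; real vaults also include link tables and hashed keys, omitted here for brevity:

```python
import sqlite3

# Hypothetical Data Vault fragment: hub = business keys, satellite =
# descriptive attributes kept as history rather than overwritten.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
    -- Hub: one row per unique business key.
    CREATE TABLE hub_customer (
        customer_hk   TEXT PRIMARY KEY,  -- surrogate/hash key
        customer_id   TEXT UNIQUE,       -- business key
        load_date     TEXT,
        record_source TEXT
    );
    -- Satellite: descriptive attributes, historized by load_date.
    CREATE TABLE sat_customer_details (
        customer_hk TEXT REFERENCES hub_customer(customer_hk),
        load_date   TEXT,
        name        TEXT,
        city        TEXT,
        PRIMARY KEY (customer_hk, load_date)
    );
""")

cur.execute("INSERT INTO hub_customer VALUES ('h1', 'CUST-001', '2024-01-01', 'crm')")
# The customer's city changed; the vault keeps both versions.
cur.executemany("INSERT INTO sat_customer_details VALUES (?, ?, ?, ?)",
                [("h1", "2024-01-01", "Asha", "Chennai"),
                 ("h1", "2024-02-01", "Asha", "Bengaluru")])

# "Current" view: the latest satellite row per hub key.
current = cur.execute("""
    SELECT h.customer_id, s.name, s.city
    FROM hub_customer h
    JOIN sat_customer_details s ON s.customer_hk = h.customer_hk
    WHERE s.load_date = (SELECT MAX(load_date) FROM sat_customer_details
                         WHERE customer_hk = h.customer_hk)
""").fetchone()
print(current)  # ('CUST-001', 'Asha', 'Bengaluru')
```

The design choice the pattern encodes: inserts only, never updates, so the model absorbs source changes without losing auditability.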
Posted 4 days ago
6.0 - 11.0 years
20 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Are you ready to make a difference in the data space? Looking for immediate joiners - only candidates available to join in September 2025 are eligible to apply.

Job Title: Data Modeller & Architect
Location: Bengaluru, Chennai, Hyderabad

What do we expect?
- 6-12 years of experience in data modelling.
- Solid SQL knowledge; able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server).
- Hands-on experience with OLAP and OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling, plus exposure to any ETL tool, data governance, and data quality.
- An eye for analyzing data and comfort working in an agile methodology.
- Working knowledge of at least one cloud service (Azure, AWS, or GCP) is preferred.
- Enthusiasm for coaching team members, collaborating with stakeholders across the organization, and taking complete ownership of deliverables.
- Experience contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Good communication skills and experience leading a team.

Contact Amirtha (HR - Aram Hiring) - WhatsApp your resume to 8122080023 / amirtha@aramhiring.com

Who is our client: Our client is a global leader in AI and analytics, helping Fortune 1000 companies solve their toughest challenges. They offer full-stack AI and analytics services and solutions to empower businesses to achieve real outcomes and value at scale. They are on a mission to push the boundaries of what AI and analytics can do to help enterprises navigate uncertainty and move forward decisively. Their purpose is to provide certainty to shape a better tomorrow. Our client's 4000+ technologists and consultants are based in the US, Canada, the UK, India, Singapore, and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Many of their team leaders rank in Top 10 and 40 Under 40 lists, exemplifying their dedication to innovation and excellence. The client is Great Place to Work-Certified (2022-24), recognized by analyst firms such as Forrester, Gartner, HFS, Everest, ISG, and others, and has been ranked among the best and fastest-growing analytics firms by Inc., the Financial Times, the Economic Times, and Analytics India Magazine.

Curious about the role? What would your typical day look like? As an Engineer and Architect, you will work to solve some of the most complex and captivating data management problems that would enable the client to be a data-driven organization, seamlessly switching between the roles of individual contributor, team member, and Data Modeling Architect as each project demands to define, design, and deliver actionable insights.

On a typical day, you might:
- Engage with clients and understand the business requirements to translate them into data models.
- Analyze customer problems, propose solutions from a data-structure perspective, and estimate and deliver the proposed solutions.
- Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM), applying best practices to provide business insights.
- Use a data modelling tool to create appropriate data models.
- Create and maintain the source-to-target data mapping document, documenting all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries.
- Ideate, design, and guide the teams in building automations and accelerators.
- Maintain data models, capture data models from existing databases, and record descriptive information.
- Contribute to building data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Collaborate with data engineers to design and develop data extraction and integration code modules.
- Partner with data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
- Ideate on the design and development of the next-generation data platform by collaborating with cross-functional stakeholders.
- Work with the client to define, establish, and implement the right modelling approach per the requirements.
- Help define standards and best practices.
- Monitor project progress to keep leadership informed on milestones, impediments, etc.
- Coach team members and review code artifacts.
- Contribute to proposals and RFPs.

External Skills and Expertise: You are important to us; let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry. Kindly share your resume with amirtha@aramhiring.com / 8122080023
Posted 6 days ago
12.0 - 16.0 years
35 - 45 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role - Data Architect - Data Modeling
Exp - 12-16 Yrs
Locs - Chennai, Hyderabad, Bengaluru, Delhi, Pune
Position - Permanent FTE
Client - Data Analytics Global Leader
Must have skills:
- Strong SQL
- Strong Data Warehousing skills
- ER/Relational/Dimensional Data Modeling
- Data Vault Modeling
- OLAP, OLTP
- Schemas & Data Marts
Good to have skills:
- Data Vault
- ERwin / ER Studio
- Cloud Platforms (AWS or Azure)
Posted 6 days ago
18.0 - 23.0 years
35 - 40 Lacs
Bengaluru
Hybrid
Director - Engineering - CPaaS (Open Source Contributor, Database Internals)

About the Team
At Cloud Platform as a Service, our core mission is to architect, develop, and maintain a robust, enterprise-grade data and compute platform. This platform is meticulously crafted using cutting-edge open-source technologies, providing the foundational infrastructure upon which numerous product engineering teams at Nutanix build and deliver their exceptional solutions to our valued customers.

Your Role
- Lead & Grow: Hire, mentor, and develop a high-performing engineering organization of 30+ engineers, including first-line managers and staff engineers.
- Drive Delivery: Collaborate across geographies to plan and execute end-to-end delivery of critical platform projects.
- Architect & Innovate: Partner with product and architecture teams to define technical vision, influence product strategy, and ensure robust, scalable designs in line with best practices.
- Cross-Functional Influence: Work closely with Dev and QA teams to align on priorities and drive shared goals.
- Community & Open Source: Engage with open-source communities and vendors, leveraging existing projects, contributing enhancements, and steering integrations to meet Nutanix's objectives.

What You Will Bring
- Systems & Architecture: Deep understanding of OS internals, networking, containers (Docker, Kubernetes, service mesh), and distributed systems; strong experience with Linux.
- Languages: Hands-on experience at scale in one of Go, C/C++, Java, or Python.
- Database & Messaging: Knowledge of distributed OLTP/OLAP databases and queuing/caching systems (e.g., NATS, PostgreSQL, ClickHouse, Redis, Cassandra).
- Open Source: Experience contributing to or maintaining open-source projects. Demonstrated understanding of open-source distributed databases and streaming systems, including the trade-offs involved in developing clustered, high-performance, fault-tolerant system software.

Qualifications
- BS/MS or PhD in Computer Science, Engineering, or equivalent.
- 18+ years of experience, including 7+ years leading and scaling engineering teams, with a track record of hiring, coaching, and performance management.
- Proven hands-on technical management.
- Experience working in a high-growth multinational company environment.

Work Arrangement
Hybrid: This role operates in a hybrid capacity, blending the benefits of remote work with the advantages of in-person collaboration. For most roles, that will mean coming into an office a minimum of 3 days per week; however, certain roles and/or teams may require more frequent in-office presence. Additional team-specific guidance and norms will be provided by your manager.
Posted 6 days ago
6.0 - 10.0 years
25 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role - Data Modeler / Senior Data Modeler
Exp - 6 to 9 Yrs
Locs - Chennai, Hyderabad, Bengaluru, Delhi, Pune
Position - Permanent
Must have skills:
- Strong SQL
- Strong Data Warehousing skills
- ER/Relational/Dimensional Data Modeling
- Data Vault Modeling
- OLAP, OLTP
- Schemas & Data Marts
Good to have skills:
- Data Vault
- ERwin / ER Studio
- Cloud Platforms (AWS or Azure)
Posted 6 days ago
5.0 - 10.0 years
16 - 31 Lacs
Gurugram, Bengaluru
Hybrid
Role: Data Modeller
Experience: 5-12 Years
Location: Gurugram/Bangalore
Notice Period: Immediate to 45 Days

Your scope of work / key responsibilities:
- Build and maintain standards-based data models to report disparate data sets in a reliable, consistent, and interpretable manner.
- Gather, distil, and harmonize data requirements, and design coherent conceptual, logical, and physical data models and the associated physical feed formats to support these data flows.
- Articulate business requirements and build source-to-target mappings with complex ETL transformations.
- Write complex SQL statements and profile source data to validate data transformations.
- Contribute to requirement analysis and database design - transactional and dimensional data modelling.
- Normalize/denormalize data structures, introducing hierarchies and inheritance wherever required in existing or new data models.
- Develop and implement data warehouse projects independently.
- Work with data consumers and data suppliers to understand detailed requirements and propose standardized data models.
- Contribute to improving the Data Management team's data models.
- Be an influencer: present and facilitate discussions to understand business requirements, and develop dimensional data models based on these capabilities and industry best practices.
- Understanding of the insurance domain.
- Basic understanding of the AWS cloud.
- Good understanding of Master Data Management, data quality, and data governance.
- Basic understanding of data visualization tools like SAS VA and Tableau.
- Good understanding of implementing and architecting data solutions using Informatica and SQL Server/Oracle.

Key qualifications and experience:
- Extensive practical experience in information technology and software development projects, with at least 8 years of experience designing operational data stores and data warehouses.
- Extensive experience in data modelling tools such as Erwin or SAP PowerDesigner.
- Strong understanding of ETL and data warehouse concepts, processes, and best practices.
- Proficient in data modelling, including conceptual, logical, and physical data modelling for both OLTP and OLAP.
- Ability to write complex SQL for data transformations and data profiling in source and target systems.
- Basic understanding of SQL vs. NoSQL databases.
- A combination of solid business knowledge and technical expertise, with strong communication skills.
- Excellent analytical and logical thinking.
- Good verbal and written communication skills, and the ability to work independently as well as in a team environment, providing structure in ambiguous situations.

Interested candidates can share their resume at divya@beanhr.com
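This posting twice asks for complex SQL to profile source data before writing transformation rules. As a minimal, hedged sketch of what such a profiling query computes (the source table `src_policy` and its columns are hypothetical; Python's sqlite3 is used only to make the example self-contained):

```python
import sqlite3

# Hypothetical raw source extract with the defects profiling is meant
# to surface: a missing name, missing premiums, and a duplicated key.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_policy (policy_id TEXT, holder_name TEXT, premium REAL)")
cur.executemany("INSERT INTO src_policy VALUES (?, ?, ?)",
                [("P1", "Asha", 1200.0),
                 ("P2", None,   800.0),
                 ("P3", "Ravi", None),
                 ("P3", "Ravi", None)])   # duplicate business key

# One pass over the table: row count, key cardinality, null counts.
profile = cur.execute("""
    SELECT COUNT(*)                  AS row_count,
           COUNT(DISTINCT policy_id) AS distinct_policy_ids,
           SUM(CASE WHEN holder_name IS NULL THEN 1 ELSE 0 END) AS null_names,
           SUM(CASE WHEN premium     IS NULL THEN 1 ELSE 0 END) AS null_premiums
    FROM src_policy
""").fetchone()
print(profile)  # (4, 3, 1, 2)
```

Here `row_count > distinct_policy_ids` flags the duplicate key, and the null counts tell the modeler which target columns can safely be declared NOT NULL.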
Posted 1 week ago
7.0 - 11.0 years
13 - 18 Lacs
Chennai
Work from Office
About The Role
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must have skills: Microsoft Azure Analytics Services
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly and efficiently throughout the organization, while also addressing any challenges that arise in the data management process. Your role will be pivotal in shaping the data landscape of the organization, enabling informed decision-making and strategic planning.

Roles & Responsibilities:
A. Function as the Lead Data Architect for a small, simple project/proposal, or as a team lead for a medium/large project or proposal
B. Discuss specific Big Data architecture and related issues with the client architect/team (in area of expertise)
C. Analyze and assess the impact of the requirements on the data and its lifecycle
D. Lead Big Data architecture and design of medium-to-large cloud-based Big Data and analytical solutions using Lambda architecture
E. Breadth of experience in various client scenarios and situations
F. Experienced in Big Data architecture-based sales and delivery
G. Thought leadership and innovation
H. Lead creation of new data assets & offerings
I. Experience in handling OLTP and OLAP data workloads

Professional & Technical Skills:
A. Strong experience in Azure is preferred, with hands-on experience in two or more of: Azure Synapse Analytics, Azure HDInsight, Azure Databricks with PySpark / Scala / SparkSQL, Azure Analysis Services
B. Experience in one or more real-time/streaming technologies, including Azure Stream Analytics, Azure Data Explorer, Azure Time Series Insights, etc.
C. Experience in handling medium to large Big Data implementations
D. Candidate must have 15 years of IT experience, including around 5 years of extensive Big Data experience (design + build)

Additional Information:
A. Should be able to drive technology design meetings and propose technology design and architecture
B. Should have excellent client communication skills
C. Should have good analytical and problem-solving skills

Qualification: 15 years full time education
Posted 1 week ago
2.0 - 7.0 years
30 - 35 Lacs
Chennai
Work from Office
Position Summary... What you'll do...

About Team: Walmart's Enterprise Business Services (EBS) is a powerhouse of several exceptional teams delivering world-class technology solutions and services, making a profound impact at every level of Walmart. As a key part of Walmart Global Tech, our teams set the bar for operational excellence and leverage emerging technology to support millions of customers, associates, and stakeholders worldwide. Each time an associate turns on their laptop, a customer makes a purchase, a new supplier is onboarded, the company closes the books, physical and legal risk is avoided, and our associates are paid consistently and accurately - that is EBS. Joining EBS means embarking on a journey of limitless growth, relentless innovation, and the chance to set new industry standards that shape the future of Walmart.

What you'll do:
- Guide the team in architectural decisions and best practices for building scalable applications.
- Drive design, development, implementation, and documentation.
- Build, test, and deploy cutting-edge solutions at scale, impacting Walmart associates worldwide.
- Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
- Engage with Product Management and Business to drive the agenda, set your priorities, and deliver awesome products.
- Drive the success of the implementation by applying technical skills to design and build enhanced processes and technical solutions in support of strategic initiatives.
- Work closely with the architects and cross-functional teams, following established practices to deliver solutions meeting QCD (Quality, Cost & Delivery) within the established architectural guidelines.
- Work with senior leadership to chart out the future roadmap of the products.
- Participate in hiring and building teams, enabling them to be high-performing agile teams.
- Interact closely with business owners and technical teams, both within India and across the globe, on requirements.

What you'll bring:
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field, with a minimum of 10+ years of experience in software design, development, and automated deployments.
- Hands-on experience building Java-based backend systems and experience working on cloud-based solutions is a must.
- Proficiency in Java, Spring Boot, Kafka, and Spark.
- Prior experience delivering highly scalable, large-scale data-processing Java applications.
- Strong high- and low-level system design skills; experience designing data-intensive applications on an open stack.
- A good understanding of CS fundamentals, microservices, data structures, algorithms, and problem solving.
- Experience with CI/CD development environments/tools including, but not limited to, Git, Maven, and Jenkins.
- Strong in writing modular and testable code and test cases (unit, functional, and integration) using frameworks like JUnit, Mockito, and MockMvc.
- Experience with microservices architecture and a good understanding of distributed concepts, common design principles, design patterns, and cloud-native development concepts.
- Hands-on experience with Spring Boot, concurrency, garbage collection, RESTful services, data caching services, and ORM tools.
- Experience working with relational databases and writing complex OLAP, OLTP, and SQL queries.
- Ability to propose multiple alternatives for development frameworks, libraries, and tools.
- Experience with NoSQL databases like Cosmos DB.
- Experience with caching technology like Redis, Memcached, or other related systems.
- Experience with event-based systems like Kafka.
- Experience using monitoring and alerting tools like Prometheus, Splunk, and other related systems; excellent debugging and troubleshooting skills.
- Exposure to containerization tools like Docker, Helm, and Kubernetes.
- Knowledge of public cloud platforms like Azure, GCP, etc. will be an added advantage.
- An understanding of mainframe databases will be an added advantage.

About Walmart Global Tech: Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts, and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions, and reimagine the future of retail.

Flexible, hybrid work: We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits: Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging: We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers, and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer: Walmart, Inc., is an Equal Opportunities Employer - By Choice. We believe we are best equipped to help our associates, customers, and the communities we serve live better when we really know them. That means understanding, respecting, and valuing unique styles, experiences, identities, ideas, and opinions, while being inclusive of all people.

Minimum Qualifications: Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.
- Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or a related area, and 4 years' experience in software engineering or a related area.
- Option 2: 6 years' experience in software engineering or a related area.

Preferred Qualifications: Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.
- Master's degree in Computer Science, Computer Engineering, Computer Information Systems, Software Engineering, or a related area, and 2 years' experience in software engineering or a related area.

Primary Location: RMZ Millenia Business Park, No 143, Campus 1B (1st-6th Floor), Dr. MGR Road (North Veeranam Salai), Perungudi, India
Posted 1 week ago
15.0 - 25.0 years
13 - 18 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must have skills: Microsoft Azure Analytics Services
Good to have skills: NA
Minimum 15 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly and efficiently throughout the organization, while also addressing any challenges that arise in the data management process. Your role will be pivotal in shaping the data landscape of the organization, enabling informed decision-making and strategic planning.

Roles & Responsibilities:
A. Function as the Lead Data Architect for a small, simple project/proposal, or as a team lead for a medium/large project or proposal
B. Discuss specific Big Data architecture and related issues with the client architect/team (in area of expertise)
C. Analyze and assess the impact of the requirements on the data and its lifecycle
D. Lead Big Data architecture and design of medium-to-large cloud-based Big Data and analytical solutions using Lambda architecture
E. Breadth of experience in various client scenarios and situations
F. Experienced in Big Data architecture-based sales and delivery
G. Thought leadership and innovation
H. Lead creation of new data assets & offerings
I. Experience in handling OLTP and OLAP data workloads

Professional & Technical Skills:
A. Strong experience in Azure is preferred, with hands-on experience in two or more of: Azure Synapse Analytics, Azure HDInsight, Azure Databricks with PySpark / Scala / SparkSQL, Azure Analysis Services
B. Experience in one or more real-time/streaming technologies, including Azure Stream Analytics, Azure Data Explorer, Azure Time Series Insights, etc.
C. Experience in handling medium to large Big Data implementations
D. Candidate must have 15 years of IT experience, including around 5 years of extensive Big Data experience (design + build)

Additional Information:
A. Should be able to drive technology design meetings and propose technology design and architecture
B. Should have excellent client communication skills
C. Should have good analytical and problem-solving skills

Qualification: 15 years full time education
Posted 1 week ago
15.0 - 20.0 years
13 - 18 Lacs
Hyderabad
Work from Office
About The Role Project Role : Data Architect Project Role Description : Define the data requirements and structure for the application. Model and design the application data structure, storage and integration. Must have skills : Microsoft Azure Analytics Services Good to have skills : NAMinimum 12 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly and efficiently throughout the organization, while also addressing any challenges that arise in the data management process. Your role will be pivotal in shaping the data landscape of the organization, enabling informed decision-making and strategic planning. 
Roles & Responsibilities:
A. Function as the Lead Data Architect for a small, simple project/proposal, or as a team lead for a medium/large-sized project or proposal
B. Discuss specific Big Data architecture and related issues with the client architect/team (in area of expertise)
C. Analyze and assess the impact of the requirements on the data and its lifecycle
D. Lead Big Data architecture and design medium-to-large cloud-based Big Data and analytical solutions using Lambda architecture
E. Breadth of experience in various client scenarios and situations
F. Experienced in Big Data architecture-based sales and delivery
G. Thought leadership and innovation
H. Lead creation of new data assets & offerings
I. Experience in handling OLTP and OLAP data workloads

Professional & Technical Skills:
A. Strong experience in Azure is preferred, with hands-on experience in two or more of these skills: Azure Synapse Analytics, Azure HDInsight, Azure Databricks with PySpark / Scala / SparkSQL, Azure Analysis Services
B. Experience in one or more real-time/streaming technologies, including Azure Stream Analytics, Azure Data Explorer, Azure Time Series Insights, etc.
C. Experience in handling medium to large Big Data implementations
D. Candidate must have 15 years of IT experience, including around 5 years of extensive Big Data experience (design + build)

Additional Information:
A. Should be able to drive technology design meetings and propose technology design and architecture
B. Should have excellent client communication skills
C. Should have good analytical and problem-solving skills

Qualification: 15 years full time education
Posted 1 week ago
6.0 - 16.0 years
0 Lacs
karnataka
On-site
You are a talented and experienced Senior Data Modeler with a minimum of 6+ years of experience in data modeling. In this role, you will be responsible for designing, implementing, and maintaining data models that support business requirements, ensuring high data quality, performance, and scalability. Collaboration with cross-functional teams, including data analysts, architects, and business stakeholders, is essential to align data models with business needs and drive efficient data management. Your key responsibilities will include designing, implementing, and maintaining data models, collaborating with various teams to ensure alignment with business requirements, leveraging expertise in Azure, Databricks, and data warehousing for data solutions, managing and optimizing relational and NoSQL databases, contributing to ETL processes and data integration pipelines, applying data modeling principles and techniques, staying up-to-date with industry trends, and driving best practices and standards for data modeling within the organization. To excel in this role, you need expertise in Azure and Databricks, proficiency in data modeling tools such as ER/Studio and Hackolade, a strong understanding of data modeling principles and techniques, experience with relational and NoSQL databases, familiarity with data warehousing, ETL processes, and data integration, and knowledge of big data technologies like Hadoop and Spark. Industry knowledge in supply chain is preferred but not mandatory. You should possess excellent analytical and problem-solving skills, strong communication skills to interact with technical and non-technical stakeholders, and the ability to work effectively in a collaborative, fast-paced environment. Your educational background should include a B.Tech in any branch or specialization. 
In summary, as a Senior Data Modeler, you will play a crucial role in designing and maintaining data models that drive efficient data management and align with business requirements, leveraging your expertise in data modeling tools and technologies to enhance data quality, performance, and scalability.
Posted 1 week ago
1.0 - 5.0 years
5 - 6 Lacs
pune
Work from Office
About Atos
Atos is a global leader in digital transformation with c. 78,000 employees and annual revenue of c. 10 billion. European number one in cybersecurity, cloud and high-performance computing, the Group provides tailored end-to-end solutions for all industries in 68 countries. A pioneer in decarbonization services and products, Atos is committed to a secure and decarbonized digital for its clients. Atos is a SE (Societas Europaea) and listed on Euronext Paris. The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers and employees, and members of societies at large to live, work and develop sustainably, in a safe and secure information space.

Key Responsibilities:
- Produce high-quality software deliverables with limited supervision
- Analyse, design, code, unit-test, document and implement application releases to Live as part of the development team
- Perform unit tests on volume datasets to evaluate performance and produce optimised solutions
- Ensure timely delivery while complying with the organisation's standards and given low-level designs
- Proactively notify the development manager of risks, bottlenecks, problems, issues, and concerns
- Work closely with Quality Analysts to deliver a quality solution as per the agreed acceptance criteria and timeline
- Assist junior developers in overcoming technical bottlenecks and perform peer-reviews
- Work in a fast-paced dynamic team following Agile methodology

Essential Experience:
- Proven track record of full life cycle development of large applications using an Oracle database, basic shell scripting and toolset
- Solid SQL query writing and PL/SQL skills, problem solving and performance tuning for OLTP systems
- Managing time and changing priorities in a dynamic environment
- Ability to provide quick turnaround on software issues and management requests
- Ability to assimilate key issues and concepts and come up to speed quickly

Technical Competencies:
- Minimum 8+ years for the Senior position
- Strong experience in Oracle Database, preferably 10g and above, with huge volumes (>5 TB)
- Must have solid development experience in Oracle PL/SQL, SQL, performance tuning and transaction management
- Must have hands-on experience of developing:
  - Advanced SQL, bulk DML and DDL scripts
  - Packages, procedures, functions, triggers, and using cursors, ref cursors, dynamic SQL with bind variables, bulk DML using PL/SQL arrays, nested tables and varrays
  - Maintaining ACID properties of transaction management in an OLTP system
  - Exception handling and defining debugging points for troubleshooting
  - Tables, constraints, indexes, views, sequences and synonyms
  - Listing direct/indirect dependencies among Oracle objects
  - Data ingestion using SQL*Loader, external tables and Oracle AQs
  - Performance tuning: explain plans, identifying long-running jobs, comparing explain/execution plans and reviewing AWRs
  - Creating rerunnable jobs and job scheduling using the Oracle job scheduler
- Worked on source code management tools such as SVN, Git, Bitbucket, and pipeline creation using Jenkins
- Experience working on Unix-based OS with some hands-on exposure to shell scripting
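The bulk-DML, transaction-management, and rerunnable-job competencies listed above can be sketched outside Oracle. The following is a minimal Python/SQLite sketch (SQLite stands in for Oracle; the table and function names are hypothetical) of an idempotent, transactional bulk load:

```python
import sqlite3

# Hypothetical schema; SQLite stands in for Oracle here. A "rerunnable"
# job is one that can execute twice without duplicating or corrupting
# data -- below, an idempotent upsert inside an explicit transaction.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL NOT NULL)")

def load_balances(rows):
    """Bulk, rerunnable load: executemany is the bulk-DML analogue,
    ON CONFLICT makes the job safe to re-run, and try/except mirrors
    PL/SQL exception handling with rollback."""
    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.executemany(
                "INSERT INTO accounts (id, balance) VALUES (?, ?) "
                "ON CONFLICT(id) DO UPDATE SET balance = excluded.balance",
                rows,
            )
    except sqlite3.Error:
        raise  # a real job would log a debugging point here before re-raising

load_balances([(1, 100.0), (2, 250.0)])
load_balances([(1, 100.0), (2, 250.0)])  # rerun: no duplicate rows
count = conn.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
print(count)  # 2
```

In Oracle itself the same shape would use FORALL bulk binds and an exception block; the invariant (safe re-execution inside one transaction) is the portable part.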
Posted 1 week ago
13.0 - 17.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Senior Architect - Data Modelling at Tiger Analytics, you will be instrumental in solving complex data management challenges to help organizations become more data-driven. Your role will involve transitioning between being an Individual Contributor, team member, and Architect as required by each project, with a focus on defining, designing, and delivering actionable insights. Your day-to-day responsibilities will include: - Leading the translation of business requirements into conceptual, logical, and physical Data models. - Designing data architectures to encompass the entire data lifecycle, including data organization, acquisition, storage, security, and visualization. - Advising clients on the best modelling approach based on their needs and target architecture. - Analyzing datasets, guiding the team in creating Source to Target Mapping and Data dictionaries, and generating insightful data profiles. - Defining Data Modelling Best Practices and ensuring their implementation across projects. - Optimizing Data Models, defining Ingestion logic, ingestion frequency, and data consumption patterns in collaboration with Data Engineers. - Ensuring adherence to data quality standards, data procedures, and driving automation in modelling activities. - Collaborating with various stakeholders to design and develop next-generation data platforms. - Providing technical expertise and support to troubleshoot and resolve data-related issues, mentoring team members, and collaborating on pre-sales Proof-of-Concepts. - Enabling data catalog and business glossary as per the process defined by the data governance team. Job Requirements: - Minimum 13 years of experience in the data space with hands-on experience in designing data architecture. - Proficiency in RDBMS systems (e.g., Oracle, DB2, SQL Server) and Data Modelling concepts like Relational, Dimensional, Data Vault Modelling. 
- Experience in developing data models for multiple business domains, OLTP, OLAP systems, Cloud DW, and cloud platforms (e.g., AWS, Azure, GCP).
- Hands-on experience in Data Modelling Tools, ETL tools, data ingestion frameworks, NoSQL databases, and Big Data ecosystems.
- Familiarity with Data Quality, Data Governance, Industry Data Models, agile methodology, and good communication skills.
- Exposure to Python and contribution to proposals and RFPs will be an added advantage.
At Tiger Analytics, we value diversity, inclusivity, and individual growth. We encourage you to apply even if you do not meet all the requirements today, as we believe in finding unique roles for every individual. We are an equal-opportunity employer committed to fostering a culture of listening, trust, respect, and personal development. Please note that the job designation and compensation will be based on your expertise and experience, with our compensation packages being highly competitive within the industry.
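The dimensional-modelling concepts this role calls for can be illustrated with a minimal star schema. The sketch below uses Python's sqlite3 with hypothetical retail table names, not any client's actual model:

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to two dimensions,
# then a typical OLAP-style aggregate over the joins.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales  (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    qty         INTEGER,
    amount      REAL
);
INSERT INTO dim_date    VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales  VALUES (20240101, 1, 3, 30.0), (20240201, 2, 1, 15.0),
                               (20240201, 1, 2, 20.0);
""")

# Slice measures by dimension attributes -- the query pattern star schemas optimize for.
rows = conn.execute("""
    SELECT d.month, p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month
""").fetchall()
print(rows)  # [(1, 'Hardware', 30.0), (2, 'Hardware', 35.0)]
```

A snowflake variant would further normalize `dim_product` (e.g., a separate category table); the fact-to-dimension join pattern stays the same.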
Posted 1 week ago
10.0 - 20.0 years
40 - 55 Lacs
chennai
Work from Office
Job Description: Data Modeler (GCP | OLTP & OLAP | Erwin)
Position: Data Architect
Location: Chennai (WFO)
Employment Type: Full-time
Experience: 10+ years (minimum 5+ years in Data Modeling)

Role Overview
We are seeking an experienced Data Modeler with strong expertise in OLTP and OLAP data modeling, Google Cloud Platform (GCP), and hands-on experience with Erwin Data Modeler. The ideal candidate will be responsible for designing, implementing, and optimizing data models that support both transactional and analytical systems, ensuring scalability, performance, and data integrity.

Key Responsibilities
- Design and develop conceptual, logical, and physical data models for OLTP and OLAP systems.
- Work with business and technical teams to translate requirements into scalable data architectures.
- Develop and maintain data dictionaries, metadata, and data lineage documentation.
- Optimize data models for query performance, scalability, and maintainability.
- Collaborate with Data Engineers, Architects, and Analysts to implement models in GCP-based data platforms (e.g., BigQuery, Cloud SQL, Cloud Spanner).
- Conduct data profiling, data quality checks, and impact analysis for schema changes.
- Ensure data modeling best practices, governance, and compliance standards.
- Use Erwin Data Modeler (or equivalent tools) for data modeling, version control, and model management.

Required Skills & Experience
- 10+ years of overall experience in data modeling and database design.
- Strong expertise in OLTP and OLAP systems with a deep understanding of normalized and denormalized modeling techniques.
- Hands-on experience with GCP data services (BigQuery, Cloud SQL, Spanner, Pub/Sub, Dataflow, etc.).
- Proficiency in Erwin Data Modeler (or similar tools).
- Strong knowledge of SQL, performance tuning, and query optimization.
- Experience working in data warehouse, data lake, and cloud-based environments.
- Familiarity with ETL/ELT frameworks and data pipelines.
- Excellent communication and collaboration skills.
Interested candidates can share their profile with jayaprakashn@newtglobalcorp.com; for more details, call 9840239366.
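The normalized-vs-denormalized distinction this posting emphasizes can be shown in a few lines. This is a Python/SQLite sketch with hypothetical table names, not a specific platform's implementation:

```python
import sqlite3

# Normalized (OLTP) vs denormalized (OLAP) shapes of the same data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Normalized (OLTP): each fact stored once, so updates touch one row.
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                     customer_id INTEGER REFERENCES customers(customer_id),
                     amount REAL);
INSERT INTO customers VALUES (1, 'EMEA'), (2, 'APAC');
INSERT INTO orders VALUES (100, 1, 50.0), (101, 2, 75.0), (102, 1, 25.0);

-- Denormalized (OLAP): region copied onto every order row at load time,
-- trading storage and update cost for scan-friendly analytics.
CREATE TABLE orders_flat AS
SELECT o.order_id, o.amount, c.region
FROM orders o JOIN customers c ON c.customer_id = o.customer_id;
""")

# The analytical query now needs no join at all.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders_flat GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 75.0), ('EMEA', 75.0)]
```

On a columnar warehouse such as BigQuery the same trade-off applies, with nested/repeated fields as a further denormalization option.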
Posted 1 week ago
2.0 - 7.0 years
25 - 30 Lacs
pune
Work from Office
- Ensure repeatability and robustness of data movement and storage
- Execute data purge strategies from the OLTP systems
- Execute effective error detection and mitigation strategies
- Ensure updates to data-related documents
Experience you'll need to have:
- Deep knowledge of the Microsoft Azure environment, managed and non-managed SQL
- Deep expertise around Microsoft Fabric
- Proven experience of developing and managing data pipelines
- Good oral and written communication skills
- An undergraduate (Bachelor's) degree, preferably in Computer Science; a Master's degree will be an added advantage
- 2+ years of experience post undergraduate/master's degree
Posted 2 weeks ago
4.0 - 9.0 years
10 - 14 Lacs
gurugram
Work from Office
Professional Bi Analyst, Sales Analyst (Gurugram/Day Shift) - Excellent English Communication Mandate - BPO Experience Mandate Role and Key Responsibilities: Leverage analytical skills and independent judgement to interpret moderately complex goals, trends, risks, and areas for improvement by collecting, analyzing, and reporting on key metrics and conclusions. May maintain integrity of reports and/or dashboards to allow business to operate accurately and efficiently Use expanded analytic solutions and knowledge to support customer teams and improve efficiencies. Work on projects/matters of moderate complexity in an independent contributor role. Complexity can vary based on several factors such as client size, number of systems, varying levels of established structures, or dynamics of customer and/or data Work cross-functionally and build internal and/or external relationships to ensure high quality data is available for analysis and better business understanding Develop and deliver data-driven insights and recommendations to internal and/or external stakeholders Engage day-to-day with stakeholders for planning, forecasting, and gaining a solid understanding of business questions for appropriate documentation and analysis Work well independently and seek counsel and guidance on more complex projects/matters, as needed. Work is generally reliable on routine tasks and assignments Key Skills and knowledge: Proficient knowledge of expanded analysis solutions/tools (such as OLTP/OLAP data structures, advanced Excel, Tableau, Salesforce, Power BI, Business Objects) Proficient knowledge of domain languages (such as SQL Query, HIVE QL, etc.) Application of moderately complex statistical methods (such as deviations, quartiles, etc.) 
- 2-5 years of related experience to reflect skills and talent necessary for this role preferred
- May require practical sales motion knowledge
- May require practical industry and demographic understanding in one of the following: hardware, software, SaaS, healthcare, or industrial
- May require strong proficiency in all Microsoft Office applications (especially Word, Excel, and PowerPoint)
Educational qualification: Bachelor's degree/diploma, or the equivalent, preferred. Degree/diploma in computer science, finance, or statistics/mathematics a plus.
Salary up to: 14 LPA
Interested, call: Rose (9873538143 / WA: 8595800635) rose2hiresquad@gmail.com
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
udupi, karnataka
On-site
You have more than 7 years of IT experience and have expertise in working with at least two structural databases such as SQL, Oracle, or Postgres, and one NoSQL database. You are capable of collaborating with the Presales team to propose optimal solutions and architectures. Additionally, you possess design experience with BQ, Redshift, and Synapse. Your responsibilities include managing the entire product life cycle, from proposal to delivery, and continuously evaluating architecture enhancements with the delivery team. You are well-versed in security protocols for in-transit data, as well as encryption and decryption of PII data. Moreover, you have a strong understanding of analytics tools for efficient data analysis and have previously been involved in production deployment and support teams. Your technical expertise extends to Big Data tools like Hadoop, Spark, Apache Beam, and Kafka. You are proficient in object-oriented and object function scripting languages such as Python, Java, C++, and Scala. Furthermore, you have hands-on experience in ETL processes and Data Warehousing, along with a comprehensive understanding of both relational and non-relational databases like MySQL, MS SQL Server, Postgres, MongoDB, and Cassandra. Your familiarity with cloud platforms such as AWS, GCP, and Azure is an added advantage, and you have experience in workflow management utilizing tools like Apache Airflow. Ideally, you should be knowledgeable about Design Best Practices for OLTP and OLAP Systems and actively contribute to designing databases and pipelines. You should be adept at suggesting appropriate architectures, including Data Warehouse and Datamesh approaches, as well as understanding data sharing and multi-cloud implementation. Additionally, you should have experience in Load testing methodologies, debugging pipelines, and Delta load handling. 
Prior exposure to heterogeneous migration projects, multiple Cloud platforms, and additional expertise in load testing methodologies, debugging pipelines, and delta load handling would be preferred.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
delhi
On-site
We are seeking a highly skilled Java Technical Architect with extensive experience in distributed systems and microservices architecture. As our ideal candidate, you should possess a profound understanding of system design and exhibit the ability to steer the architectural direction of intricate enterprise-grade applications. Your primary responsibilities will include designing scalable systems with key components such as security, observability, and configurability. You will be actively involved in working with the EFK (Elasticsearch, Fluentd, Kibana) stack and implementing sidecar patterns. Additionally, you should be well-versed in SQL and NoSQL databases, demonstrating a strong comprehension of OLAP and OLTP concepts and their practical application in real-time projects. A detailed knowledge of multi-tenancy architecture is also expected. It is essential to stay informed and contribute towards the integration of new Java features like stream gatherers and virtual threads. You will be required to implement efficient data structures and algorithms as part of solution delivery. Preferred qualifications for this role include proficiency in system-level thinking and component integration, a solid understanding of Java and its modern features, and the ability to clearly articulate complex architectural patterns. Hands-on coding skills are necessary, as you should be capable of writing efficient, clean, and scalable code. Exposure to real-world system challenges and microservices pitfalls will be an added advantage. Please note that further interview rounds will involve a more in-depth evaluation of your coding skills and problem-solving abilities.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
You will be responsible for designing and implementing various data models, including OLAP, OLTP, dimensional, relational, and data vault models. Your role will involve creating robust data architectures to support business intelligence, data warehousing, and database management solutions. As a Data Modeler, you should have strong experience in data modeling and a solid understanding of OLAP, OLTP, dimensional, relational, and data vault models. Proficiency in data modeling tools such as Erwin, ER Studio, Toad, SQL DBM, and Oracle DBM is essential for this role. Familiarity with SDE (Software Development Environment) is preferred. If you have a passion for data modeling and possess the required expertise in designing and implementing various data models, we encourage you to apply for this position.
Posted 2 weeks ago
6.0 - 9.0 years
20 - 25 Lacs
bengaluru
Hybrid
Company Description Epsilon is the leader in outcome-based marketing. We enable marketing that's built on proof, not promises. Through Epsilon PeopleCloud, the marketing platform for personalizing consumer journeys with performance transparency, Epsilon helps marketers anticipate, activate and prove measurable business outcomes. Powered by CORE ID, the most accurate and stable identity management platform representing 200+ million people, Epsilon's award-winning data and technology is rooted in privacy by design and underpinned by powerful AI. With more than 50 years of experience in personalization and performance working with the world's top brands, agencies and publishers, Epsilon is a trusted partner leading CRM, digital media, loyalty and email programs. Positioned at the core of Publicis Groupe, Epsilon is a global company with over 8,000 employees in over 40 offices around the world. For more information, visit https://www.epsilon.com/apac (APAC). Follow us on Twitter at @EpsilonMktg. Click here to view how Epsilon transforms marketing with 1 View, 1 Vision and 1 Voice. https://www.epsilon.com/apac/youniverse Wondering what it's like to work with Epsilon? Check out this video that captures the spirit of our resilient minds, our values and our great culture. Job Description The Product team forms the crux of our powerful platforms and connects millions of customers to the product magic. This team of innovative thinkers develop and build products that help Epsilon be a market differentiator. They map the future and set new standards for our products, empowered with industry best practices, ML and AI capabilities. The team passionately delivers intelligent end-to-end solutions and plays a key role in Epsilon's success story. Candidate will be the Senior Software Engineer for Business Intelligence team in the Product Engineering group. 
The Business Intelligence team partners with internal and external clients and technology providers, to develop, implement, and manage state-of-the-art data analytics, business intelligence and data visualization solutions for our marketing products. The Sr Software Engineer will be an individual with strong technical expertise on business intelligence and analytics solutions/tools and work on the BI strategy in terms of toolset selection, report and visualization best practices, team training, and environment efficiency. Why we are looking for you You are an individual with combination of technical leadership and architectural design skills. You have a solid foundation in business intelligence and analytics solutions/tools. You have experience in Product Engineering & Software Development using Tableau and SAP Business Objects, Kibana Dashboard development. You have experience in data integration tools like Databricks. You excel at collaborating with different stakeholders (ERP, CRM, Data Hub and Business stakeholders) to success. You have a strong experience of building reusable database components using SQL queries You enjoy new challenges and are solution oriented. You like mentoring people and enable collaboration of the highest order What you will enjoy in this role As part of the Epsilon Product Engineering team, the pace of the work matches the fast-evolving demands of Fortune 500 clients across the globe. As part of an innovative team that's not afraid to take risks, your ideas will come to life in digital marketing products that support more than 50% automotive dealers in the US. The open and transparent environment that values innovation and efficiency. Exposure to all the different Epsilon Products where reporting plays a key role for the efficient decision-making abilities to the end users. What you will do Work on our BI strategy in terms of toolset selection, report and visualization best practices, team training, and environment efficiency. 
- Analyze requirements and design data analytics and enterprise reporting solutions in various frameworks (such as Tableau, SAP Business Objects, and others) as part of enterprise, multi-tier, customer-facing applications.
- Strong hands-on development of data analytics and enterprise reporting solutions in frameworks such as Tableau, SAP Business Objects, and Kibana; scripting skills in Python are good to have.
- Build data integration & aggregation pipelines using Databricks.
- Provide estimates for BI solutions to be developed and deployed.
- Develop and support cloud infrastructure for BI solutions, including automation, process definition and support documentation as required.
- Work in an agile environment and align with agile/scrum methodology for development work.
- Follow Data Management processes and procedures and provide input to the creation of data definitions, business rules and data access methods.
- Collaborate with database administrators and data warehouse architects on data access patterns to optimize data visualization and processing.
- Assess and come up with infrastructure design for BI solutions catering to system availability and fault tolerance needs.
- Establish best practices for workloads on multi-tenant deployments.
- Document solutions and train implementation and operational support teams.
- Assess gaps in solutions and make recommendations on how to solve the problem.
- Understand the priorities of various projects and help steer organizational tradeoffs to focus on the most important initiatives.
- Show initiative and take responsibility for decisions that impact project and team goals.
Qualifications
- BE/B.Tech/MCA only, no correspondence course
- 7+ years of overall technical hands-on experience; supervisory experience is good to have
- Experience in developing BI solutions in enterprise reporting frameworks
- Experience in designing the semantic layer in reporting frameworks and developing reporting models on an OLTP or OLAP environment.
Experience working with large data sets, both structured & unstructured, Datawarehouse and Data lakes. Strong knowledge in multitenancy concepts, object, folder and user group templates and user access models in BI reporting tool frameworks, including single sign-on integrations with identity and access management systems such as Okta. Experience in performing periodic sizing, establishing monitoring, backup and restore procedures catering to MTTR and MTBF expectations. Working knowledge of OLTP and relational database concepts and data warehouse concepts/best practices and data modeling Experience in documenting technical design and procedures, reusable artifacts and provide technical guidance as needed. Familiarity with cloud stack (AWS, Azure) & cloud deployments and tools Ability to work on multiple assignments concurrently.
Posted 2 weeks ago
3.0 - 8.0 years
20 - 25 Lacs
chennai
Work from Office
- Responsible for driving Data and Analytical application development and maintenance
- Design and develop highly efficient data engineering pipelines and database systems leveraging Oracle, PL/SQL, Java
- Demonstrate building highly efficient and performant data applications catering to business needs, with higher data accuracy and faster response
- Optimize performance and fix bugs to improve data availability and accuracy
- Create technical design documents
- Collaborate with multiple teams to provide technical know-how and solutions to complex business problems
- Develop reusable assets and create a knowledge repository
- Be a team player with the ability to collaborate and share ideas in a strong product setting
- Follow Agile SDLC, participating in planning and Scrum boards
- Display problem-solving skills and innovative thinking
Posted 2 weeks ago
5.0 - 9.0 years
4 - 7 Lacs
gurugram
Work from Office
Primary Skills
- SQL (advanced level)
- SSAS (SQL Server Analysis Services), multidimensional and/or tabular model
- MDX / DAX (strong querying capabilities)
- Data modeling (star schema, snowflake schema)

Secondary Skills
- ETL processes (SSIS or similar tools)
- Power BI / reporting tools
- Azure Data Services (optional but a plus)

Role & Responsibilities
- Design, develop, and deploy SSAS models (both tabular and multidimensional).
- Write and optimize MDX/DAX queries for complex business logic.
- Work closely with business analysts and stakeholders to translate requirements into robust data models.
- Design and implement ETL pipelines for data integration.
- Build reporting datasets and support BI teams in developing insightful dashboards (Power BI preferred).
- Optimize existing cubes and data models for performance and scalability.
- Ensure data quality, consistency, and governance standards.

Top Skill Set
- SSAS (tabular + multidimensional modeling)
- Strong MDX and/or DAX query writing
- SQL, advanced level, for data extraction and transformations
- Data modeling concepts (fact/dimension, slowly changing dimensions, etc.)
- ETL tools (SSIS preferred)
- Power BI or similar BI tools
- Understanding of OLAP & OLTP concepts
- Performance tuning (SSAS/SQL)

Skills: analytical skills, ETL processes (SSIS or similar tools), collaboration, Multidimensional Expressions (MDX), Power BI / reporting tools, SQL (advanced level), SQL proficiency, DAX, SSAS (multidimensional and tabular model), ETL, data modeling (star schema, snowflake schema), communication, Azure Data Services, MDX, data modeling, SSAS, data visualization
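The slowly-changing-dimension concept in the skill set above can be sketched briefly. Below is a Python/SQLite illustration of SCD Type 2 versioning with a hypothetical `dim_customer` table; production SSAS/warehouse implementations differ in detail:

```python
import sqlite3

# SCD Type 2: instead of overwriting an attribute, close the current row
# and insert a new version, so facts keep pointing at the values that
# were valid when they were loaded.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    sk INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    customer_id INTEGER,                   -- natural/business key
    city TEXT,
    valid_from TEXT,
    valid_to TEXT,                         -- NULL = current version
    is_current INTEGER
)""")

def apply_change(customer_id, city, as_of):
    """Close the current version (if its city differs) and open a new one."""
    cur = conn.execute(
        "SELECT sk, city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur and cur[1] == city:
        return  # no change, nothing to version
    if cur:
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 WHERE sk=?",
            (as_of, cur[0]))
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
        "VALUES (?, ?, ?, NULL, 1)", (customer_id, city, as_of))

apply_change(42, "Pune", "2024-01-01")
apply_change(42, "Gurugram", "2024-06-01")   # attribute changed: new version
versions = conn.execute(
    "SELECT city, valid_to, is_current FROM dim_customer "
    "WHERE customer_id = 42 ORDER BY sk").fetchall()
print(versions)  # [('Pune', '2024-06-01', 0), ('Gurugram', None, 1)]
```

The surrogate key (`sk`) is what fact tables reference, which is why each version gets its own row rather than reusing the business key.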
Posted 2 weeks ago
8.0 - 12.0 years
27 - 32 Lacs
bengaluru
Work from Office
1. Strong development knowledge in DB design & development, with 6+ to 10 years of experience (Postgres DB) - mandatory
2. Strong hands-on experience writing complex PL/pgSQL, procedures and functions, and preventing blocking and deadlocks
3. Conduct SQL object code reviews & performance tuning (mandatory)
4. Hands-on experience with Microsoft SQL Server and MySQL DB is an advantage
5. Strong knowledge of RDBMS and NoSQL concepts with strong logical thinking and solutions (highly required)
6. Expert in transactional databases (OLTP) and ACID properties, with experience handling large-scale application databases (mandatory)
7. Consult with application developers to provide suggestions/SQL/PL/pgSQL on DB with best solutions
8. Good communication and written skills
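The ACID/transaction-management expertise this role asks for can be illustrated with a minimal atomicity sketch. SQLite stands in for Postgres here, and the `accounts`/`transfer` names are hypothetical:

```python
import sqlite3

# Atomicity: a multi-statement transfer either fully commits or fully
# rolls back, so a failure midway cannot create or destroy money.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, "
             "balance REAL NOT NULL CHECK (balance >= 0))")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, 50.0)])
conn.commit()

def transfer(src, dst, amount):
    try:
        with conn:  # one transaction: commit on success, rollback on any error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                         (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                         (amount, dst))
        return True
    except sqlite3.IntegrityError:
        return False  # CHECK constraint fired -> whole transfer rolled back

ok = transfer(1, 2, 30.0)      # succeeds
bad = transfer(1, 2, 500.0)    # would overdraw account 1 -> rolled back
total = conn.execute("SELECT SUM(balance) FROM accounts").fetchone()[0]
print(ok, bad, total)  # True False 150.0
```

In Postgres the deadlock-prevention half of the requirement is usually handled by always acquiring row locks in a consistent order (e.g., lower account id first), which this single-connection sketch does not need to show.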
Posted 2 weeks ago
4.0 - 6.0 years
6 - 15 Lacs
navi mumbai
Remote
We are looking for an immediate joiner - work from home opportunity. Key skills: SQL, SSIS, SSAS; proficient in SQL Server (2016 and above); query optimization; database performance; OLTP and OLAP systems; Power BI; reporting; SSRS; data storage; cloud DB.
Posted 2 weeks ago