5.0 - 9.0 years
4 - 7 Lacs
Gurugram
Work from Office
Primary Skills
- SQL (advanced level)
- SSAS (SQL Server Analysis Services) – multidimensional and/or tabular models
- MDX / DAX (strong querying capabilities)
- Data modeling (star schema, snowflake schema)

Secondary Skills
- ETL processes (SSIS or similar tools)
- Power BI / reporting tools
- Azure Data Services (optional but a plus)

Role & Responsibilities
- Design, develop, and deploy SSAS models (both tabular and multidimensional).
- Write and optimize MDX/DAX queries for complex business logic.
- Work closely with business analysts and stakeholders to translate requirements into robust data models.
- Design and implement ETL pipelines for data integration.
- Build reporting datasets and support BI teams in developing insightful dashboards (Power BI preferred).
- Optimize existing cubes and data models for performance and scalability.
- Ensure data quality, consistency, and governance standards.

Top Skill Set
- SSAS (tabular + multidimensional modeling)
- Strong MDX and/or DAX query writing
- Advanced SQL for data extraction and transformations
- Data modeling concepts (fact/dimension tables, slowly changing dimensions, etc.)
- ETL tools (SSIS preferred)
- Power BI or similar BI tools
- Understanding of OLAP and OLTP concepts
- Performance tuning (SSAS/SQL)

Skills: analytical skills, ETL processes (SSIS or similar tools), collaboration, MDX, DAX, Power BI / reporting tools, advanced SQL, SSAS (multidimensional and tabular models), ETL, data modeling (star schema, snowflake schema), communication, Azure Data Services, data visualization
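The star-schema modeling this posting asks for can be illustrated with a small sketch (not part of the posting, and in pandas rather than SSAS/DAX): a fact table joined to its dimensions and aggregated, the same shape a DAX measure would compute over a tabular model. All table and column names here are invented for the example.

```python
import pandas as pd

# Hypothetical star schema: one fact table, two dimension tables.
dim_product = pd.DataFrame({
    "product_key": [1, 2],
    "category": ["Bikes", "Helmets"],
})
dim_date = pd.DataFrame({
    "date_key": [20240101, 20240102],
    "year": [2024, 2024],
})
fact_sales = pd.DataFrame({
    "product_key": [1, 1, 2],
    "date_key": [20240101, 20240102, 20240101],
    "sales_amount": [100.0, 150.0, 40.0],
})

# Join facts to dimensions, then aggregate by a dimension attribute --
# conceptually what a DAX measure like SUM(sales_amount) sliced by
# category does inside a tabular model.
sales = (fact_sales
         .merge(dim_product, on="product_key")
         .merge(dim_date, on="date_key"))
total_by_category = sales.groupby("category")["sales_amount"].sum()
print(total_by_category.to_dict())  # {'Bikes': 250.0, 'Helmets': 40.0}
```

The join keys play the role of surrogate keys in a warehouse; a snowflake schema would simply normalize the dimensions further.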
Posted 1 month ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

RCE_Risk Data Engineer/Leads

Job Description: Our Technology team builds innovative digital solutions rapidly and at scale to deliver the next generation of financial and non-financial services across the globe. This is a senior, hands-on technical delivery role requiring knowledge of data engineering, cloud infrastructure and platform engineering, platform operations and production support, using ground-breaking cloud and big data technologies. The ideal candidate, with 3-6 years of relevant experience, will possess strong technical skills, an eagerness to learn, a keen interest in the three key pillars our team supports (Financial Crime, Financial Risk and Compliance technology transformation), the ability to work collaboratively in a fast-paced environment, and an aptitude for picking up new tools and techniques on the job, building on existing skillsets as a foundation.

In this role you will:
- Ingest and provision raw datasets, enriched tables, and/or curated, re-usable data assets to enable a variety of use cases.
- Drive improvements in the reliability and frequency of data ingestion, including increasing real-time coverage.
- Support and enhance data ingestion infrastructure and pipelines.
- Design and implement data pipelines that collect data from disparate sources across the enterprise and from external sources, and deliver it to our data platform.
- Build Extract, Transform and Load (ETL) workflows, using both advanced data manipulation tools and programmatic manipulation of data throughout our data flows, ensuring data is available at each stage of the flow, and in the form needed for each system, service and customer along it.
- Identify and onboard data sources using existing schemas and, where required, conduct exploratory data analysis to investigate and provide solutions.
- Evaluate modern technologies, frameworks, and tools in the data engineering space to drive innovation and improve data processing capabilities.

Core/Must-Have Skills:
- 3-8 years of expertise in designing and implementing data warehouses and data lakes using the Oracle tech stack (DB: PL/SQL).
- At least 4+ years of experience in database design and dimension modelling using Oracle PL/SQL.
- Experience with advanced PL/SQL concepts (materialized views, global temporary tables, partitions, PL/SQL packages).
- Experience in SQL tuning, tuning of PL/SQL solutions, and physical optimization of databases.
- Experience writing and tuning SQL scripts, including tables, views, indexes and complex PL/SQL objects (procedures, functions, triggers and packages) in Oracle Database 11g or higher.
- Experience developing ETL processes – ETL control tables, error logging, auditing, data quality, etc. Should be able to implement reusability, parameterization, workflow design, etc.
- Advanced working SQL knowledge and experience with relational and NoSQL databases, as well as working familiarity with a variety of databases (Oracle, SQL Server, Neo4j).
- Strong analytical and critical-thinking skills, with the ability to identify and resolve issues in data pipelines and systems.
- Strong understanding of ETL methodologies and best practices.
- Ability to collaborate with cross-functional teams to ensure successful implementation of solutions.
- Experience with OLAP and OLTP databases, and data structuring/modelling with an understanding of key data points.
Good to have:
- Experience working in Financial Crime, Financial Risk and Compliance technology transformation domains.
- Certification on any cloud tech stack.
- Experience building and optimizing data pipelines on AWS Glue or Oracle Cloud.
- Design and development of systems for the maintenance of an Azure/AWS lakehouse, ETL processes, business intelligence and data ingestion pipelines for AI/ML use cases.
- Experience with data visualization (Power BI/Tableau) and SSRS.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
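The ETL control-table and error-logging pattern named in the must-have skills can be sketched as follows. This is a hypothetical illustration using SQLite as a stand-in for an Oracle database; the table and function names are invented, not part of the posting.

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
# Control table: one row per batch run. Error table: quarantined records.
conn.execute("""CREATE TABLE etl_control (
    batch_id INTEGER PRIMARY KEY AUTOINCREMENT,
    job_name TEXT, started_at TEXT, finished_at TEXT, status TEXT)""")
conn.execute("CREATE TABLE etl_errors (batch_id INTEGER, record TEXT, error TEXT)")

def run_batch(job_name, records, load_fn):
    """Register the batch in the control table; log bad records instead of aborting."""
    started = datetime.now(timezone.utc).isoformat()
    cur = conn.execute(
        "INSERT INTO etl_control (job_name, started_at, status) VALUES (?, ?, 'RUNNING')",
        (job_name, started))
    batch_id = cur.lastrowid
    for rec in records:
        try:
            load_fn(rec)
        except Exception as exc:  # quarantine the record, keep the batch going
            conn.execute("INSERT INTO etl_errors VALUES (?, ?, ?)",
                         (batch_id, repr(rec), str(exc)))
    conn.execute(
        "UPDATE etl_control SET finished_at = ?, status = 'DONE' WHERE batch_id = ?",
        (datetime.now(timezone.utc).isoformat(), batch_id))
    return batch_id

# Usage: a loader that rejects records with negative amounts.
def load(rec):
    if rec["amount"] < 0:
        raise ValueError("negative amount")

bid = run_batch("daily_sales", [{"amount": 10}, {"amount": -5}], load)
n_errors = conn.execute(
    "SELECT COUNT(*) FROM etl_errors WHERE batch_id = ?", (bid,)).fetchone()[0]
print(n_errors)  # 1
```

Auditing queries then run against the control and error tables, which is what makes batch reruns and data-quality reporting tractable.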
Posted 1 month ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

Senior Analyst – Data Engineering

The Data and Analytics team is a multi-disciplinary technology team delivering client projects and solutions across data management, visualization, business analytics and automation. The assignments cover a wide range of countries and industry sectors.

The opportunity
We are looking for a Senior Analyst – Data Engineering. The main purpose of the role is to support cloud and on-prem platform analytics and data engineering projects initiated across engagement teams. The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for its clients. This role will work closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects, and is also onshore-facing. It will be instrumental in designing, developing, and evolving modern data warehousing solutions and data integration build-outs using cutting-edge tools and platforms for both on-prem and cloud architectures. In this role you will produce design specifications and documentation, develop data migration mappings and transformations for a modern data warehouse/data mart set-up, and define robust ETL processing to collect and scrub both structured and unstructured data, providing self-serve (OLAP) capabilities in order to create impactful decision-analytics reporting.
Discipline: Information Management & Analysis
Role Type: Data Architecture & Engineering

A Data Architect & Engineer at EY:
- Uses agreed-upon methods, processes and technologies to design, build and operate scalable on-premises or cloud data architecture and modelling solutions that facilitate data storage, integration, management, validation and security, supporting the entire data asset lifecycle.
- Designs, builds and operates data integration solutions that optimize data flows by consolidating disparate data from multiple sources into a single solution.
- Works with other Information Management & Analysis professionals, the program team, management and stakeholders to design and build analytics solutions in a way that will deliver business value.

Skills
Cloud computing; business requirements definition, analysis and mapping; data modelling; data fabric; data integration; data quality; database management; semantic layer; effective client communication; problem solving/critical thinking; interest and passion for technology; analytical thinking; collaboration.

Your Key Responsibilities
- Evaluating and selecting data warehousing tools for business intelligence, data population, data management, metadata management and warehouse administration for both on-prem and cloud-based engagements.
- Strong working knowledge across the technology stack, including ETL, ELT, data analysis, metadata, data quality, audit and design.
- Design, develop, and test in an ETL tool environment (GUI/canvas-driven tools to create workflows).
- Experience in design documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.).
- Provide technical guidance to a team of data warehouse and business intelligence developers.
- Coordinate with other technology users to design and implement matters of data governance, data harvesting, cloud implementation strategy, privacy, and security.
- Adhere to ETL/data warehouse development best practices.
- Take responsibility for data orchestration, ingestion, ETL and reporting architecture for both on-prem and cloud (MS Azure/AWS/GCP).
- Assist the team with performance tuning for ETL and database processes.

Skills And Attributes For Success
- Minimum of 4 years of total experience in the data warehousing/business intelligence field.
- Solid hands-on 3+ years of professional experience with the creation and implementation of data warehouses on client engagements, and helping create enhancements to a data warehouse.
- Strong knowledge of data architecture for staging and reporting schemas, data models and cutover strategies using industry-standard tools and technologies.
- Architecture design and implementation experience with medium-to-complex on-prem-to-cloud migrations on any of the major cloud platforms (preferably AWS/Azure/GCP).
- Minimum 3+ years of experience with Azure database offerings (relational, NoSQL, data warehouse).
- 2+ years of hands-on experience with various Azure services preferred: Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services and Databricks.
- Minimum of 3 years of hands-on database design, modelling and integration experience with relational data sources, such as SQL Server databases, Oracle/MySQL, Azure SQL and Azure Synapse.
- Knowledge and direct experience using business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS cubes, etc.).
- Strong creative instincts related to data analysis and visualization; curiosity to learn the business methodology, data model and user personas.
- Strong understanding of BI and DWH best practices, analysis, visualization, and the latest trends.
- Experience with the software development life cycle (SDLC) and rules of product development, such as installation, upgrade and namespace management.
- Solid technical and problem-solving skills.
- Excellent written and verbal communication skills.

To qualify for the role, you must have
- Bachelor’s or equivalent degree in computer science or a related field (required); advanced degree or equivalent business experience preferred.
- A fact-driven, thoughtful approach with excellent attention to detail.
- Hands-on experience with data engineering tasks, such as building analytical data records, and experience manipulating and analyzing large volumes of data.
- Relevant work experience of 4 to 6 years in a Big 4 or technology/consulting set-up.

Ideally, you’ll also have
- The ability to think strategically/end-to-end with a result-oriented mindset.
- The ability to build rapport within the firm and win the trust of clients.
- Willingness to travel extensively and to work on client sites/practice office locations.

What We Look For
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-prominent, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY SaT practices globally with prominent businesses across a range of industries.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines.
In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success, as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
Posted 1 month ago
4.0 - 6.0 years
3 - 7 Lacs
Nagar, Pune
Work from Office
Title: REF64648E - Python Developer + Chatbot with 4-6 years exp - Pune/Mum/BNG/GGN/CHN
Assistant Manager - WTS

- 4-6 years of experience as a Python developer, with a strong understanding of Python programming concepts and best practices
- Bachelor's Degree/B.Tech/B.E in Computer Science or a related discipline
- Design, develop, and maintain robust and scalable Python-based applications, tools, and frameworks that integrate machine learning models and algorithms
- Demonstrated expertise in developing machine learning solutions, including feature selection, model training, and evaluation
- Proficiency in data manipulation libraries (e.g., Pandas, NumPy) and machine learning frameworks (e.g., Scikit-learn, TensorFlow, PyTorch, Keras)
- Experience with web frameworks like Django or Flask
- Contribute to the architecture and design of data-driven solutions, ensuring they meet both functional and non-functional requirements
- Experience with databases such as MS SQL Server, PostgreSQL or MySQL; solid knowledge of OLTP and OLAP concepts
- Experience with CI/CD tooling (at least Git and Jenkins)
- Experience with the Agile/Scrum/Kanban way of working
- Self-motivated and hard-working
- Knowledge of testing frameworks, including Mocha and Jest
- Knowledge of RESTful APIs
- Understanding of AWS and Azure cloud services
- Experience with chatbot and NLU/NLP-based applications is required

Qualifications: Bachelor's Degree/B.Tech/B.E in Computer Science or a related discipline
Job Location
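The NLU/chatbot work this role describes can be illustrated with a toy intent classifier (a minimal sketch using only the standard library; real systems use trained NLU models, and all intents and phrasings below are invented):

```python
import difflib

# Toy intent catalog: each intent maps to a few example phrasings.
INTENTS = {
    "greet": ["hello", "hi there", "good morning"],
    "order_status": ["where is my order", "track my order", "order status"],
    "goodbye": ["bye", "see you later", "goodbye"],
}

def classify(utterance: str, cutoff: float = 0.5) -> str:
    """Return the intent whose example phrasing is most similar to the
    utterance; fall back when nothing clears the cutoff."""
    best_intent, best_score = "fallback", cutoff
    for intent, examples in INTENTS.items():
        for example in examples:
            score = difflib.SequenceMatcher(
                None, utterance.lower(), example).ratio()
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent

print(classify("hi there!"))        # greet
print(classify("track my order please"))  # order_status
```

A production chatbot would replace the string-similarity step with an NLU model (e.g., scikit-learn or a transformer) behind a Flask/Django REST endpoint, but the dispatch structure stays the same.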
Posted 1 month ago
16.0 - 22.0 years
40 - 55 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities
• Minimum 15 years of experience
• Deep understanding of data architecture principles, data modelling, data integration, data governance, and data management technologies.
• Experience in data strategies and developing logical and physical data models on RDBMS, NoSQL, and cloud-native databases.
• Decent experience in one or more RDBMS systems (such as Oracle, DB2, SQL Server)
• Good understanding of relational, dimensional, and Data Vault modelling
• Experience in implementing two or more data models in a database with data security and access controls.
• Good experience in OLTP and OLAP systems
• Excellent data analysis skills with demonstrable knowledge of standard datasets and sources.
• Good experience with one or more cloud DWs (e.g. Snowflake, Redshift, Synapse)
• Experience on one or more cloud platforms (e.g. AWS, Azure, GCP)
• Understanding of DevOps processes
• Hands-on experience in one or more data modelling tools
• Good understanding of one or more ETL tools and data ingestion frameworks
• Understanding of data quality and data governance
• Good understanding of NoSQL databases and modeling techniques
• Good understanding of one or more business domains
• Understanding of the Big Data ecosystem
• Understanding of industry data models
• Hands-on experience in Python
• Experience in leading large and complex teams
• Good understanding of agile methodology
• Extensive expertise in leading data transformation initiatives, driving cultural change, and promoting a data-driven mindset across the organization.
• Excellent communication skills
• Understand the business requirements and translate them into conceptual, logical and physical data models.
• Work as a principal advisor on data architecture across various data requirements: aggregation, data lake data models, data warehouse, etc.
• Lead cross-functional teams, define data strategies, and leverage the latest technologies in data handling.
• Define and govern data architecture principles, standards, and best practices to ensure consistency, scalability, and security of data assets across projects.
• Suggest the best modelling approach to the client based on their requirements and target architecture.
• Analyze and understand the datasets and guide the team in creating source-to-target mappings and data dictionaries, capturing all relevant details.
• Profile the datasets to generate relevant insights.
• Optimize the data models and work with the data engineers to define the ingestion logic, ingestion frequency and data consumption patterns.
• Establish data governance practices, including data quality, metadata management, and data lineage, to ensure data accuracy, reliability, and compliance.
• Drive automation in modeling activities.
• Collaborate with business stakeholders, data owners, business analysts and architects to design and develop the next-generation data platform.
• Closely monitor project progress and provide regular updates to the leadership teams on milestones, impediments, etc.
• Guide/mentor team members, and review artifacts.
• Contribute to the overall data strategy and roadmaps.
• Propose and execute technical assessments and proofs of concept to promote innovation in the data space.
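The data-profiling step above — summarizing a source table before building source-to-target mappings and data dictionaries — can be sketched with pandas (a hypothetical illustration; the dataset and column names are invented):

```python
import pandas as pd

# Toy dataset standing in for a source table to be profiled.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "country": ["IN", "IN", None, "US"],
    "amount": [10.5, 20.0, 20.0, None],
})

def profile(frame: pd.DataFrame) -> pd.DataFrame:
    """Per-column profile (dtype, null count, distinct count) -- the kind of
    summary that feeds a data dictionary or source-to-target mapping."""
    return pd.DataFrame({
        "dtype": frame.dtypes.astype(str),
        "nulls": frame.isna().sum(),
        "distinct": frame.nunique(dropna=True),
    })

report = profile(df)
print(report)
```

Running this on each candidate source surfaces duplicate keys, null-heavy columns and type mismatches before any ingestion logic is written.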
Posted 1 month ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Experience: 7+ Years
Location: Noida, Sector 64

Key Responsibilities:

Data Architecture Design:
- Design, develop, and maintain the enterprise data architecture, including data models, database schemas, and data flow diagrams.
- Develop a data strategy and roadmap that aligns with the business objectives and ensures the scalability of data systems.
- Architect both transactional (OLTP) and analytical (OLAP) databases, ensuring optimal performance and data consistency.

Data Integration & Management:
- Oversee the integration of disparate data sources into a unified data platform, leveraging ETL/ELT processes and data integration tools.
- Design and implement data warehousing solutions, data lakes, and/or data marts that enable efficient storage and retrieval of large datasets.
- Ensure proper data governance, including the definition of data ownership, security, and privacy controls in accordance with compliance standards (GDPR, HIPAA, etc.).

Collaboration with Stakeholders:
- Work closely with business stakeholders, including analysts, developers, and executives, to understand data requirements and ensure that the architecture supports analytics and reporting needs.
- Collaborate with DevOps and engineering teams to optimize database performance and support large-scale data processing pipelines.

Technology Leadership:
- Guide the selection of data technologies, including databases (SQL/NoSQL), data processing frameworks (Hadoop, Spark), cloud platforms (Azure is a must), and analytics tools.
- Stay updated on emerging data management technologies, trends, and best practices, and assess their potential application within the organization.

Data Quality & Security:
- Define data quality standards and implement processes to ensure the accuracy, completeness, and consistency of data across all systems.
- Establish protocols for data security, encryption, and backup/recovery to protect data assets and ensure business continuity.
Mentorship & Leadership:
- Lead and mentor data engineers, data modelers, and other technical staff in best practices for data architecture and management.
- Provide strategic guidance on data-related projects and initiatives, ensuring that all efforts are aligned with the enterprise data strategy.

Required Skills & Experience:

Extensive Data Architecture Expertise:
- Over 7 years of experience in data architecture, data modeling, and database management.
- Proficiency in designing and implementing relational (SQL) and non-relational (NoSQL) database solutions.
- Strong experience with data integration tools (Azure tools are a must, plus any other third-party tools), ETL/ELT processes, and data pipelines.

Advanced Knowledge of Data Platforms:
- Expertise in the Azure cloud data platform is a must; other platforms such as AWS (Redshift, S3), Azure (Data Lake, Synapse), and/or Google Cloud Platform (BigQuery, Dataproc) are a bonus.
- Experience with big data technologies (Hadoop, Spark) and distributed systems for large-scale data processing.
- Hands-on experience with data warehousing solutions and BI tools (e.g., Power BI, Tableau, Looker).

Data Governance & Compliance:
- Strong understanding of data governance principles, data lineage, and data stewardship.
- Knowledge of industry standards and compliance requirements (e.g., GDPR, HIPAA, SOX) and the ability to architect solutions that meet these standards.

Technical Leadership:
- Proven ability to lead data-driven projects, manage stakeholders, and drive data strategies across the enterprise.
- Strong programming skills in languages such as Python, SQL, R, or Scala.

Certification: Azure Certified Solution Architect, Data Engineer, or Data Scientist certifications are mandatory.

Pre-Sales Responsibilities:
- Stakeholder Engagement: Work with product stakeholders to analyze functional and non-functional requirements, ensuring alignment with business objectives.
- Solution Development: Develop end-to-end solutions involving multiple products, ensuring security and performance benchmarks are established, achieved, and maintained.
- Proofs of Concept (POCs): Develop POCs to demonstrate the feasibility and benefits of proposed solutions.
- Client Communication: Communicate system requirements and solution architecture to clients and stakeholders, providing technical assistance and guidance throughout the pre-sales process.
- Technical Presentations: Prepare and deliver technical presentations to prospective clients, demonstrating how proposed solutions meet their needs and requirements.

Additional Responsibilities:
- Stakeholder Collaboration: Engage with stakeholders to understand their requirements and translate them into effective technical solutions.
- Technology Leadership: Provide technical leadership and guidance to development teams, ensuring the use of best practices and innovative solutions.
- Integration Management: Oversee the integration of solutions with existing systems and third-party applications, ensuring seamless interoperability and data flow.
- Performance Optimization: Ensure solutions are optimized for performance, scalability, and security, addressing any technical challenges that arise.
- Quality Assurance: Establish and enforce quality assurance standards, conducting regular reviews and testing to ensure robustness and reliability.
- Documentation: Maintain comprehensive documentation of the architecture, design decisions, and technical specifications.
- Mentoring: Mentor fellow developers and team leads, fostering a collaborative and growth-oriented environment.

Qualifications:
- Education: Bachelor’s or master’s degree in computer science, information technology, or a related field.
- Experience: Minimum of 7 years of experience in data architecture, with a focus on developing scalable and high-performance solutions.
- Technical Expertise: Proficient in architectural frameworks, cloud computing, database management, and web technologies.
- Analytical Thinking: Strong problem-solving skills, with the ability to analyze complex requirements and design scalable solutions.
- Leadership Skills: Demonstrated ability to lead and mentor technical teams, with excellent project management skills.
- Communication: Excellent verbal and written communication skills, with the ability to convey technical concepts to both technical and non-technical stakeholders.
Posted 1 month ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Job Description

About Company: Fractal is a leading AI & analytics organization. We have a strong Fullstack team with great leaders accelerating the growth. Our people enjoy a collaborative work environment, exceptional training, and career development, as well as unlimited growth opportunities. We have a Glassdoor rating of 4/5 and achieve a customer NPS of 9/10. If you like working with a curious, supportive, high-performing team, Fractal is the place for you.

Responsibilities
As a Fullstack Engineer, you would be part of a team consisting of a Scrum Master, Cloud Engineers, AI/ML Engineers, and UI/UX Engineers to build end-to-end Data to Decision systems. You would report to a Senior Fullstack Engineer and will be responsible for:
- Managing, developing and maintaining the backend and frontend for various Data to Decision projects for our Fortune 500 client
- Working closely with the data science and engineering team to integrate the algorithmic output from the backend REST APIs
- Working closely with business and product owners to create dynamic infographics with intuitive user controls

Qualifications

Required Qualifications:
4+ years of demonstrable experience designing, building, and working as a Fullstack Engineer for enterprise web applications. Ideally, this would include the following:
- Expert-level proficiency with Angular
- Expert-level proficiency with ReactJS or VueJS
- Expert-level proficiency with Node.js
- Experience with databases (e.g., MongoDB) and data warehousing concepts (OLAP, OLTP)
- Understanding of REST concepts and building/interacting with REST APIs
- Deep understanding of a few UI concepts: cross-browser compatibility and implementing responsive web design
- Familiarity with code versioning tools such as GitHub

Preferred Qualifications
- Familiarity with Microsoft Azure cloud services (particularly Azure Web App, Storage and VM), or familiarity with
AWS (EC2, containers) or GCP services.
- Familiarity with GitHub Actions or any other CI/CD tool (e.g., Jenkins)

Job Location: BLR/MUM/GUR/PUNE/CHENNAI

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
Posted 1 month ago
5.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
Job Title: Dot Net Developer
Location: Gujarat
Experience Required: Minimum 5 years post-qualification
Employment Type: Full-Time
Department: IT / Software Development

Key Responsibilities:
- Design, develop, and maintain scalable and secure client-server and distributed web applications using Microsoft .NET technologies.
- Collaborate with cross-functional teams (analysts, testers, developers) to implement project requirements.
- Ensure adherence to architectural and coding standards and apply best practices in .NET stack development.
- Integrate applications with third-party libraries and RESTful APIs for seamless data sharing.
- Develop and manage robust SQL queries, stored procedures, views, and functions using MS SQL Server.
- Implement SQL Server features such as replication and Always On availability.
- Develop and manage ETL workflows, SSIS packages, and SSRS reports.
- (Preferred) Develop OLAP solutions for advanced data analytics.
- Participate in debugging and troubleshooting complex issues to deliver stable software solutions.
- Support IT application deployment and ensure smooth post-implementation functioning.
- Take ownership of assigned tasks and respond to changing project needs and timelines.
- Quickly adapt and learn new tools, frameworks, and technologies as required.
Technical Skills Required:
- .NET Framework (4.0/3.5/2.0), C#, ASP.NET, MVC
- Bootstrap, jQuery, HTML/CSS
- Multi-layered architecture design
- Experience with RESTful APIs and third-party integrations
- MS SQL Server – advanced SQL, replication, SSIS, SSRS
- Exposure to ETL and OLAP (added advantage)

Soft Skills:
- Excellent problem-solving and debugging abilities
- Strong team collaboration and communication skills
- Ability to work under pressure and meet deadlines
- Proactive learner with a willingness to adopt new technologies

Job Types: Full-time, Permanent
Pay: ₹60,000.00 - ₹90,000.00 per month
Benefits: Flexible schedule, Provident Fund
Location Type: In-person
Schedule: Fixed shift
Experience: .NET: 5 years (Required)
Location: Ahmedabad, Gujarat (Required)
Work Location: In person
Speak with the employer: +91 7888499500
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Remote
Hiring for a USA-based multinational company (MNC)

We are seeking a highly skilled and experienced Senior Power BI Developer to join our analytics team. The ideal candidate will have hands-on experience in designing, developing, and deploying Business Intelligence (BI) solutions using Power BI. This role involves working closely with business stakeholders to understand reporting requirements, transforming data into actionable insights, and leading data visualization initiatives to drive data-informed decision-making.

Responsibilities:
- Design, develop, and maintain advanced Power BI dashboards and reports that meet business requirements.
- Perform data modeling, DAX calculations, and Power Query transformations to build efficient data sets.
- Collaborate with data engineers and analysts to integrate Power BI with various data sources (SQL Server, Azure, Excel, SharePoint, etc.).
- Work with business stakeholders to gather requirements, create wireframes, and ensure accurate and insightful visualizations.
- Optimize performance of BI solutions through effective data model design and report tuning.
- Implement row-level security and manage user access in Power BI Service.
- Automate data refresh schedules and troubleshoot data issues.
- Maintain documentation for all BI processes, data flows, and reports.
- Stay updated on new Power BI features and industry trends to continually enhance reporting capabilities.

Requirements:
- Strong proficiency in Power BI Desktop, Power BI Service, and Power Query (M language).
- Expertise in DAX (Data Analysis Expressions) for custom calculations and measures.
- Solid understanding of data warehousing concepts and data modeling (star/snowflake schemas).
- Experience with SQL Server, Azure data services (Azure SQL, Azure Data Lake, Synapse), or other cloud platforms.
- Ability to write complex SQL queries for data extraction and transformation.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information.
Excellent communication and interpersonal skills. Familiarity with Agile methodologies and project management tools (e.g., JIRA, Azure DevOps) is a plus. Experience integrating Power BI with Power Automate, Power Apps, or other Microsoft 365 services is a plus.
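The star/snowflake-schema modeling this posting asks for boils down to a fact table joined to dimension tables. A minimal sketch using Python's built-in sqlite3 module (the table names, columns, and sample figures are illustrative assumptions, not from the posting):

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
""")
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, 2024, 1), (20240201, 2024, 2)])
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(20240101, 1, 100.0), (20240101, 2, 250.0), (20240201, 1, 50.0)])

# Typical BI rollup: revenue by month, sliced through the date dimension.
rows = cur.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d USING (date_key)
    GROUP BY d.month ORDER BY d.month
""").fetchall()
print(rows)  # [(1, 350.0), (2, 50.0)]
```

The same fact/dimension split is what a Power BI tabular model or an SSAS cube sits on top of; DAX measures play the role of the `SUM(...) GROUP BY` rollup here.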
Posted 1 month ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title : Senior Database Administrator (DBA) Location : Gachibowli, Hyderabad Experience : 12+ Years Interview Mode : Face-to-Face (F2F) Type : Full-Time / Permanent Work Mode : On-site

Job Summary: We are looking for a highly experienced Senior Database Administrator (DBA) with over 12 years of hands-on experience to manage mission-critical database systems. The ideal candidate will play a key role in ensuring optimal performance, security, availability, and reliability of our data infrastructure.

Key Responsibilities: Design, implement, and manage complex database architectures. Administer, monitor, and optimize large-scale databases (OLTP and OLAP systems). Ensure high availability, failover, replication, and disaster recovery strategies are in place and effective. Conduct performance tuning, query optimization, and capacity planning. Automate database maintenance tasks using scripting languages (Shell, Python, PowerShell, etc.). Lead major database upgrades, migrations, and patching with zero or minimal downtime. Collaborate with DevOps, application, and infrastructure teams to support integration and delivery pipelines. Implement strong database security practices, user roles, and access controls. Maintain up-to-date documentation of database environments and configurations. Mentor junior DBAs and participate in database architecture reviews.

Required Skills & Experience: 12+ years of solid experience as a Database Administrator. Expertise in Oracle, MS SQL Server, MySQL, or PostgreSQL databases. Experience in managing cloud-based database platforms (AWS RDS, Azure SQL, or GCP Cloud SQL). Strong knowledge of backup and recovery, replication, and clustering technologies. Proficient in database monitoring tools (e.g., OEM, SolarWinds, Nagios, etc.). Hands-on experience with automation scripts for routine tasks and monitoring. Solid understanding of data privacy, compliance, and audit requirements. Excellent problem-solving and troubleshooting abilities.
Strong verbal and written communication skills. Preferred Qualifications Professional certifications (e.g., Oracle OCP, Microsoft Certified DBA, AWS Database Specialty). Experience with NoSQL databases like MongoDB or Cassandra. Exposure to Agile/DevOps environments and CI/CD database automation. Additional Information Location : Gachibowli, Hyderabad (Candidates must be willing to work from office). Interview Mode : Face-to-face interviews are mandatory; no virtual interviews will be conducted. Joining Timeline : Immediate joiners preferred, or within 15-30 days. (ref:hirist.tech)
Posted 1 month ago
0 years
0 Lacs
Delhi, India
On-site
What You'll Do: Architect and scale modern data infrastructure: ingestion, transformation, warehousing, and access. Define and drive enterprise data strategy: governance, quality, security, and lifecycle management. Design scalable data platforms that support both operational insights and ML/AI applications. Translate complex business requirements into robust, modular data systems. Lead cross-functional teams of engineers, analysts, and developers on large-scale data initiatives. Evaluate and implement best-in-class tools for orchestration, warehousing, and metadata management. Establish technical standards and best practices for data engineering at scale. Spearhead integration efforts to unify data across legacy and modern platforms. What You Bring: Experience in data engineering, architecture, or backend systems. Strong grasp of system design, distributed data platforms, and scalable infrastructure. Deep hands-on experience with cloud platforms (AWS, Azure, or GCP) and tools like Redshift, BigQuery, Snowflake, S3, Lambda. Expertise in data modeling (OLTP/OLAP), ETL pipelines, and data warehousing. Experience with big data ecosystems: Kafka, Spark, Hive, Presto. Solid understanding of data governance, security, and compliance frameworks. Proven track record of technical leadership and mentoring. Strong collaboration and communication skills to align tech with business. Bachelor's or Master's in Computer Science, Data Engineering, or a related field. Nice To Have (Your Edge): Experience with real-time data streaming and event-driven architectures. Exposure to MLOps and model deployment pipelines. Familiarity with data DevOps and Infrastructure as Code (Terraform, CloudFormation, CI/CD pipelines). (ref:hirist.tech)
Posted 1 month ago
7.0 years
0 Lacs
Greater Kolkata Area
Remote
Omni's team is passionate about Commerce and Digital Transformation. We've been successfully delivering Commerce solutions for clients across North America, Europe, Asia, and Australia. The team has experience executing and delivering projects in B2B and B2C solutions. Job Description This is a remote position. We are seeking a Senior Data Engineer to architect and build robust, scalable, and efficient data systems that power AI and Analytics solutions. You will design end-to-end data pipelines, optimize data storage, and ensure seamless data availability for machine learning and business analytics use cases. This role demands deep engineering excellence balancing performance, reliability, security, and cost to support real-world AI applications. Key Responsibilities Architect, design, and implement high-throughput ETL/ELT pipelines for batch and real-time data processing. Build cloud-native data platforms : data lakes, data warehouses, feature stores. Work with structured, semi-structured, and unstructured data at petabyte scale. Optimize data pipelines for latency, throughput, cost-efficiency, and fault tolerance. Implement data governance, lineage, quality checks, and metadata management. Collaborate closely with Data Scientists and ML Engineers to prepare data pipelines for model training and inference. Implement streaming data architectures using Kafka, Spark Streaming, or AWS Kinesis. Automate infrastructure deployment using Terraform, CloudFormation, or Kubernetes operators. Requirements 7+ years in Data Engineering, Big Data, or Cloud Data Platform roles. Strong proficiency in Python and SQL. Deep expertise in distributed data systems (Spark, Hive, Presto, Dask). Cloud-native engineering experience (AWS, GCP, Azure) : BigQuery, Redshift, EMR, Databricks, etc. Experience designing event-driven architectures and streaming systems (Kafka, Pub/Sub, Flink). Strong background in data modeling (star schema, OLAP cubes, graph databases). 
Proven experience with data security, encryption, compliance standards (e.g., GDPR, HIPAA). Preferred Skills Experience in MLOps enablement : creating feature stores, versioned datasets. Familiarity with real-time analytics platforms (Clickhouse, Apache Pinot). Exposure to data observability tools like Monte Carlo, Databand, or similar. Passionate about building high-scale, resilient, and secure data systems. Excited to support AI/ML innovation with state-of-the-art data infrastructure. Obsessed with automation, scalability, and best engineering practices. (ref:hirist.tech)
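The streaming-architecture experience this posting calls for (Kafka, Spark Streaming, Flink) typically centers on windowed aggregation over an event stream. A minimal tumbling-window sketch in plain Python standing in for a real consumer; the event shape, 60-second window size, and click-count metric are illustrative assumptions:

```python
from collections import defaultdict

# Tumbling-window aggregation: every event is assigned to the window that
# starts at the largest multiple of WINDOW_SECONDS at or before its timestamp.
WINDOW_SECONDS = 60

def window_key(epoch_seconds):
    # Align an event timestamp to the start of its tumbling window.
    return epoch_seconds - (epoch_seconds % WINDOW_SECONDS)

def aggregate(events):
    # events: iterable of (epoch_seconds, user_id) click records.
    counts = defaultdict(int)
    for ts, user in events:
        counts[(window_key(ts), user)] += 1
    return dict(counts)

stream = [(100, "u1"), (110, "u1"), (130, "u2"), (170, "u1")]
windows = aggregate(stream)
print(windows)  # {(60, 'u1'): 2, (120, 'u2'): 1, (120, 'u1'): 1}
```

A Kafka or Flink job does the same alignment-and-count, but partitioned across brokers and with watermarks to handle late-arriving events.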
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Gurugram
Work from Office
Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management and associated technologies. Communicates risks and ensures understanding of these risks. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Graduate with a minimum of 6+ years of related experience required. Experience in modelling and business system designs. Good hands-on experience on DataStage, Cloud-based ETL Services. Have great expertise in writing TSQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead Data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 month ago
2.0 - 7.0 years
4 - 9 Lacs
Bengaluru
Work from Office
Develop, test and support future-ready data solutions for customers across industry verticals Develop, test, and support end-to-end batch and near real-time data flows/pipelines Demonstrate understanding in data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management and associated technologies Communicates risks and ensures understanding of these risks. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Minimum of 2+ years of related experience required Experience in modeling and business system designs Good hands-on experience on DataStage, Cloud based ETL Services Have great expertise in writing TSQL code Well versed with data warehouse schemas and OLAP techniques Preferred technical and professional experience Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate Must be a strong team player/leader Ability to lead Data transformation projects with multiple junior data engineers Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization. Ability to clearly communicate complex business problems and technical solutions.
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management and associated technologies. Communicates risks and ensures understanding of these risks. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Graduate with a minimum of 6+ years of related experience required. Experience in modelling and business system designs. Good hands-on experience on DataStage, Cloud-based ETL Services. Have great expertise in writing TSQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead Data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 month ago
5.0 years
10 - 15 Lacs
Gurgaon
On-site
Job Role: ETL SSIS+SSAS Job Location: Gurugram Interview: 1 Internal Technical Round || PM Round || Client Interview (candidate should be ready for F2F) Experience Range: 5-8 Years Job Description: · Microsoft SQL · Microsoft SSIS, SSAS · Data warehouse/Data Migration · Experience in Analytics / OLAP Cube Development (Microsoft SSAS and MDX). · Analyze, design, build, query, troubleshoot and maintain cubes. · Knowledge/expertise in SSAS Tabular with DAX language; proficient in MDX and DAX · Strong conceptual knowledge of ETL fundamentals · Exposure to the following will be beneficial. Job Type: Full-time Pay: ₹1,000,000.00 - ₹1,500,000.00 per year Schedule: Day shift Work Location: In person
Posted 1 month ago
6.0 - 11.0 years
30 - 45 Lacs
Mumbai
Work from Office
• Maintain modular architecture: client (Qt, Flutter), mobile app, microservices & API gateway • Architect systems integrating WebSockets, Kafka & Pinot/Superset • Guide cross-functional teams • Integration of BFF layers, APIs & cross-platform deployment Required Candidate profile • Golang, C++, or other languages • React, Flutter, & client-side architectures, REST, WebSocket, GraphQL, Kafka & APIs, Cloud-native design, distributed databases, OLAP tools, CI/CD pipelines, IaC
Posted 1 month ago
6.0 - 11.0 years
30 - 45 Lacs
Mumbai
Work from Office
• Design & develop backend systems using Golang, microservices, BFF architecture & RESTful APIs • Use C++ to enhance functionality • Database Management, third-party integrations, Containerization & Orchestration, rule engine, API Gateway Management Required Candidate profile • Exp. with in-memory database & SQL/NoSQL database, OLAP systems (Apache Kafka), integrate third-party APIs, containerization, observability & monitoring tools, rules engines, Version Control & CI/CD
Posted 1 month ago
0.6 - 1.6 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Visa is a world leader in payments and technology, with over 259 billion payments transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive while driven by a common purpose – to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa. Job Description Work collaboratively with Data Analysts, Data Scientists, Software Engineers and cross-functional partners to design and deploy data pipelines to deliver analytical solutions. Responsible for building data pipelines, data models, data marts, and data warehouses, including OLAP cubes in a multidimensional data model, with proficiency / conceptual understanding of PySpark and SQL scripting. Responsible for the design, development, testing, implementation and support of functional semantic data marts using various modeling techniques from underlying data stores/data warehouses, and facilitate Business Intelligence Data Solutions. Experience in building reports, dashboards, scorecards & visualizations using Tableau / Power BI and other data analysis techniques to collect, explore, and extract insights from structured and unstructured data. Responsible for AI/ML models, utilizing machine learning, statistical methods, data mining, forecasting and predictive modeling techniques. Follow the DevOps model, Agile implementation, CI/CD methods of deployment & JIRA creation / management for projects. Define and build technical/data documentation, with experience in code version control systems (e.g., git). Assist the owner with periodic evaluation of next-generation technology & modernization of the platform.
Exhibit Leadership Principles such as Accountability & Ownership of High Standards, given the criticality & sensitivity of data, and Customer Focus: going above & beyond in finding innovative solutions and products to best serve the business needs and thereby Visa. This is a hybrid position. Expectation of days in office will be confirmed by your hiring manager. Qualifications Basic Qualifications • Bachelor's degree, or • 0.6-1.6 years of work experience with a Bachelor's Degree or Master's Degree in computer / information science with relevant work experience in the IT industry • Enthusiastic, energetic and self-learning candidates with loads of curiosity and flexibility. • Proven hands-on capability in the development of data pipelines and data engineering. • Experience in creating data-driven business solutions and solving data problems using technologies such as Hadoop, Hive, and Spark. • Ability to program in one or more scripting languages such as Python and one or more programming languages such as Java or Scala. • Familiarity with AI-centric libraries like TensorFlow, PyTorch, and Keras. • Familiarity with machine learning algorithms and statistical models is beneficial. • Critical ability to interpret complex data and provide actionable insights. This encompasses statistical analysis, predictive modeling, and data visualization. • Extended experience in Agile Release Management practices, governance, and planning. • Strong leadership skills with demonstrated ability to lead global, cross-functional teams. Additional Information Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.
Posted 1 month ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
At Aramya, we’re redefining fashion for India’s underserved Gen X/Y women, offering size-inclusive, comfortable, and stylish ethnic wear at affordable prices. Launched in 2024, we’ve already achieved ₹40 Cr in revenue in our first year, driven by a unique blend of data-driven design, in-house manufacturing, and a proprietary supply chain. Today, with an ARR of ₹100 Cr, we’re scaling rapidly with ambitious growth plans for the future. Our vision is bold: to build the most loved fashion and lifestyle brands across the world while empowering individuals to express themselves effortlessly. Backed by marquee investors like Accel and Z47, we’re on a mission to make high-quality ethnic wear accessible to every woman. We’ve built a community of loyal customers who love our weekly design launches, impeccable quality, and value-for-money offerings. With a fast-moving team driven by creativity, technology, and customer obsession, Aramya is more than a fashion brand—it’s a movement to celebrate every woman’s unique journey. We’re looking for a passionate Data Engineer with a strong foundation. The ideal candidate should have a solid understanding of D2C or e-commerce platforms and be able to work across the stack to build high-performing, user-centric digital experiences. Key Responsibilities Design, build, and maintain scalable ETL/ELT pipelines using tools like Apache Airflow, Databricks , and Spark . Own and manage data lakes/warehouses on AWS Redshift (or Snowflake/BigQuery). Optimize SQL queries and data models for analytics, performance, and reliability. Develop and maintain backend APIs using Python (FastAPI/Django/Flask) or Node.js . Integrate external data sources (APIs, SFTP, third-party connectors) and ensure data quality & validation. Implement monitoring, logging, and alerting for data pipeline health. Collaborate with stakeholders to gather requirements and define data contracts. Maintain infrastructure-as-code (Terraform/CDK) for data workflows and services. 
Must-Have Skills Strong in SQL and data modeling (OLTP and OLAP). Solid programming experience in Python , preferably for both ETL and backend. Hands-on experience with Databricks , Redshift , or Spark . Experience with building and managing ETL pipelines using tools like Airflow , dbt , or similar. Deep understanding of REST APIs , microservices architecture, and backend design patterns. Familiarity with Docker , Git, CI/CD pipelines. Good grasp of cloud platforms (preferably AWS ) and services like S3, Lambda, ECS/Fargate, CloudWatch. Nice-to-Have Skills Exposure to streaming platforms like Kafka, Kinesis, or Flink. Experience with Snowflake , BigQuery , or Delta Lake . Proficient in data governance , security best practices, and PII handling. Familiarity with GraphQL , gRPC , or event-driven systems. Knowledge of data observability tools (Monte Carlo, Great Expectations, Datafold). Experience working in a D2C/eCommerce or analytics-heavy product environment.
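The ETL pipelines and data-quality validation described above follow an extract-transform-load shape regardless of the orchestrator. A minimal sketch in plain Python (no Airflow required); the record fields, validation rules, and daily-rollup "load" target are illustrative assumptions, not Aramya's actual stack:

```python
from datetime import date

def extract():
    # Stand-in for pulling rows from an API, SFTP drop, or OLTP replica.
    return [
        {"order_id": 1, "amount": "1299.00", "ordered_on": "2024-05-01"},
        {"order_id": 2, "amount": "bad-data", "ordered_on": "2024-05-02"},
        {"order_id": 3, "amount": "899.50", "ordered_on": "2024-05-02"},
    ]

def transform(rows):
    # Data-quality gate: coerce types and drop rows that fail validation.
    clean = []
    for r in rows:
        try:
            clean.append({
                "order_id": int(r["order_id"]),
                "amount": float(r["amount"]),
                "ordered_on": date.fromisoformat(r["ordered_on"]),
            })
        except ValueError:
            continue  # in production: route to a rejects table and alert
    return clean

def load(rows):
    # Stand-in for writing to Redshift/Snowflake; here, aggregate per day.
    daily = {}
    for r in rows:
        daily[r["ordered_on"]] = daily.get(r["ordered_on"], 0.0) + r["amount"]
    return daily

result = load(transform(extract()))
print(result)  # {date(2024, 5, 1): 1299.0, date(2024, 5, 2): 899.5}
```

In an Airflow deployment each of these three functions would become a task in a DAG, with the rejects path feeding the monitoring and alerting mentioned in the responsibilities.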
Posted 1 month ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Total 6+ years of experience. 3 years of experience with leading automation tools and white-box testing (Java APIs), i.e. JUnit. 2 years of software development in Java 2EE. Experience in other automation tools, i.e. Selenium, Mercury tools, or a self-created test-harness tool. 4-year college degree in Computer Science or a related field, i.e. BE or MCA. Good understanding of XML, XSL/XSLT, RDBMS and Unix platforms. Experience in multi-dimensional (OLAP technology), Data Warehouse and financial software would be desirable. Motivated individual in learning leading-edge technology and testing complex software. Career Level - IC3 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. 
We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 month ago
7.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description: We are seeking an experienced Engineer with strong expertise in PostgreSQL, PL/SQL programming, and cloud-based data migration. The ideal candidate will have hands-on experience in migrating and tuning databases, particularly from Oracle to PostgreSQL on GCP (AlloyDB / Cloud SQL), and be skilled in modern data architecture and cloud services. Locations - Indore/Bengaluru/Noida Key Responsibilities Design, build, test, and maintain scalable data architectures on GCP. Lead Oracle to PostgreSQL data migration initiatives (preferably AlloyDB / Cloud SQL). Optimize PostgreSQL performance (e.g., tuning autovacuum, stored procedures). Translate Oracle PL/SQL code to PostgreSQL equivalents. Integrate hybrid data storage using GCP services (BigQuery, Firestore, MemoryStore, Spanner). Implement database job scheduling, disaster recovery, and logging. Work with GCP Dataflow, MongoDB, and data migration services. Mentor and lead database engineering teams. Required Technical Skills Advanced PostgreSQL & PL/SQL programming (queries, procedures, functions). Strong experience with database migration (Oracle ➝ PostgreSQL on GCP). Proficient in Cloud SQL, AlloyDB, and performance tuning. Hands-on experience with BigQuery, Firestore, Spanner, MemoryStore, MongoDB, Cloud Dataflow. Understanding of OLTP and OLAP systems. Desirable Qualifications GCP Database Engineer Certification Exposure to Enterprise Architecture, Project Delivery, and Performance Benchmarking Strong analytical, problem-solving, and leadership skills. Years Of Experience- 7 to 10 Years Education/Qualification- BE / B.Tech / MCA / M.Tech / M.Com Interested candidates can directly share their resume at anubhav.pathania@impetus.com
Posted 1 month ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Experience- 5 to 9 Years Location- Pune Job Type- Contract for Client Responsibilities Job Description- Development of high-quality database solutions Develop, implement, and optimize stored procedures and functions using SQL Review and interpret ongoing business report requirements Analyze existing SQL queries for performance improvements Gather user requirements and identify new features Provide data management support to users Ensure all database programs meet company and performance requirements Build appropriate and useful reporting deliverables Suggest new queries Provide timely scheduled management reporting Investigate exceptions about asset movements Mentor junior team members as needed Work with data architects to ensure that solutions are aligned with company-wide technology directions Required Skills Technical Bachelor’s degree in IT, Computer Science, or related field 5+ years of experience as a SQL Developer or similar role Strong proficiency with SQL and its variations among popular databases (Snowflake) Strong skills in performance tuning of complex SQL queries, procedures, and indexing strategies Experience in designing OLAP databases using data warehouse patterns and schemas, including facts, dimensions, sorting keys, indexes, constraints etc. Query design and performance tuning of complex queries for very large data sets Knowledge of best practices when dealing with relational databases Capable of troubleshooting common database issues Translating functional and technical requirements into detailed design Data analysis experience, for example – mapping the source to target rules and fields Job Category: SQL Developer Job Type: Contract Job Location: Pune
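The query-tuning and indexing-strategy skills listed above come down to checking what the planner does before and after an index exists. A minimal sketch using Python's built-in sqlite3 and `EXPLAIN QUERY PLAN` (table and column names are illustrative assumptions; production tuning would target Snowflake or another warehouse):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER)")
cur.executemany("INSERT INTO trades (symbol, qty) VALUES (?, ?)",
                [("AAPL", 10), ("MSFT", 5), ("AAPL", 7)])

query = "SELECT SUM(qty) FROM trades WHERE symbol = ?"

# Without an index, the planner falls back to a full table scan.
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query, ("AAPL",)).fetchall()
print(plan_before[0][3])  # e.g. "SCAN trades"

cur.execute("CREATE INDEX idx_trades_symbol ON trades (symbol)")

# With the index, the planner searches via idx_trades_symbol instead.
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query, ("AAPL",)).fetchall()
print(plan_after[0][3])  # e.g. "SEARCH trades USING INDEX idx_trades_symbol"

total = cur.execute(query, ("AAPL",)).fetchone()[0]
print(total)  # 17
```

The same before/after plan comparison is how one validates an indexing strategy on any engine; only the explain syntax and plan vocabulary differ.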
Posted 1 month ago