
954 OLAP Jobs - Page 22

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Greetings from Tata Consultancy Services. Join the walk-in drive on 21st June 2025 and pave your path to value with the TCS AI Cloud Team. We are hiring for the skills below. Experience: 4 to 12 years. Azure Data Engineer. Required: implementation and operations of OLTP, OLAP, and DW technologies such as Azure SQL and Azure SQL DW.

Posted 1 month ago

Apply

0.0 - 20.0 years

0 Lacs

Bengaluru, Karnataka

On-site

- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL

Over the past 20 years Amazon has earned the trust of over 300 million customers worldwide by providing unprecedented convenience, selection and value on Amazon.com. By deploying Amazon Pay's products and services, merchants make it easy for these millions of customers to safely purchase from their third-party sites using the information already stored in their Amazon account. In this role, you will lead data engineering efforts to drive automation for the Amazon Pay organization. You will be part of the data engineering team that will envision, build and deliver high-performance, fault-tolerant data pipelines. As a Data Engineer, you will work with cross-functional partners from Science, Product, SDEs, Operations and leadership to translate raw data into actionable insights for stakeholders, empowering them to make data-driven decisions.

Key job responsibilities:
- Design, implement, and support a platform providing ad-hoc access to large data sets
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources
- Implement data structures using best practices in data modeling, ETL/ELT processes, SQL, Redshift, and OLAP technologies
- Model data and metadata for ad-hoc and pre-built reporting
- Interface with business customers, gathering requirements and delivering complete reporting solutions
- Build robust and scalable data integration (ETL) pipelines using SQL, Python and Spark
- Build and deliver high-quality data sets to support business analysts, data scientists, and customer reporting needs
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers

Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
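The extract-transform-load responsibilities above can be sketched in miniature. The following is a hypothetical, minimal ETL pass using Python's built-in sqlite3 module as a stand-in for both the source system and the warehouse; all table and column names are invented for illustration, not taken from any real Amazon system.

```python
import sqlite3

# Hypothetical source system: an in-memory SQLite database with raw payments.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE payments (order_id TEXT, amount_cents INTEGER, currency TEXT)")
src.executemany(
    "INSERT INTO payments VALUES (?, ?, ?)",
    [("o1", 1250, "USD"), ("o2", 300, "USD"), ("o3", 9900, "INR")],
)

# Extract: pull raw rows with SQL.
rows = src.execute("SELECT order_id, amount_cents, currency FROM payments").fetchall()

# Transform: convert cents to a decimal amount and tag the load batch.
transformed = [(oid, cents / 100.0, cur, "batch-001") for oid, cents, cur in rows]

# Load: write into a warehouse-style fact table.
dw = sqlite3.connect(":memory:")
dw.execute("CREATE TABLE fact_payments (order_id TEXT, amount REAL, currency TEXT, batch_id TEXT)")
dw.executemany("INSERT INTO fact_payments VALUES (?, ?, ?, ?)", transformed)

total_usd = dw.execute(
    "SELECT SUM(amount) FROM fact_payments WHERE currency = 'USD'"
).fetchone()[0]
```

In production the same three stages would typically run on Spark or AWS Glue against Redshift rather than SQLite, but the shape of the pipeline is the same.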

Posted 1 month ago

Apply

3.0 - 7.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Job Description

Position Overview: We are seeking a skilled FLEXCUBE Reports Developer with expertise in Qlik Sense to join our team. The ideal candidate will be responsible for designing, developing, and maintaining reports and dashboards that provide valuable insights from FLEXCUBE core banking data.

Key Responsibilities:
- Report Development: Design and create interactive reports and dashboards using Qlik Sense to visualize FLEXCUBE data for business users.
- FLEXCUBE 14.7 Backend Tables: Mastery of the FLEXCUBE 14.7 backend tables and data model is essential.
- Data Modelling: Develop data models and relationships within Qlik Sense to ensure accurate representation of FLEXCUBE data.
- Customization: Customize reports to meet specific business requirements and ensure they align with industry best practices.
- Performance Optimization: Optimize report performance for efficient data retrieval and rendering.
- Data Integration: Integrate data from various sources into Qlik Sense reports, including FLEXCUBE and other data repositories.
- Data Security: Implement data security and access controls within Qlik Sense to protect sensitive information.
- User Training: Provide training and support to end-users to enable them to effectively utilize Qlik Sense reports.
- Documentation: Maintain documentation for reports, data models, and best practices.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3 to 7 years of proven experience in developing reports and dashboards using Qlik Sense.
- Familiarity with FLEXCUBE core banking systems.
- Familiarity with OLAP cubes, data marts, and data warehouses.
- Proficiency in data modelling and data visualization concepts.
- Strong SQL skills for data extraction and transformation.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Banking or financial industry experience is beneficial.
- Qlik Sense certifications are a plus.
Additional Information: This role offers an opportunity to work with cutting-edge reporting and analytics tools in the banking sector. The candidate should be prepared to work closely with business stakeholders and contribute to data-driven decision-making. Candidates with a strong background in FLEXCUBE reports development and Qlik Sense are encouraged to apply. We are committed to providing a collaborative and growth-oriented work environment. Career Level - IC2

An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation. At Oracle, we believe that innovation starts with diversity and inclusion, and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions.

Posted 1 month ago

Apply

3.0 years

0 Lacs

India

On-site

Who Are We

Third Chair (YC X25) is building AI agents for in-house legal teams. The team comprises two second-time founders with past exits. Yoav previously co-founded the social media analytics startup Trendpop (YC W21), which scaled to $1M in ARR in 16 months building a platform that processed millions of social posts per day. Shourya previously co-founded the consumer finance startup Fello (YC W22), which scaled to over 2 million users and managed over 600k monthly active users and over $250,000 in monthly investments. Third Chair is building vertical AI-native workflows for legal teams that help them complete end-to-end workflows that previously required hundreds of hours. This is accomplished by building state-of-the-art AI agents that browse the web, download and collect evidence, draft letters, and more. We've grown 88% in the last month and went from 0 to $100k ARR in 5 months.

What Makes You a Good Fit
- 3+ years of hands-on experience developing production-level Node.js/TypeScript backends.
- Strong experience with structured DBMSs like PostgreSQL and OLAP databases like Redshift.
- Strong understanding of AWS services such as ECS, RDS, S3, CloudWatch, ElastiCache.
- Experience working with telemetry, CI/CD and IaC pipelines.
- Comfortable with US timezones.

What Makes You a Great Fit
- Past experience with OpenAI APIs for completions, function calling and building context-aware assistants.
- Past experience with Go routines.
- Experience building multi-agent systems using frameworks like CrewAI.
- Strong sense of cost optimization strategies, system design, and building efficient API stacks.

Benefits
- Work from anywhere: we're a distributed team across multiple timezones with a focus on outputs instead of location or working hours.
- Generous PTO policy.
- Competitive pay bracket.
- Equity at a fast-growing YC-backed company in a disruptive market.

Posted 1 month ago

Apply

12.0 years

1 - 6 Lacs

Hyderābād

On-site

The Windows Data Team is responsible for developing and operating one of the world's largest data ecosystems: PiBs of data are processed, stored, and accessed every day. In addition to Azure, Fabric, and Microsoft offerings, the team also utilizes modern open-source technologies such as Spark, StarRocks, and ClickHouse. Thousands of developers in Windows, Bing, Ads, Edge, MSN, etc. work on top of the data products that the team builds. We're looking for passionate engineers to join us in the mission of powering Microsoft businesses through the data substrate and infusing our data capabilities into the industry.

We are looking for a Principal Software Engineering Manager who can lead a team to design, develop, and maintain data pipelines and applications using Spark, SQL, map-reduce, and other technologies on our big data platforms. You will work with a team of data scientists, analysts, and engineers to deliver high-quality data solutions that support our business goals and customer needs. You will also collaborate with other teams across the organization to ensure data quality, security, and compliance.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities
- Lead a team of software developers to develop and optimize data pipelines and applications using Spark, Cosmos, Azure, SQL, and other frameworks.
- Implement data ingestion, transformation, and processing logic using various data sources and formats.
- Perform data quality checks, testing, and debugging to ensure data accuracy and reliability.
- Document and maintain data pipeline specifications, code, and best practices.
- Research and evaluate new data technologies and tools to improve data performance and scalability.
- Work with a world-class engineer/scientist team on big data, analytics, and OLAP/OLTP.
- Embrace both Microsoft technology and cutting-edge open-source technology.

Qualifications

Required Qualifications:
- Bachelor's degree in Computer Science or related technical field AND 12+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR Master's degree in Computer Science or related technical field AND 10+ years technical engineering experience with coding in those languages; OR equivalent experience.
- 4+ years people management experience.
- Demonstrated working knowledge of cloud and distributed computing platforms such as Azure or AWS.
- Strong knowledge and experience with MapReduce, Spark, Kafka, Synapse, Fabric, or other data processing frameworks.
- Fluent in English, both written and spoken.

Preferred Qualifications:
- Experience with Cosmos DB or other NoSQL databases.
- Experience in data engineering, data analysis, or related data fields.
- Experience with data science and ML tools such as Scikit-learn, R, Azure AI, PySpark, or similar.
- Experience with data modeling, data warehousing, and ETL techniques.
- Experience in designing, developing, and shipping services with secure continuous integration and continuous delivery (CI/CD) practices.
- Relational and/or non-relational (NoSQL) databases.
- C/C++ and lower-level languages are a plus.

#W+Djobs #WindowsIndia #WDXIndia

Microsoft is an equal opportunity employer.
Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
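The Spark/map-reduce processing model named in the responsibilities above can be illustrated with the classic word-count job. This is a hypothetical sketch in plain Python with invented input lines; the map, shuffle, and reduce stages mirror what a distributed engine performs across a cluster.

```python
from collections import Counter
from functools import reduce

# Invented sample input standing in for lines of a large text dataset.
lines = ["windows data team", "data pipelines at scale", "data quality checks"]

# Map stage: emit a (word, 1) pair for every word in every line.
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle + reduce stage: group pairs by key and sum the counts.
counts = reduce(lambda acc, kv: acc + Counter({kv[0]: kv[1]}), mapped, Counter())
```

On a real cluster the map stage runs in parallel on partitions of the data and the shuffle redistributes pairs by key before reducing; the single-process version above keeps only the logical structure.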

Posted 1 month ago

Apply

3.0 years

1 - 4 Lacs

Bengaluru

On-site

Job Description

Position Overview: We are seeking a skilled FLEXCUBE Reports Developer with expertise in Qlik Sense to join our team. The ideal candidate will be responsible for designing, developing, and maintaining reports and dashboards that provide valuable insights from FLEXCUBE core banking data.

Key Responsibilities:
- Report Development: Design and create interactive reports and dashboards using Qlik Sense to visualize FLEXCUBE data for business users.
- FLEXCUBE 14.7 Backend Tables: Mastery of the FLEXCUBE 14.7 backend tables and data model is essential.
- Data Modelling: Develop data models and relationships within Qlik Sense to ensure accurate representation of FLEXCUBE data.
- Customization: Customize reports to meet specific business requirements and ensure they align with industry best practices.
- Performance Optimization: Optimize report performance for efficient data retrieval and rendering.
- Data Integration: Integrate data from various sources into Qlik Sense reports, including FLEXCUBE and other data repositories.
- Data Security: Implement data security and access controls within Qlik Sense to protect sensitive information.
- User Training: Provide training and support to end-users to enable them to effectively utilize Qlik Sense reports.
- Documentation: Maintain documentation for reports, data models, and best practices.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3 to 7 years of proven experience in developing reports and dashboards using Qlik Sense.
- Familiarity with FLEXCUBE core banking systems.
- Familiarity with OLAP cubes, data marts, and data warehouses.
- Proficiency in data modelling and data visualization concepts.
- Strong SQL skills for data extraction and transformation.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Banking or financial industry experience is beneficial.
- Qlik Sense certifications are a plus.
Additional Information: This role offers an opportunity to work with cutting-edge reporting and analytics tools in the banking sector. The candidate should be prepared to work closely with business stakeholders and contribute to data-driven decision-making. Candidates with a strong background in FLEXCUBE reports development and Qlik Sense are encouraged to apply. We are committed to providing a collaborative and growth-oriented work environment. Career Level - IC2

Diversity and Inclusion: An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation. Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes core elements such as medical coverage, life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business. At Oracle, we believe that innovation starts with diversity and inclusion, and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions. That's why we're committed to creating a workforce where all individuals can do their best work. It's when everyone's voice is heard and valued that we're inspired to go beyond what's been done before.

Posted 1 month ago

Apply

0 years

5 - 8 Lacs

Bengaluru

On-site

Position Summary: The BI Developer will possess advanced knowledge of SQL Server, SSIS, SSRS, and T-SQL, along with some data modelling experience. This role requires outstanding analytical and problem-solving expertise and a good grasp of the technical side of Business Intelligence.

PRINCIPAL JOB RESPONSIBILITIES:
- Logical and physical design of the database; creation of tables, views, procedures, packages, triggers and other database objects.
- Generating scripts of database objects.
- Writing and debugging complex queries.
- Performance tuning and SQL tuning.
- Product development knowledge, with familiarity with issues related to database design and schema changes.
- Good experience in reporting and dashboarding, especially using SSRS.
- Experience in design and development of stored procedures and views necessary to support SSRS reports.
- Experience in creating complex SSIS packages with translation handling and logging is a plus.
- Experience in OLAP, especially using SSAS, is a plus.

Education: B Tech / BE / M Tech / MCA / BSc / MSc from a reputed university.

Skills: BC - Dependability and Reliability; BC - Initiative; BC - Time Management; DC - US Healthcare Domain Knowledge; FC - Client Focus; FC - Oral Communication; FC - Written Communication; PC - Jiva Product Knowledge

Competencies: BC - Collaboration & Interpersonal Skills; BC - Time Management; FC - Analytical Skills; FC - Communication Skills; FC - Quality; TC - Relational Database - SQL Server; TC - SQL Server Analysis Services (SSAS) - Tabular; TC - SQL Server Integration Services (SSIS); TC - SQL Server Reporting Services (SSRS)
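Performance tuning and SQL tuning, listed above, usually start with reading the query plan before and after adding an index. The following is a hypothetical, minimal illustration using SQLite's EXPLAIN QUERY PLAN from Python (the same idea applies to SQL Server execution plans); the claims table and its contents are invented.

```python
import sqlite3

# Invented table standing in for a large reporting table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER, member_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [(i, i % 100, float(i)) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM claims WHERE member_id = 42"

# Without an index the planner must scan every row of the table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

conn.execute("CREATE INDEX idx_claims_member ON claims(member_id)")

# With the index the planner can seek directly to the matching rows.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

total = conn.execute(query).fetchone()[0]
```

The plan text changes from a full table scan to an index search, which is the signal a tuning pass looks for on any relational engine.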

Posted 1 month ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Summary: Effectively facilitate meetings and brainstorming sessions with the business as well as the technical team. Requires strategic design and mapping of business requirements and solutions to system/technical requirements. Examine functional product requirements and break them down into detailed technical stories and tasks. Construct use-case diagrams and workflow charts to help clarify and elaborate upon technical requirements. Identify and engage all key stakeholders, contributors, and business and technical resources required for product updates, and ensure contributors are motivated to complete tasks within the parameters of the requirements. Work with the entire team and customers to resolve any conflicts or confusion related to requirements or desired functionality.

Responsibilities:
- Effectively collaborate with technical and non-technical team members and customers.
- Oversee and take ownership of the successful completion of the assigned project.
- Lead ERP-level project development efforts with minimal direction from the director or manager.
- Effectively facilitate meetings and brainstorming sessions to build consensus among customer representatives and within the technical team (development + QA).
- Create detailed documentation covering use cases and business requirements.
- Scrum planning.
- Report to management and obtain approval before taking any key project decision.
- Provide guidance to technical teams regarding functional requirements.
- Ensure and validate that delivered functionality meets customers' expectations.
- Coordinate UAT efforts.
- Demo the released features/application to customers.

Key Skills (must have):
- 3+ years' work experience in the end-to-end systems development process.
- Excellent verbal and written communication skills; must have good listening skills.
- Knowledge and experience in using the Postman tool for APIs.
- Extensive knowledge of relevant technology concepts (e.g. client-server, relational databases, cloud-based and web-based architectures).
- Basic competence in at least one programming language (e.g. C#, Node.js, JavaScript, or PHP).
- Basic understanding of the implementation of ERP or CRM systems.
- Ability to quickly assimilate and apply business models and technologies.
- Team player with strong interpersonal skills and the ability to lead the team when required.
- Proactive risk analysis in the project, providing steps to the customer/internal dev team to mitigate risks.
- Extensive working knowledge of Business Intelligence concepts (e.g. reporting, querying software, OLAP, spreadsheets, dashboards, and data mining).
- Knowledge and experience with service/API patterns, including protocols and formats such as SOAP, REST, XML, and Swagger.
- Strong communication skills, including prioritizing, problem-solving and interpersonal relationship building.
- Extensive experience in technical business analysis.
- Advanced knowledge of SQL and of system integration solutions.
- Proven time management and organization skills (must be able to prioritize workload and meet deadlines).
- Excellent skills in presenting ideas through PPT or Word.

Add-on Skills:
- Experience in the BFSI (Banking, Finance & Insurance) domain.
- Knowledge of information security/identity management.
- Understanding of testing methodology and processes.
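Among the Business Intelligence concepts the role lists, OLAP boils down to grouping facts by dimensions and aggregating a measure. Below is a hypothetical sketch in plain Python with invented sales data; a real OLAP engine performs the same roll-up over cubes at far larger scale.

```python
from collections import defaultdict

# Invented fact rows: two dimensions (region, quarter) and one measure (revenue).
sales = [
    {"region": "West", "quarter": "Q1", "revenue": 100},
    {"region": "West", "quarter": "Q2", "revenue": 150},
    {"region": "East", "quarter": "Q1", "revenue": 80},
    {"region": "East", "quarter": "Q1", "revenue": 20},
]

# Cube cell: (region, quarter) -> total revenue.
cube = defaultdict(int)
for row in sales:
    cube[(row["region"], row["quarter"])] += row["revenue"]

# Roll up one dimension: totals per region across all quarters.
by_region = defaultdict(int)
for (region, _), revenue in cube.items():
    by_region[region] += revenue
```

Drilling down is the inverse operation: moving from the per-region totals back to the finer (region, quarter) cells.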

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Exp: 8 to 12 yrs | Job Location: Chennai (WFO) | Industry: IT Services & Consulting | Functional Area: IT Software – Application Programming, Maintenance | Role Category: Development | Employment Type: Full Time / Permanent | Key Skills: Power BI, DAX Queries

Job Description – Roles and Responsibilities:
- Minimum 8+ years of experience as a Power BI Developer.
- Hands-on experience in team and client handling.
- Expert knowledge of advanced calculations in MS Power BI Desktop using DAX functions such as Aggregate, Date, Logical, String, and Table functions.
- Prior experience connecting Power BI to on-premise and cloud computing platforms.
- A deep understanding of, and ability to use and explain, all aspects of relational database design, multidimensional database design, OLTP, OLAP, KPIs, scorecards, and dashboards.
- Very good understanding of dimensional modelling techniques for analytical data (i.e. facts, dimensions, measures).
- Background in data warehouse design (e.g. dimensional modelling) and data mining.
- Hands-on experience in SSIS, SSRS, and SSAS is a plus.

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 15 Lacs

Kolkata

Work from Office

Data Engineer || Product-Based MNC (Direct Payroll) || Kolkata Location

Role & responsibilities:
- Understand requirements and take part in discussions relating to the technical and functional design of the sprint/module/project.
- Design and implement end-to-end data solutions (storage, integration, processing, and visualization) in Azure.
- Ingest data into Azure Data Factory and Azure Data Lake Storage (ADLS) from various sources such as SQL Server, Excel, Oracle, and SQL Azure.
- Extract data from one database and load it into another.
- Build data architecture for ingestion, processing, and surfacing of data for large-scale applications.
- Use different scripting languages, understanding the nuances and benefits of each, to combine systems.
- Research and discover new methods to acquire data, and new applications for existing data.
- Work with other members of the data team, including data architects, data analysts, and data scientists.
- Prepare data sets for analysis and interpretation.
- Perform statistical analysis and fine-tuning using test results.
- Create libraries and extend existing frameworks.
- Create design documents based on discussions and assist in providing technical solutions for the business process.

Preferred candidate profile:
- In-depth understanding of database management systems, online analytical processing (OLAP) and the ETL (extract, transform, load) framework.
- 3+ years of overall experience with Azure, Data Factory and .NET.
- Strong in Data Factory; should be able to create manually and automatically triggered pipelines.
- Should be able to create, update, edit and delete ETL jobs in Azure Synapse Analytics.
- Recreate existing application logic and functionality in the Azure Data Lake, Data Factory, SQL Database and SQL Data Warehouse environment.
- Knowledge of SQL queries, SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS).
- Proven ability to take initiative and be innovative.
- Analytical mind with a problem-solving aptitude.

Compensation: 10 LPA - 15 LPA

Posted 1 month ago

Apply

4.0 - 6.0 years

1 - 5 Lacs

Pune

Work from Office

- 4-6 years of experience as a Python Developer with a strong understanding of Python programming concepts and best practices.
- Bachelor's Degree/B.Tech/B.E. in Computer Science or a related discipline.
- Design, develop, and maintain robust and scalable Python-based applications, tools, and frameworks that integrate machine learning models and algorithms.
- Demonstrated expertise in developing machine learning solutions, including feature selection, model training, and evaluation.
- Proficiency in data manipulation libraries (e.g. Pandas, NumPy) and machine learning frameworks (e.g. Scikit-learn, TensorFlow, PyTorch, Keras).
- Experience with web frameworks like Django or Flask.
- Contribute to the architecture and design of data-driven solutions, ensuring they meet both functional and non-functional requirements.
- Experience with databases such as MS SQL Server, PostgreSQL or MySQL; solid knowledge of OLTP and OLAP concepts.
- Experience with CI/CD tooling (at least Git and Jenkins).
- Experience with the Agile/Scrum/Kanban way of working.
- Self-motivated and hard-working.
- Knowledge of performance testing frameworks, including Mocha and Jest.
- Knowledge of RESTful APIs.
- Understanding of AWS and Azure cloud services.
- Experience with chatbot and NLU/NLP-based applications is required.

Qualifications: Bachelor's Degree/B.Tech/B.E. in Computer Science or a related discipline.
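The model training and evaluation workflow the role describes is, at its core, fit-then-score. Below is a deliberately library-free, hypothetical sketch using a 1-nearest-neighbour classifier with invented data points; in practice this would be done with Scikit-learn, TensorFlow, or PyTorch as the posting lists.

```python
import math

# Invented 2-D feature vectors with labels, split into train and held-out test sets.
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
test = [((0.05, 0.1), "a"), ((1.05, 0.95), "b")]

def predict(point):
    # 1-nearest-neighbour: return the label of the closest training point.
    return min(train, key=lambda t: math.dist(point, t[0]))[1]

# Evaluation: accuracy on the held-out set, never on the training data.
correct = sum(predict(x) == y for x, y in test)
accuracy = correct / len(test)
```

The held-out split is the essential part: evaluating on the same data the model was fit on would overstate its quality, which is why every framework separates train and test sets.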

Posted 1 month ago

Apply

2.0 - 3.0 years

6 - 10 Lacs

Vadodara

Work from Office

Job Description: When looking to buy a product, whether in a brick-and-mortar store or online, it can be hard enough to find one that not only has the characteristics you are looking for but is also at a price you are willing to pay. It can be especially frustrating when you finally find one, but it is out of stock. Likewise, brands and retailers can have a difficult time getting the visibility they need to ensure you have as seamless an experience as possible in selecting their product. We at Wiser believe that shoppers should have this seamless experience, and we want to deliver it by providing brands and retailers the visibility they need to make that belief a reality. Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze lots of structured and semi-structured data from lots of different places every day (whether it's 20 million+ products from 500+ websites or data collected from over 300,000 brick-and-mortar stores across the country). We help our customers be more competitive by discovering interesting patterns in this data they can use to their advantage, while being uniquely positioned to do this across both online and in-store. We are looking for a lead-level software engineer to lead the charge on a team of like-minded individuals responsible for developing the data architecture that powers our data collection process and analytics platform. If you have a passion for optimization, scaling, and integration challenges, this may be the role for you.

What You Will Do:
- Think like our customers: work with product and engineering leaders to define data solutions that support customers' business practices.
- Design, develop, and extend our data pipeline services and architecture to implement your solutions: you will be collaborating on some of the most important and complex parts of our system, which form the foundation for the business value our organization provides.
- Foster team growth: provide mentorship to junior team members and evangelize your expertise to others.
- Improve the quality of our solutions: help build enduring trust within our organization and amongst our customers by ensuring high quality standards for the data we manage.
- Own your work: take responsibility for shepherding your projects from idea through delivery into production.
- Bring new ideas to the table: some of our best innovations originate within the team.

Technologies We Use:
- Languages: SQL, Python
- Infrastructure: AWS, Docker, Kubernetes, Apache Airflow, Apache Spark, Apache Kafka, Terraform
- Databases: Snowflake, Trino/Starburst, Redshift, MongoDB, Postgres, MySQL
- Others: Tableau (as a business intelligence solution)

Qualifications:
- Bachelor's/Master's degree in Computer Science or a relevant technical degree
- 10+ years of professional software engineering experience
- Strong proficiency with data languages such as Python
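The pipeline orchestration provided by tools like Apache Airflow, listed above, rests on dependency ordering of stages. Below is a hypothetical sketch using Python's standard-library graphlib; the stage names loosely echo the collect/categorize/analyze flow described in the posting but are otherwise invented.

```python
from graphlib import TopologicalSorter

# Invented pipeline stages mapped to the stages they depend on.
stages = {
    "collect": set(),
    "categorize": {"collect"},
    "analyze": {"categorize"},
    "publish_dashboards": {"analyze"},
    "match_products": {"categorize"},
}

# A valid execution order: every stage appears after all of its dependencies.
order = list(TopologicalSorter(stages).static_order())
```

An orchestrator adds scheduling, retries, and parallelism on top, but the topological sort above is the core idea: no stage runs before the stages it depends on.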

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Introduction: A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation of success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience.

Your Role and Responsibilities:
- Develop, test and support future-ready data solutions for customers across industry verticals.
- Develop, test, and support end-to-end batch and near-real-time data flows/pipelines.
- Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies.
- Communicate risks and ensure understanding of these risks.

Preferred Education: Master's Degree

Required Technical and Professional Expertise:
- Minimum of 5+ years of related experience.
- Experience in modeling and business system design.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Great expertise in writing T-SQL code.
- Well versed with data warehouse schemas and OLAP techniques.

Preferred Technical and Professional Experience:
- Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate.
- Must be a strong team player/leader.
- Ability to lead a data transformation project with multiple junior data engineers.
- Strong oral, written and interpersonal skills for interacting throughout all levels of the organization.
- Ability to clearly communicate complex business problems and technical solutions.
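The data warehouse schemas and OLAP techniques the role asks for most commonly take the form of a star schema: a central fact table joined to dimension tables and rolled up. The following is a minimal, hypothetical sketch using Python's built-in sqlite3; the tables and rows are invented for illustration.

```python
import sqlite3

# Invented star schema: one dimension table and one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'books'), (2, 'books'), (3, 'toys');
INSERT INTO fact_sales VALUES (1, 2, 20.0), (2, 1, 15.0), (3, 5, 50.0);
""")

# The canonical OLAP query shape: join facts to a dimension and aggregate
# up to a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
```

Facts hold the measures (quantities, amounts); dimensions hold the descriptive attributes analysts slice by. Keeping them separate is what makes these roll-up queries simple and fast.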

Posted 1 month ago

Apply

0 years

5 Lacs

Hyderābād

On-site

Job description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist In this role, you will: Coordinate with stakeholders to ensure timely deliverables. Provide solution architecture support to projects where required, ensuring that the defined solution meets business needs and is aligned to functional and target architecture, with any deviations approved. Analyze and propose a plan to decommission legacy systems. Lead a team of data engineers and assume responsibilities as Technical Lead for the assigned projects. Ensure full ownership and efficient management of the GDT IT services and products. Ensure that any new technology products are taken through the technology design governance process. Mentor and coach less experienced members of staff and promote an understanding of the value of architecture and of use of technologies and standards in their domain across IT. Periodically monitor team progress. Deliver optimum solutions that meet client requirements. Provide inputs for estimations, and monitor and coordinate team-related activities.
Involved in designing, development, unit testing, and performance testing of the application Requirements To be successful in this role, you should meet the following requirements: Deep knowledge of cloud architecture on GCP, AWS or Azure, with preference for GCP Proficient in Google Cloud and cloud-native technologies Ability to work with senior stakeholders and various business parties and drive all the business discussions. A track record of making complex business decisions with authority, even in times of ambiguity, considering the potential long-term risks and implications. DevOps and automation design experience Experience with ETL tools like IBM DataStage Experience in Hadoop / Hive Experience in solution architecture. Relevant experience in BigQuery, Google Analytics, Dataflow, Pub/Sub, Cloud SQL, QlikSense, Spark Experience designing, building and configuring applications to meet business process requirements. Experience with on-prem to GCP data migration projects Concepts: RDBMS, SDLC, OLAP, logical/physical dimensional modeling. Programming languages: PL/SQL, SQL, Unix scripting, Java, Python. Operating systems: Windows 2000/NT, Unix, Linux. Must have experience working in an Agile environment and should be well versed with Agile/Scrum practices Experience in building business intelligence solutions You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSDI

Posted 1 month ago

Apply

4.0 - 9.0 years

20 - 30 Lacs

Pune, Bengaluru

Hybrid

Job role & responsibilities:- Understand operational needs by collaborating with specialized teams Support key business operations: this involves supporting architecture design and improvements, understanding data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions Lead a team of developers; implement sprint planning and execution to ensure timely deliveries Technical Skills, Qualification and experience required:- Proficient in data modeling 4-10 years of experience in data modeling Experience with data modeling tools (e.g., Erwin); building ER diagrams Hands-on experience with the Erwin / Visio tools Hands-on expertise in entity-relationship, dimensional and NoSQL modeling Familiarity with manipulating datasets using Python Exposure to Azure cloud services (Azure Data Factory, Azure DevOps and Databricks) Exposure to UML tools like Erwin/Visio Familiarity with tools such as Azure DevOps, Jira and GitHub Analytical approaches using IE or other common notations Strong hands-on experience in SQL scripting Bachelor's/Master's degree in Computer Science or related field Experience leading agile scrum, sprint planning and review sessions Good communication and interpersonal skills Good communication skills to coordinate between business stakeholders & engineers Strong results-orientation and time management True team player who is comfortable working in a global team Ability to establish relationships with stakeholders quickly in order to collaborate on use cases Autonomy, curiosity and innovation capability Comfortable working in a multidisciplinary team within a fast-paced environment * Immediate joiners will be preferred

Posted 1 month ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Position Overview We are seeking a highly motivated and experienced ETL Tester to join our team and play a crucial role in ensuring the quality and reliability of our data platform and pipelines. You will be responsible for leading the development and implementation of an automated testing framework specifically designed for validating the Extract, Transform, Load (ETL) processes. Responsibilities Design and execute test cases for ETL processes and data integration Validate the accuracy and completeness of data being loaded into the target systems Develop SQL to validate the data, such as checking for duplicates, null values, and truncated values, and ensuring correct data aggregations Run the jobs using IBM DataStage for the ETL process Execute test cases through Zephyr Validate data in the target database according to mapping and business rules Identify, isolate, and report defects and issues in the ETL processes Develop and maintain test data for ETL testing Run GitHub commands for automation Validate data in OBIEE dashboards and reports against the database Identify data quality issues in source data Collaborate with the development team to resolve defects and improve the ETL processes Participate in the design and implementation of automated testing tools and frameworks Actively participate in status reporting and Agile meetings Document test results and communicate with stakeholders on the status of ETL testing Track and report defects in applications like JIRA and Zephyr Qualifications Required Skills: Good understanding of the healthcare domain Good knowledge of SDLC and STLC with specific expertise in software testing Strong understanding of ETL processes and data warehousing Experience working with large data sets and writing complex SQL queries Testing experience in database systems like Oracle and SQL Server Experience in Toad for Oracle and DB2 applications Experience in running, monitoring, and debugging ETL jobs.
Experience with test case design and execution Understanding of data models, data mapping documents, ETL design, and ETL coding Experience within an Agile development environment (sprint planning, demos and retrospectives, and other sprint ceremonies) Understanding of BI concepts - OLAP vs OLTP Experience in OBIEE reporting Broad knowledge of automated testing and modelling tools Knowledge of build automation and deployment tools such as Jenkins Flexible with timings Excellent analytical skills and innovative problem-solving ability Possess good communication skills and be a very good team player Required Experience & Education Bachelor’s degree in computer science or related field or higher with minimum 3 years of relevant experience 3-5 years of experience in Software QA & ETL Testing ETL testers typically work closely with ETL developers, business analysts, and data engineers to understand the ETL requirements, design specifications, and data mappings. They use a combination of manual testing techniques and automated testing tools to perform ETL testing effectively. Strong SQL skills, data analysis abilities, and a good understanding of data integration concepts are essential for an ETL tester to be successful in their role. About Evernorth Health Services Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
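The SQL validation work this posting describes (duplicate, null, and aggregation checks) follows a common pattern; here is a minimal sketch using Python's built-in sqlite3, where the `claims` table and its columns are invented for illustration:

```python
import sqlite3

# Hypothetical target table loaded by an ETL job.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (claim_id TEXT, member_id TEXT, amount REAL);
INSERT INTO claims VALUES
  ('C1', 'M1', 100.0),
  ('C2', 'M2', NULL),
  ('C2', 'M2', NULL),   -- duplicate row
  ('C3', 'M3', 50.0);
""")

# Duplicate check: business keys that appear more than once.
dupes = conn.execute("""
    SELECT claim_id, COUNT(*) FROM claims
    GROUP BY claim_id HAVING COUNT(*) > 1
""").fetchall()

# Null check on a mandatory column.
nulls = conn.execute(
    "SELECT COUNT(*) FROM claims WHERE amount IS NULL"
).fetchone()[0]

# Aggregation check: total loaded amount, to compare against a control total.
total = conn.execute(
    "SELECT COALESCE(SUM(amount), 0) FROM claims"
).fetchone()[0]

print(dupes)   # [('C2', 2)]
print(nulls)   # 2
print(total)   # 150.0
```

In practice each check would run against the real target database and be asserted in a test harness, with the expected control totals taken from the source system.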

Posted 1 month ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Greetings from TATA Consultancy Services!! TCS is hiring for Data Modeler - Architect Experience Range: 10+ Years Job Location: Hyderabad (Adibatla), Chennai Job Summary: We are seeking a detail-oriented and analytical Data Modeler to design, implement, and maintain logical and physical data models that support business intelligence, data warehousing, and enterprise data integration needs. The ideal candidate will work closely with business analysts, data architects, and software engineers to ensure data is organized effectively and supports scalable, high-performance applications. Required Skills: • Strong understanding of relational, dimensional, and NoSQL data modeling techniques. • Proficient in data modeling tools (e.g., Erwin, Enterprise Architect, PowerDesigner, SQL Developer Data Modeler). • Experience with advanced SQL and major database platforms (e.g., Oracle, SQL Server, PostgreSQL, MySQL). • Familiarity with cloud data platforms (e.g., AWS Redshift, Google BigQuery, Azure SQL, Snowflake). • Excellent communication and documentation skills. • Knowledge of data governance and data quality principles. • Experience with data warehousing concepts and tools (e.g., ETL pipelines, OLAP cubes). • Familiarity with industry standards such as CDM (Common Data Model), FHIR, or other domain-specific models Key Responsibilities: • Design and develop conceptual, logical, and physical data models. • Translate business requirements into data structures that support analytics, reporting, and operational needs. • Work with stakeholders to understand and document data needs and flows. • Optimize and maintain existing data models for performance and scalability. • Ensure data models are consistent with architectural guidelines and standards. • Develop and maintain metadata repositories and data dictionaries. • Collaborate with data architects and engineers to implement models within databases and data platforms. • Assist in data quality analysis and improvement initiatives.
• Document data models and data mapping specifications. Regards, Bodhisatwa Ray
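The dimensional modeling responsibilities above typically culminate in a star schema physical model; here is a minimal sketch using Python's sqlite3, with hypothetical fact and dimension tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension tables carry descriptive attributes.
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);

-- The fact table holds measures plus foreign keys to each dimension.
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);

INSERT INTO dim_date    VALUES (20250101, 2025, 1), (20250201, 2025, 2);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales  VALUES (20250101, 1, 3, 30.0), (20250201, 2, 1, 25.0);
""")

# A typical analytical query: roll revenue up by category and month.
rows = conn.execute("""
    SELECT p.category, d.month, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY p.category, d.month
    ORDER BY d.month
""").fetchall()
print(rows)  # [('Hardware', 1, 30.0), ('Hardware', 2, 25.0)]
```

A snowflake variant would further normalize the dimensions (e.g., a separate category table); the star form trades some redundancy for simpler, faster analytical joins.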

Posted 1 month ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Hello! You've landed on this page, which means you're interested in working with us. Let's take a sneak peek at what it's like to work at Innovaccer. Engineering at Innovaccer With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point we gather into valuable insights for our customers. Join us and be part of a team that's turning dreams of better healthcare into reality, one line of code at a time. Together, we’re shaping the future and making a meaningful impact on the world. About the Role We at Innovaccer are looking for a Senior Staff Engineer (Applications) to build the most amazing product experience. You’ll get to work with other engineers to build delightful feature experiences to understand and solve our customers' pain points. We are looking for a Staff Engineer with hands-on experience in building applications and services using Python. A Day in the Life You are able to develop and drive a high-level strategy, as well as to take a hands-on approach to implementing that strategy Building efficient and reusable applications and abstractions Identify and communicate back-end best practices Participate in the project life-cycle from pitch/prototyping through definition and design to build, integration, QA and delivery Analyze and improve the performance, scalability, stability, and security of the product Improve engineering standards, tooling, and processes Provide technical leadership and direction for software development projects. Define the architectural vision and strategy in alignment with business goals and mentor and guide development teams to ensure the successful implementation of solutions. Design and architect scalable, robust, and high-performance software solutions using Python and React and conduct architectural reviews and ensure compliance with technical standards.
Conducting code reviews, mentoring junior engineers, and staying updated with emerging technologies What you will need 8+ years of experience with a start-up mentality and high willingness to learn Expert in Python and experience with any web framework (Django, FastAPI, Flask, etc.) UI/web development experience will be a plus. Application architecture and solution design. Worked on large-scale B2C/B2B SaaS. Aggressive problem diagnosis and creative problem-solving skills Expert in Kubernetes and containerization Experience in RDBMS & NoSQL databases such as Postgres and MongoDB (any OLAP database is good to have) Strong experience in cloud computing platforms like Amazon Web Services and Azure Bachelor’s degree in Information Technology/Computer Science/Computer Engineering Applied AI experience We offer competitive benefits to set you up for success in and outside of work. Here's what we offer Generous Leave Benefits: Enjoy generous leave benefits of up to 40 days. Parental Leave: Experience one of the industry's best parental leave policies to spend time with your new addition. Sabbatical Leave Policy: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered. Health Insurance: We offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury. Pet-Friendly Office*: Spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. *Noida office only Creche Facility for children*: Say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. *India offices Where and how we work Our Noida office is situated in a posh techspace, equipped with various amenities to support our work environment.
Here, we follow a five-day work week schedule, allowing us to efficiently carry out our tasks and collaborate effectively within our team. Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered. Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details.

Posted 1 month ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana

On-site

Principal Software Engineering Manager Hyderabad, Telangana, India Date posted Jun 13, 2025 Job number 1830297 Work site Microsoft on-site only Travel 0-25 % Role type People Manager Profession Software Engineering Discipline Software Engineering Employment type Full-Time Overview The Windows Data Team is responsible for developing and operating one of the world’s largest data ecosystems: petabytes of data are processed, stored, and accessed every day. In addition to Azure, Fabric, and Microsoft offerings, the team also utilizes modern open-source technologies such as Spark, StarRocks, and ClickHouse. Thousands of developers in Windows, Bing, Ads, Edge, MSN, etc. are working on top of the data products that the team builds. We’re looking for passionate engineers to join us for the mission of powering Microsoft businesses through our data substrate and infusing our data capabilities into the industry. We are looking for a Principal Software Engineering Manager who can lead a team to design, develop, and maintain data pipelines and applications using Spark, SQL, map-reduce, and other technologies on our big data platforms. You will work with a team of data scientists, analysts, and engineers to deliver high-quality data solutions that support our business goals and customer needs. You will also collaborate with other teams across the organization to ensure data quality, security, and compliance. Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Qualifications Required Qualifications: Bachelor's Degree in Computer Science or related technical field AND 12+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR Master's Degree in Computer Science or related technical field AND 10+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR equivalent experience. 4+ years people management experience. Demonstrate working knowledge of cloud and distributed computing platforms such as Azure or AWS. Strong knowledge and experience with Map Reduce, Spark, Kafka, Synapse, Fabric, or other data processing frameworks. Fluent in English, both written and spoken. Preferred Qualifications: Experience with CosmosDB or other NoSQL databases is a plus. Experience in data engineering, data analysis, or data related fields. Experience with data science and ML tools such as Scikit-learn, R, Azure AI, Pyspark, or similar. Experience with data modeling, data warehousing, and ETL techniques. Experience in designing, developing, and shipping services with secure continuous integration and continuous delivery practices (CI/CD). Relational and/or non-relational (NoSQL) databases. C/C++ and lower-level languages are a plus. #W+Djobs #WindowsIndia #WDXIndia Responsibilities Lead a team of software developers to develop and optimize data pipelines and applications using Spark, Cosmos, Azure, SQL, and other frameworks. Implement data ingestion, transformation, and processing logic using various data sources and formats. Perform data quality checks, testing, and debugging to ensure data accuracy and reliability. Document and maintain data pipeline specifications, code, and best practices. Research and evaluate new data technologies and tools to improve data performance and scalability. Work with world-class engineer/scientist team on Big Data, Analytics and OLAP/OLTP. 
Embrace both Microsoft technology and cutting-edge open-source technology. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect. Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
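The map-reduce processing this role centers on boils down to applying a map step to partitioned records and reducing the results by key; here is a framework-free sketch of the pattern in plain Python, using invented telemetry records in place of a real Spark job:

```python
from collections import Counter
from functools import reduce

# Invented telemetry records, standing in for a partitioned input.
partitions = [
    ["edge crash", "edge ok", "bing ok"],
    ["edge ok", "msn crash", "bing ok"],
]

# Map: each partition emits (event, count) pairs independently.
mapped = [Counter(record.split()[1] for record in part) for part in partitions]

# Reduce: merge per-partition counts by key, as a shuffle/reduce stage would.
totals = reduce(lambda a, b: a + b, mapped, Counter())
print(dict(totals))  # {'crash': 2, 'ok': 4}
```

Spark's RDD/DataFrame APIs apply the same shape at scale, with the framework handling partitioning, shuffling, and fault tolerance.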

Posted 1 month ago

Apply

0.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Senior Data Engineer (Contract) Location: Bengaluru, Karnataka, India About the Role: We're looking for an experienced Senior Data Engineer (6-8 years) to join our data team. You'll be key in building and maintaining our data systems on AWS. You'll use your strong skills in big data tools and cloud technology to help our analytics team get valuable insights from our data. You'll own our data pipelines end to end, making sure the data is accurate, reliable, and fast. What You'll Do: Design and build efficient data pipelines using Spark/PySpark/Scala. Manage complex data processes with Airflow, creating and fixing any issues with the workflows (DAGs). Clean, transform, and prepare data for analysis. Use Python for data tasks, automation, and building tools. Work with AWS services like S3, Redshift, EMR, Glue, and Athena to manage our data infrastructure. Collaborate closely with the Analytics team to understand what data they need and provide solutions. Help develop and maintain our Node.js backend, using TypeScript, for data services. Use YAML to manage the settings for our data tools. Set up and manage automated deployment processes (CI/CD) using GitHub Actions. Monitor and fix problems in our data pipelines to keep them running smoothly. Implement checks to ensure our data is accurate and consistent. Help design and build data warehouses and data lakes. Use SQL extensively to query and work with data in different systems. Work with streaming data using technologies like Kafka for real-time data processing. Stay updated on the latest data engineering technologies. Guide and mentor junior data engineers. Help create data management rules and procedures. What You'll Need: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 6-8 years of experience as a Data Engineer. Strong skills in Spark and Scala for handling large amounts of data.
Good experience with Airflow for managing data workflows and understanding DAGs. Solid understanding of how to transform and prepare data. Strong programming skills in Python for data tasks and automation. Proven experience working with AWS cloud services (S3, Redshift, EMR, Glue, IAM, EC2, and Athena). Experience building data solutions for Analytics teams. Familiarity with Node.js for backend development. Experience with TypeScript for backend development is a plus. Experience using YAML for configuration management. Hands-on experience with GitHub Actions for automated deployment (CI/CD). Good understanding of data warehousing concepts. Strong database skills - OLAP/OLTP Excellent command of SQL for data querying and manipulation. Experience with stream processing using Kafka or similar technologies. Excellent problem-solving, analytical, and communication skills. Ability to work well independently and as part of a team. Bonus Points: Familiarity with data lake technologies (e.g., Delta Lake, Apache Iceberg). Experience with other stream processing technologies (e.g., Flink, Kinesis). Knowledge of data management, data quality, statistics and data governance frameworks. Experience with tools for managing infrastructure as code (e.g., Terraform). Familiarity with container technologies (e.g., Docker, Kubernetes). Experience with monitoring and logging tools (e.g., Prometheus, Grafana).
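The pipeline work this role describes follows the classic extract-transform-load shape; here is a toy, framework-free sketch in plain Python with sqlite3 standing in for the warehouse (the record layout and table name are invented):

```python
import sqlite3

def extract():
    # Stand-in for reading from S3/Kafka: raw, messy source records.
    return [
        {"user": " alice ", "amount": "10.5"},
        {"user": "BOB",     "amount": "4"},
        {"user": "",        "amount": "1"},   # invalid: no user
    ]

def transform(rows):
    # Clean and type the records; drop rows that fail validation.
    out = []
    for r in rows:
        user = r["user"].strip().lower()
        if not user:
            continue
        out.append((user, float(r["amount"])))
    return out

def load(rows, conn):
    # Idempotent-ish load into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS spend (user TEXT, amount REAL)")
    conn.executemany("INSERT INTO spend VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT user, amount FROM spend ORDER BY user").fetchall())
# [('alice', 10.5), ('bob', 4.0)]
```

In a production pipeline each step would be an Airflow task (Spark for transform, Redshift for load), but the composition and the validation boundary stay the same.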

Posted 1 month ago

Apply

8.0 years

0 Lacs

Greater Kolkata Area

On-site

Location: PAN India Duration: 6 Months Experience Required: 7-8 years Job Summary We are looking for an experienced SSAS Developer with strong expertise in developing both OLAP and Tabular models using SQL Server Analysis Services (SSAS), alongside advanced ETL development skills using tools like SSIS, Informatica, or Azure Data Factory. The ideal candidate will be well-versed in T-SQL, dimensional modeling, and building high-performance, scalable data solutions. Key Responsibilities Design, build, and maintain SSAS OLAP cubes and Tabular models Create complex DAX and MDX queries for analytical use cases Develop robust ETL workflows and pipelines using SSIS, Informatica, or ADF Collaborate with cross-functional teams to translate business requirements into BI solutions Optimize SSAS models for scalability and performance Implement best practices in data modeling, version control, and deployment automation Support dashboarding and reporting needs via Power BI, Excel, or Tableau Maintain and troubleshoot data quality, performance, and integration issues Must-Have Skills Hands-on experience with SSAS (Tabular & Multidimensional) Proficient in DAX, MDX, and T-SQL Advanced ETL skills using SSIS / Informatica / Azure Data Factory Knowledge of dimensional modeling (star & snowflake schema) Experience with Azure SQL / MS SQL Server Familiarity with Git and CI/CD pipelines Nice to Have Exposure to cloud data platforms (Azure Synapse, Snowflake, AWS Redshift) Working knowledge of Power BI or similar BI tools Understanding of Agile/Scrum methodology Bachelor's degree in Computer Science, Information Systems, or equivalent

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY Consulting - Cloud Testing: Staff The opportunity As a Cloud Test Engineer, you will be responsible for testing cloud solutions on cloud platforms and should ensure quality of deliverables. You will work closely with the Test Lead for the projects under test. Testing proficiency in the cloud and knowledge of at least one cloud platform (AWS, Azure, or GCP) are required for this position. Experience with CI/CD platforms, cloud foundations, and cloud data platforms is an added advantage. Skills And Attributes For Success Delivery of testing needs for cloud projects. Ability to communicate effectively with team members across geographies Experience in cloud infrastructure testing. Sound cloud concepts and ability to suggest options Knowledge of any of the cloud platforms (AWS/Azure/GCP). Knowledge of Azure DevOps / Jenkins / Pipelines Thorough understanding of requirements and ability to provide feedback on the requirements. Develop test strategy for cloud projects covering various aspects like platform testing, application testing, integration testing and UAT as needed. Provide inputs for test planning aligned with the test strategy. Perform test case design, identify opportunities for test automation. Develop test cases, both manual and automation scripts, as required. Ensure test readiness (test environment, test data, tool licenses, etc.) Perform test execution and report the progress. Report defects and liaise with development & other relevant teams for defect resolution. Prepare test reports and provide inputs to the Test Lead for test sign-off/closure Provide support in project meetings/calls with the client for status reporting.
Provide inputs on test metrics to the Test Lead. Support the analysis of metric trends and implement improvement actions as necessary. Handle changes and conduct regression testing Generate test summary reports Coordinate test team members and the development team Interact with client-side people to solve issues and update status Actively take part in providing analytics and advanced analytics testing training in the company To qualify for the role, you must have BE/BTech/MCA/M.Sc Overall 2 to 6 years of experience in testing cloud solutions, minimum 2 years of experience in any of the cloud solutions built on Azure/AWS/GCP Certifications in the cloud area are desirable. Exposure to Spark SQL / HiveQL testing is desirable. Exposure to data migration projects from on-premise to cloud platforms is desirable. Understanding of business intelligence concepts, architecture & building blocks in areas like ETL processing, data warehouses, dashboards and analytics. Working experience in scripting languages such as Python, JavaScript, Java. Testing experience in more than one of these areas - cloud foundation, DevOps, data quality, ETL, OLAP, reports Exposure to SQL Server or Oracle databases and proficiency with SQL scripting. Exposure to backend testing of enterprise applications/systems built on different platforms, including Microsoft .NET and SharePoint technologies Exposure to ETL testing using commercial ETL tools is desirable. Knowledge/experience in SSRS (SQL Server Reporting Services), Spotfire, and SSIS is desirable. Exposure to data transformation projects, database design concepts & white-box testing is desirable.
Ideally, you’ll also have Experience/exposure to test automation; scripting experience in Perl and shell is desirable Experience with test management and defect management tools, preferably HP ALM or JIRA Able to contribute as an individual contributor and, when required, lead a small team Able to create a test strategy and test plan for testing cloud applications/solutions that are moderate to complex / high-risk systems Design test cases and test data, and perform test execution and reporting. Should be able to perform test management for small projects as and when required Participate in defect triaging and track the defects to resolution/conclusion Good communication skills (both written and verbal) Good understanding of SDLC, and the test process in particular Good analytical, problem-solving and troubleshooting skills Good understanding of the project life cycle and test life cycle. Exposure to CMMI and process improvement frameworks is a plus. Should have excellent communication skills and be able to articulate concisely and clearly Should be ready to take on an individual contributor as well as a team leader role What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

7.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY - Consulting - Data Analytics Testing - Senior

The opportunity
As a Data Analytics Test Lead, you will be responsible for testing business intelligence and data warehousing solutions, both on-premise and on cloud platforms, and for ensuring the quality of deliverables. You will work closely with the Test Manager on the projects under test. Testing proficiency in ETL, data warehousing and business intelligence is required for this position. Experience in testing Big Data/unstructured data using the Hadoop/Spark framework, cloud platform knowledge (AWS or Azure), and knowledge of predictive analytics, machine learning and artificial intelligence are an added advantage.

Skills And Attributes For Success
· Develop test strategies for BI & DWH projects covering ETL testing, report testing (front-end and back-end), integration testing and UAT as needed
· Provide inputs for test planning aligned with the test strategy
· Perform test case design and identify opportunities for test automation
· Develop test cases, both manual and automation scripts as required (preferably Python scripts)
· Ensure test readiness (test environment, test data, tool licenses, etc.)
· Deliver the testing needs of BI & DWH projects
· Perform unstructured data / big data testing both on-premise and on cloud platforms
· Thoroughly understand the requirements and provide feedback on them
· Perform test execution and report on progress
· Report defects and liaise with development and other relevant teams for defect resolution
· Prepare test reports and provide inputs to the Test Manager/Lead for test sign-off/closure
· Support project meetings/calls with the client for status reporting
· Provide inputs on test metrics to the Test Manager/Lead; support analysis of metric trends and implement improvement actions as necessary
· Handle changes and conduct regression testing
· Generate test summary reports
· Coordinate test team members and the development team to resolve issues
· Interact with client-side stakeholders to solve issues and update status
· Able to lead teams across geographies
· Actively take part in delivering analytics and advanced analytics testing trainings in the company

To qualify for the role, you must have
· BE/BTech/MCA/M.Sc
· Overall 7-10 years of experience in testing data warehousing / business intelligence solutions, with a minimum of 3 years of experience in testing BI & DWH technologies and analytics applications
· Experience in Big Data testing with the Hadoop/Spark framework and exposure to predictive analytics testing
· Very good understanding of business intelligence concepts, architecture and building blocks in ETL processing, data warehouses, dashboards and analytics
· Experience in cloud (AWS/Azure) infrastructure testing is desirable
· Working experience with Python data processing is desirable
· Extensive testing experience in more than one of these areas: data quality, ETL, OLAP, reports
· Good working experience with SQL Server or Oracle databases and proficiency in SQL scripting
· Experience in back-end testing of enterprise applications/systems built on different platforms, including Microsoft .NET and SharePoint technologies
· Experience in ETL testing using commercial ETL tools is desirable
· Knowledge/experience in SSRS (SQL Server Reporting Services), SSIS and Spotfire is desirable
· Experience/knowledge in data transformation projects, database design concepts and white-box testing is desirable
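The SQL-scripting and ETL-testing skills above often come together in automated reconciliation checks between a source staging table and its warehouse target. A minimal sketch in Python, using an in-memory SQLite database in place of a real SQL Server or Oracle warehouse (table and column names are hypothetical, not from any specific project):

```python
import sqlite3

def reconcile(conn, src_table, tgt_table, measure_col):
    """Compare row counts and a summed measure between a source
    staging table and its warehouse target (a common ETL check)."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM({measure_col}), 0) FROM {src_table}")
    src_count, src_sum = cur.fetchone()
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM({measure_col}), 0) FROM {tgt_table}")
    tgt_count, tgt_sum = cur.fetchone()
    return {
        "row_count_match": src_count == tgt_count,
        "sum_match": src_sum == tgt_sum,
        "src": (src_count, src_sum),
        "tgt": (tgt_count, tgt_sum),
    }

# In-memory demo: a target load that silently dropped one row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_sales (id INTEGER, amount REAL);
    CREATE TABLE dw_sales  (id INTEGER, amount REAL);
    INSERT INTO stg_sales VALUES (1, 100.0), (2, 250.0), (3, 75.0);
    INSERT INTO dw_sales  VALUES (1, 100.0), (2, 250.0);
""")
result = reconcile(conn, "stg_sales", "dw_sales", "amount")
print(result["row_count_match"], result["sum_match"])  # False False
```

In practice a suite of such checks (counts, sums, null ratios, key uniqueness) would run after each load and feed the defect-reporting flow described above.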
Ideally, you’ll also have:
· Experience with test management and defect management tools, preferably JIRA
· Ability to contribute as an individual contributor and, when required, lead a small team
· Ability to create test strategies and test plans for testing data migration, BI & DWH applications/solutions that are moderate to complex/high-risk systems
· Ability to design test cases and test data, and perform test execution and reporting
· Ability to perform test management for small projects as and when required
· Participation in defect triaging, tracking defects through to resolution
· Good communication skills (both written and verbal), with the ability to articulate concisely and clearly
· Good understanding of the SDLC, and of the test process in particular
· Good analytical, problem-solving and troubleshooting skills
· Good understanding of the project life cycle and test life cycle
· Exposure to CMMI and process improvement frameworks is a plus
· Readiness to work as both an individual contributor and a team leader

Posted 1 month ago

Apply

4.0 - 6.0 years

0 Lacs

Gurgaon, Haryana, India

Remote

Come build at the intersection of AI and fintech. At Ocrolus, we're on a mission to help lenders automate workflows with confidence, streamlining how financial institutions evaluate borrowers and enabling faster, more accurate lending decisions. Our AI-powered data and analytics platform is trusted at scale, processing nearly one million credit applications every month across small business, mortgage, and consumer lending. By integrating state-of-the-art open- and closed-source AI models with our human-in-the-loop verification engine, Ocrolus captures data from financial documents with over 99% accuracy. Thanks to our advanced fraud detection and comprehensive cash flow and income analytics, our customers achieve greater efficiency in risk management and provide expanded access to credit, ultimately creating a more inclusive financial system.

Trusted by more than 400 customers, including industry leaders like Better Mortgage, Brex, Enova, Nova Credit, PayPal, Plaid, SoFi, and Square, Ocrolus stands at the forefront of AI innovation in fintech. Join us, and help redefine how the world's most innovative lenders do business.

Summary: Ocrolus is a business that is built around data. Understanding our data and unlocking its value is central to our continued growth and success. We are looking for a Lead Data Analyst to work with our company data and help roll out our new streaming data and self-serve analytics platform. This is a huge opportunity to support our company mission, and we are excited to talk to candidates with a range of interesting backgrounds! We value technical and scientific rigor, as well as strategic engagement, and cultivating a bias toward action. Every member of our Business Analytics team will have interesting problems they are working on.

What you'll do:
· Design and implement data architecture solutions that optimize data processes and improve efficiency
· Create and maintain end-to-end data pipelines from streaming platforms like Kafka to ensure real-time data availability in Snowflake or another database/data warehouse
· Build and optimize OLTP and OLAP dashboards for analytical reporting, enabling data-driven decision-making with visualization tools such as Power BI or Tableau
· Write and optimize SQL queries to extract, manipulate, and analyze large datasets for business insights
· Apply extensive knowledge of Python for data integration, manipulation and automation
· Lead and mentor a team of data analysts, ensuring high-quality analysis and insights are delivered to stakeholders
· Collaborate with cross-functional teams to gather business requirements and translate them into technical specifications
· Optimize and streamline existing analytics processes to enhance reporting capabilities
· Manage relationships with key stakeholders, understanding their data needs and delivering timely, actionable insights
· Preferred: experience with AWS or Azure cloud architecture; certifications in Tableau Desktop or Snowflake

What you'll bring:
· Bachelor's or Master's degree in Data Science, Computer Science, Information Systems, or a related field
· 4 to 6 years of experience in data analytics, data architecture, and process optimization
· Expertise in SQL: writing complex queries, optimizing database performance, and managing large datasets
· Experience in building data pipelines from streaming platforms like Kafka and integrating them with Snowflake
· Proven experience in developing and optimizing OLTP and OLAP dashboards, and proficiency in a visualization tool like Power BI or Tableau
· High-level knowledge of Python for data integration, manipulation and automation
· Strong experience with data architecture, process optimization, and ETL pipeline management
· Demonstrated experience in team leadership and mentoring junior analysts
· Excellent stakeholder management skills, with the ability to communicate effectively across technical and non-technical teams

Extra credit:
· Experience working in an AWS cloud environment and with the Snowflake database
· Experience with any of the market-standard ETL tools, such as Informatica or ODI

Life at Ocrolus
We're a team of builders, thinkers, and problem solvers who care deeply about our mission, and each other. As a fast-growing, remote-first company, we offer an environment where you can grow your skills, take ownership of your work, and make a meaningful impact. Our culture is grounded in four core values:
· Empathy: understand and serve with compassion
· Curiosity: explore new ideas and question the status quo
· Humility: listen, be grounded, and remain open-minded
· Ownership: love what you do, work hard, and deliver excellence

We believe diverse perspectives drive better outcomes. That's why we're committed to fostering an inclusive workplace where everyone has a seat at the table, regardless of race, gender, gender identity, age, disability, national origin, or any other protected characteristic. We look forward to building the future of lending together.
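A pipeline like the Kafka-to-Snowflake one described above usually centers on a validate-and-flatten step sitting between the consumer's poll loop and the warehouse bulk load. A minimal, broker-free sketch of that step in Python (the event fields and dead-letter routing are illustrative assumptions, not Ocrolus's actual schema):

```python
import json

def transform_batch(raw_messages):
    """Validate and flatten a micro-batch of JSON events pulled from a
    stream (e.g. a Kafka topic) into rows ready to load into a
    warehouse table; malformed events go to a dead-letter list."""
    rows, dead_letters = [], []
    for msg in raw_messages:
        try:
            event = json.loads(msg)
            rows.append((event["application_id"],
                         event["status"],
                         float(event["amount"])))
        except (json.JSONDecodeError, KeyError, TypeError, ValueError):
            dead_letters.append(msg)
    return rows, dead_letters

batch = [
    '{"application_id": "A1", "status": "approved", "amount": "1200.50"}',
    'not-json',
    '{"application_id": "A2", "status": "pending", "amount": 300}',
]
rows, dlq = transform_batch(batch)
print(len(rows), len(dlq))  # 2 1
```

In a real deployment the loop would consume from a Kafka client library, and the accumulated rows would be staged and bulk-loaded into Snowflake (for example via a COPY INTO of staged files) rather than returned in memory.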

Posted 1 month ago

Apply

2.0 - 7.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near-real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management and associated technologies. Communicate risks and ensure they are understood.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
· Minimum of 2+ years of related experience
· Experience in modeling and business system design
· Good hands-on experience with DataStage and cloud-based ETL services
· Strong expertise in writing T-SQL code
· Well versed in data warehouse schemas and OLAP techniques

Preferred technical and professional experience
· Ability to manage and make decisions about competing priorities and resources, and to delegate where appropriate
· Must be a strong team player/leader
· Ability to lead a data transformation project with multiple junior data engineers
· Strong oral, written and interpersonal skills for interacting throughout all levels of the organization
· Ability to clearly communicate complex business problems and technical solutions
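The "data warehouse schemas and OLAP techniques" called for above typically mean star schemas queried with dimensional rollups: a fact table joined to dimension tables and aggregated by dimension attributes. A small illustrative sketch using SQLite from Python (the SQL is near-identical in T-SQL; the table names are hypothetical):

```python
import sqlite3

# A minimal star schema: one fact table with a foreign key into a
# date dimension; the query below is a typical OLAP rollup
# (sum a measure grouped by dimension attributes).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key), amount REAL);
    INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
    INSERT INTO fact_sales VALUES
        (20240101, 100.0), (20240101, 50.0), (20240201, 200.0);
""")
rollup = conn.execute("""
    SELECT d.year, d.month, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year, d.month
    ORDER BY d.year, d.month
""").fetchall()
print(rollup)  # [(2024, 1, 150.0), (2024, 2, 200.0)]
```

The same shape scales to more dimensions (product, region, customer); keeping the surrogate keys integer and indexed is what makes such rollups cheap at warehouse scale.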

Posted 1 month ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies