
20 Datawarehouse Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

We're looking for candidates with strong technology and data understanding in the data modeling space, with proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your key responsibilities:
- Employ tools and techniques used to understand and analyze how data is collected, updated, stored, and exchanged.
- Define and employ data modeling and design standards, tools, best practices, and related development methodologies.
- Design, review, and maintain data models.
- Perform data analysis activities to capture data requirements and represent them in data model visualizations.
- Manage the life cycle of the data model from requirements through design, implementation, and maintenance.
- Work closely with data engineers to create optimal physical data models of datasets.
- Identify areas where data can be used to improve business activities.

Skills and attributes for success:
- Experience: 3-7 years, with at least 3 years of relevant data modeling experience.
- Experience with data modeling tools, including but not limited to Erwin Data Modeler, ER/Studio, and Toad.
- Strong knowledge of SQL.
- Basic ETL skills, to ensure implementations meet the documented specifications for ETL processes, including data translation/mapping and transformation.
- Good data warehouse knowledge.
- Visualization skills (optional).
- Knowledge of data quality (DQ) and data profiling techniques and tools.

To qualify for the role, you must:
- Be a computer science graduate or equivalent with 3-7 years of industry experience.
- Have working experience in an Agile-based delivery methodology (preferred).
- Have a flexible, proactive, self-motivated working style with strong personal ownership of problem resolution.
- Possess strong analytical skills and enjoy solving complex technical problems.
- Be proficient in software development best practices.
- Excel at debugging and optimization.
- Have experience implementing enterprise-grade solutions and converting business problems and challenges into technical solutions that account for security, performance, scalability, etc.
- Be an excellent communicator (written and verbal, formal and informal).
- Participate in all aspects of the solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Possess client management skills.

EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 days ago

Apply

12.0 - 16.0 years

0 Lacs

Karnataka

On-site

You have deep experience in developing data processing tasks using PySpark/Spark, such as reading data from external sources, merging data, performing data enrichment, and loading into target data destinations. Your responsibilities will include developing, programming, and maintaining applications using the Apache Spark and Python open-source framework. You will work with different aspects of the Spark ecosystem, including Spark SQL, DataFrames, Datasets, and streaming. As a Spark Developer, you must have strong programming skills in Python, Java, or Scala. It is essential that you are familiar with big data processing tools and techniques, have a good understanding of distributed systems, and possess proven experience as a Spark Developer or in a related role. Your problem-solving and analytical thinking skills should be excellent. Experience building APIs for provisioning data to downstream systems is required. Working experience with any cloud technology such as AWS, Azure, or Google Cloud is an added advantage, and hands-on experience with AWS S3 filesystem operations will be beneficial for this role.
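The read-merge-enrich-load flow this posting describes can be sketched in plain Python (a stdlib illustration of the pattern only; a real implementation would use PySpark's `spark.read`, `DataFrame.join`, `withColumn`, and `write` — the `orders`/`customers` data and the tier rule are invented for the example):

```python
# Extract: rows pulled from two hypothetical external sources.
orders = [
    {"order_id": 1, "customer_id": 10, "amount": 250.0},
    {"order_id": 2, "customer_id": 11, "amount": 90.0},
]
customers = [
    {"customer_id": 10, "region": "south"},
    {"customer_id": 11, "region": "west"},
]

# Merge: inner join on customer_id (DataFrame.join in PySpark).
by_id = {c["customer_id"]: c for c in customers}
merged = [
    {**o, **by_id[o["customer_id"]]}
    for o in orders if o["customer_id"] in by_id
]

# Enrich: derive a column from existing values (withColumn in PySpark).
for row in merged:
    row["tier"] = "high" if row["amount"] >= 100 else "low"

# Load: append into an in-memory "target destination" for the sketch.
target = []
target.extend(merged)
```

The same four stages map one-to-one onto a Spark job; only the data structures change from lists of dicts to distributed DataFrames.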

Posted 3 days ago

Apply

12.0 - 16.0 years

0 Lacs

Karnataka

On-site

You have deep experience in developing data processing tasks using PySpark/Spark, such as reading data from external sources, merging data, performing data enrichment, and loading into target data destinations. You will be responsible for developing, programming, and maintaining applications using the Apache Spark and Python open-source framework. Your role will involve working with different aspects of the Spark ecosystem, including Spark SQL, DataFrames, Datasets, and streaming. As a Spark Developer, you must have strong programming skills in Python, Java, or Scala. It is essential to be familiar with big data processing tools and techniques, and to have a good understanding of distributed systems. Your proven experience as a Spark Developer or in a related role will be highly valuable in this position, and strong problem-solving and analytical thinking skills are required to excel in it. Experience building APIs for provisioning data to downstream systems will be beneficial. Working experience with any cloud technology such as AWS, Azure, or Google Cloud is an added advantage, and hands-on experience with AWS S3 filesystem operations will also be beneficial for this position.

Posted 4 days ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Hybrid

Job title: Senior Software Engineer
Experience: 5-8 years
Primary skills: Python, Spark or PySpark, DWH ETL
Database: SparkSQL or PostgreSQL
Secondary skills: Databricks (Delta Lake, Delta tables, Unity Catalog)
Work model: Hybrid (twice weekly)
Cab facility: Yes
Work timings: 10 am to 7 pm
Interview process: 3 rounds (3rd round face-to-face, mandatory)
Work location: Karle Town Tech Park, Nagawara, Hebbal, Bengaluru 560045

About the Business Unit: The Architecture Team plays a pivotal role in the end-to-end design, governance, and strategic direction of product development within Epsilon People Cloud (EPC). As a centre of technical excellence, the team ensures that every product feature is engineered to meet the highest standards of scalability, security, performance, and maintainability. Its responsibilities span architectural ownership of critical product features, techno-product leadership, architectural governance, and ensuring systems are built with scalability, security, and compliance in mind. The team designs multi-cloud and hybrid-cloud solutions that support seamless integration across diverse environments and contributes significantly to interoperability between EPC products and the broader enterprise ecosystem. It fosters innovation and technical leadership while actively collaborating with key partners to align technology decisions with business goals. Through this, the Architecture Team ensures the delivery of future-ready, enterprise-grade, efficient, performant, secure, and resilient platforms that form the backbone of Epsilon People Cloud.

Why we are looking for you:
- You have experience working as a data engineer with strong database fundamentals and an ETL background.
- You have worked in a data warehouse environment dealing with data volumes of terabytes and above.
- You have worked with relational data systems, preferably PostgreSQL and SparkSQL.
- You have excellent design and coding skills and can mentor a junior engineer on the team.
- You have excellent written and verbal communication skills, and you are experienced and comfortable working with global clients.
- You work well with teams and can work with multiple collaborators, including clients, vendors, and delivery teams.
- You are proficient with bug tracking and test management toolsets that support development processes such as CI/CD.

What you will enjoy in this role:
- As part of the Epsilon Technology practice, the pace of the work matches the fast-evolving demands of the industry.
- You will work with the latest tools and technology and deal with data at petabyte scale.
- You will work on homegrown frameworks on Spark, Airflow, etc.
- Exposure to the digital marketing domain, where Epsilon is a market leader.
- You will understand and work closely with consumer data across different segments, which provides insights into consumer behaviour and patterns used to design digital ad strategies.
- As part of a dynamic team, you will have opportunities to innovate and put your recommendations forward, using existing standard methodologies and defining new ones as industry standards evolve.
- The opportunity to work with Business, System, and Delivery teams to build a solid foundation in the digital marketing domain.
- An open and transparent environment that values innovation and efficiency.

What will you do?
- Develop a deep understanding of the business context in which your team operates and present feature recommendations in an agile working environment.
- Lead, design, and code solutions, on and off database, to provide application access that enables data-driven decision making for the company's multi-faceted ad serving operations.
- Work closely with engineering resources across the globe to ensure enterprise data warehouse solutions and assets are actionable, accessible, and evolving in lockstep with the needs of the ever-changing business model. This role requires deep expertise in Spark and strong proficiency in ETL, SQL, and modern data engineering practices.
- Design, develop, and manage ETL/ELT pipelines in Databricks using PySpark/SparkSQL, integrating various data sources to support business operations.
- Lead in the areas of solution design, code development, quality assurance, data modelling, and business intelligence.
- Mentor junior engineers on the team.
- Stay abreast of developments in the data world in terms of governance, quality, and performance optimization.
- Run effective client meetings, understand deliverables, and drive successful outcomes.

Qualifications:
- Bachelor's degree in Computer Science or an equivalent degree is required.
- 5-8 years of data engineering experience, with expertise in Apache Spark and databases (preferably Databricks) in marketing technologies and data management, and technical understanding in these areas.
- Ability to monitor and tune Databricks workloads for high performance and scalability, adapting to business needs as required.
- Solid experience writing and tuning basic and advanced SQL.
- Experience with Python.
- Solid understanding of CI/CD practices, with experience using Git for version control and integration in Spark data projects.
- Good understanding of disaster recovery and business continuity solutions.
- Experience scheduling applications with complex interdependencies, preferably with Airflow.
- Good experience working with geographically and culturally diverse teams.
- Understanding of data management concepts in both traditional relational databases and big data lakehouse solutions such as Apache Hive, AWS Glue, or Databricks.
- Excellent written and verbal communication skills, and the ability to handle complex products.
- Good communication and problem-solving skills, with the ability to manage multiple priorities; able to diagnose and solve problems quickly; diligent, able to multi-task and prioritize, with good time management.
- Good to have: knowledge of cloud platforms (cloud security) and familiarity with Terraform or other infrastructure-as-code tools.

About Epsilon: Epsilon is a global data, technology, and services company that powers the marketing and advertising ecosystem. For decades, we have provided marketers from the world's leading brands the data, technology, and services they need to engage consumers with 1 View, 1 Vision and 1 Voice. 1 View of their universe of potential buyers. 1 Vision for engaging each individual. And 1 Voice to harmonize engagement across paid, owned, and earned channels. Epsilon's comprehensive portfolio of capabilities across our suite of digital media, messaging, and loyalty solutions bridges the divide between marketing and advertising technology. We process 400+ billion consumer actions each day using advanced AI and hold many patents of proprietary technology, including real-time modeling languages and consumer privacy advancements. Thanks to the work of every employee, Epsilon has been consistently recognized as industry-leading by Forrester, Adweek, and the MRC. Epsilon is a global company with more than 9,000 employees around the world.
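The "scheduling applications with complex interdependencies" requirement is essentially dependency-ordered task execution, the core of what Airflow does. The idea can be sketched with the standard library's `graphlib` (task names are invented for the example; a real Airflow DAG would declare these as operators):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: two extract tasks feed a join, which feeds a load.
# Mapping: task -> set of tasks it depends on (its predecessors).
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

# static_order() yields every task only after all of its dependencies,
# which is the ordering guarantee a scheduler like Airflow provides
# (at much larger scale, with retries, sensors, and parallelism).
order = list(TopologicalSorter(dag).static_order())
```

Running tasks in this order (or in parallel within each "ready" set) ensures no transform ever sees data its upstream extract has not produced yet.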

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You are an experienced ETL (SSIS) Developer who will be an integral part of our Data Management team. Your primary responsibility will be collaborating with various cross-functional teams to understand their data requirements and effectively transform raw data from SQL databases, Oracle, and AWS S3 into our data warehouse.

Key job functions and responsibilities:
- Develop and optimize ETL workflows to produce valuable data and reports.
- Integrate data from SQL databases, Oracle databases, flat files, and AWS S3.
- Troubleshoot and debug existing SSIS packages to enhance performance and reliability.
- Fine-tune SQL queries and views for improved efficiency.
- Work closely with both onshore and offshore data team members.
- Engage with business and sales teams to understand their data and automation needs.
- Create and maintain detailed documentation for all systems and jobs.

Qualifications:
- Bachelor's degree.
- A minimum of 5 years of ETL (SSIS) developer experience, with a proven track record of integrating various databases and flat files.
- Expert proficiency in SQL.
- Strong English communication skills to interact effectively with business stakeholders.
- Ideally, experience with the Agile project management methodology.
- Familiarity with connecting to AWS services is preferred.
- Knowledge of AI/ML and proficiency in Python or R for data analysis would be advantageous.

Your work schedule will include approximately 3 hours of overlap between onshore and offshore work hours, from 1:30 PM IST to 10:30 PM IST.
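The data translation/mapping step at the heart of such ETL workflows can be sketched as a small dict-driven source-to-target mapping (a plain-Python illustration only; in SSIS this would be Derived Column and Data Conversion components — the column names and transforms are invented for the example):

```python
# Hypothetical source-to-target mapping: rename each source column and
# apply a per-column transform on the way into the warehouse schema.
MAPPING = {
    # source column -> (target column, transform)
    "cust_nm": ("customer_name", str.strip),
    "ord_dt":  ("order_date", lambda v: v.replace("/", "-")),
    "amt":     ("amount", float),
}

def map_row(source_row):
    """Translate one source row into the target schema."""
    return {
        target: transform(source_row[src])
        for src, (target, transform) in MAPPING.items()
    }

row = map_row({"cust_nm": " Acme ", "ord_dt": "2024/01/31", "amt": "19.99"})
```

Keeping the mapping in one table-like structure mirrors the documented mapping specifications the posting mentions: the spec and the code stay in sync because the spec *is* the code's data.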

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You are a skilled FLEXCUBE Reports Developer with expertise in Qlik Sense, responsible for designing, developing, and maintaining reports and dashboards that provide valuable insights from FLEXCUBE core banking data. Your key responsibilities include designing interactive reports and dashboards, developing data models, customizing reports to meet business requirements, optimizing report performance, integrating data from various sources, ensuring data security, providing user training, and maintaining documentation. Mastery of the FLEXCUBE 14.7 backend tables and data model is essential for this role. You should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 3 to 7 years of proven experience developing reports and dashboards using Qlik Sense. Familiarity with FLEXCUBE core banking systems, OLAP cubes, data marts, data warehouses, data modelling, and data visualization concepts, as well as strong SQL skills, is required. Excellent problem-solving, analytical, communication, and collaboration skills are essential. Banking or financial industry experience and Qlik Sense certifications are beneficial. This role offers the opportunity to work with cutting-edge reporting and analytics tools in the banking sector, requiring close collaboration with business stakeholders and contributing to data-driven decision-making. Candidates with a strong background in FLEXCUBE report development and Qlik Sense are encouraged to apply, as we are committed to providing a collaborative and growth-oriented work environment.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 - 0 Lacs

Karnataka

On-site

We are urgently looking to hire a QA/Test Engineer with experience in BI and data warehousing. This opportunity is only open to candidates residing in Bengaluru. The ideal candidate would be available for an immediate start or able to join in July. A weekend virtual drive is scheduled for this Saturday, 19 July 2023, from 10 am to 5 pm, and candidates must be flexible with the timing. The second round of interviews will be conducted face to face at the Bengaluru office. The salary offered for this position ranges from 10 to 18 lakhs, depending on the candidate's experience level, which should fall within the range of 3 to 5.5 years. As a QA/Test Engineer, you will be responsible for conducting data validation and UI validation after generating BI reports. The role also involves writing SQL queries and leveraging your experience with Power BI, data warehouses, SQL, and ETL processes. To apply, please share your resume at abhinav@amorsystems.com, with a CC to abhinav.amor@gmail.com. Thank you, HR

Posted 2 weeks ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Hyderabad

Hybrid

Hiring Snowflake + DBT data engineers!
Experience: 5+ years
Work mode: Hybrid
Job location: Hyderabad

Role & responsibilities:
- Data pipeline development: design, build, and maintain efficient data pipelines using Snowflake and DBT.
- Data modeling: develop and optimize data models in Snowflake to support analytics and reporting needs.
- ETL processes: implement ETL processes to transform raw data into structured formats using DBT.
- Performance tuning: optimize Snowflake queries and DBT models for performance and scalability.
- Data integration: integrate Snowflake with various data sources and third-party tools.
- Collaboration: work closely with data analysts, data scientists, and other stakeholders to understand data requirements and deliver solutions.
- Data quality: implement data quality checks and testing to ensure the accuracy and reliability of data.
- Documentation: document data transformation processes and maintain comprehensive records of data models and pipelines.

Preferred candidate profile:
- Proficiency in SQL: strong SQL skills for writing and optimizing queries.
- Experience with Snowflake: hands-on experience, including data modeling, performance tuning, and integration.
- DBT expertise: proficiency in using DBT for data transformation and modeling.
- Data warehousing: knowledge of data warehousing concepts and experience with platforms like Snowflake.
- Analytical thinking: ability to analyze complex data sets and derive actionable insights.
- Communication: strong communication skills to collaborate with cross-functional teams.
- Problem-solving: excellent problem-solving skills and attention to detail.
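The data quality checks mentioned above are typically declarative in DBT (its built-in `not_null` and `unique` generic tests); the logic they run boils down to assertions like these, sketched here in plain Python over a hypothetical staging table (the rows and column names are invented for the example):

```python
def check_not_null(rows, column):
    """Return rows where the column is missing or None (dbt's not_null test)."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return values that appear more than once (dbt's unique test)."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

# Hypothetical staging rows with one null email and one duplicated id.
rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "c@x.com"},
]
null_failures = check_not_null(rows, "email")
dupe_ids = check_unique(rows, "id")
```

In DBT itself these two checks would be two lines of YAML on the model's column definitions, and the framework compiles them into SQL that fails the run when either returns rows.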

Posted 2 weeks ago

Apply

2.0 - 5.0 years

9 - 11 Lacs

Mumbai

Work from Office

Responsibilities

Direct responsibilities:
- Application design, development, testing, support, enhancements, and bug fixing.
- Interact with functional and technical representatives of project teams to understand business functionalities, technical modules, integration mechanisms, and data sources.
- Prepare test plans and conduct unit and regression testing.
- Create prototypes for proof of concept and business requirements validation.
- Ensure that project and organization standards are followed during the various phases of the software development lifecycle and day-to-day development work.
- Estimate efforts and schedules for the various modules, and meet deadlines.
- Technical and release documentation.
- Work with teams to help solve complex technical problems.
- Ensure that the application is of good quality and that any issues are fixed on priority.
- Work on initiatives to improve processes and delivery efficiency.
- Contribute towards innovation; suggest new technical practices for efficiency improvement.
- Conduct code reviews.

Contributing responsibilities:
- Contribute towards innovation; suggest new technical practices to be investigated.
- Contribute towards initiatives to improve processes and delivery.
- Contribute towards recruitment efforts, both for the team and for the organization.

Technical & behavioral competencies — mandatory technical skills:
- Strong knowledge of BI 4.2 & BI 4.3 Webi Report Designer.
- Strong knowledge of BI 4.2 & BI 4.3 IDT & Universe Designer.
- Strong knowledge of database SQL query writing and procedural programming (procedures, functions, triggers, etc.).
- Experience in technical analysis and database/data warehouse design.
- Ability and willingness to learn and work on diverse technologies (languages, frameworks, and tools).
- Thorough understanding of the complete software development lifecycle.
- Self-motivated, with good interpersonal skills and an inclination to constantly pick up new technologies and frameworks.
- Good communication and coordination skills.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

15 - 22 Lacs

Pune, Chennai, Bengaluru

Work from Office

Primary: Azure, Databricks, ADF, PySpark/Python
Secondary: Datawarehouse, SAS/Alteryx

Must have:
• 8+ years of IT experience in data warehousing and ETL
• Hands-on data experience with cloud technologies on Azure: ADF, Synapse, PySpark/Python
• Ability to understand design and source-to-target mapping (STTM) and create specification documents
• Flexibility to operate from client office locations
• Able to mentor and guide junior resources, as needed

Nice to have:
• Any relevant certifications
• Banking experience in Risk & Regulatory, Commercial, or Credit Cards/Retail

Posted 3 weeks ago

Apply

12.0 - 22.0 years

20 - 35 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Python, Spark, AWS, RAG, Azure, ETL, Snowflake, data warehousing, Databricks, Ab Initio, Tableau, SQL, and NoSQL. Good at problem solving, well versed in overall project architecture; hands-on coding experience is mandatory.

Posted 1 month ago

Apply

7.0 - 10.0 years

9 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Candidate requirements:
- 7+ years of experience in PL/SQL development.
- Proficiency in writing complex SQL queries and stored procedures.
- Experience with performance tuning and optimization techniques.
- Strong knowledge of data warehouse concepts.
- Good development experience with an ETL tool such as Informatica.
- Strong SQL query and PL/SQL skills.
- Knowledge of Unix shell scripting and Python is an added advantage.
- Experience working with scheduling tools such as Autosys.
- Experience working in an agile model.
- Strong analytical and problem-solving skills.
- Hands-on experience with IICS is an added advantage.

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Hyderabad, Pune, Gurugram

Work from Office

About GSPANN: GSPANN is a global IT services and consultancy provider headquartered in Milpitas, California (U.S.A.). With five global delivery centers across the globe, GSPANN provides digital solutions that support the customer buying journeys of B2B and B2C brands worldwide. With a strong focus on innovation and client satisfaction, GSPANN delivers cutting-edge solutions that drive business success and operational excellence. GSPANN helps retail, finance, manufacturing, and high-technology brands deliver competitive customer experiences and increased revenues through our solution delivery, technologies, practices, and operations for each client. For more information, visit www.gspann.com

Job position: MicroStrategy Developer
Experience: 5+ years
Location: Hyderabad, Gurgaon, Pune
Skills: MicroStrategy, SQL, Datawarehouse, Business Intelligence

JD and required skills & responsibilities:
- 5-8 years of experience in BI with MicroStrategy.
- Strong experience in data modelling and business intelligence tools.
- Strong experience with SQL, preferably BigQuery (GCP).
- Drive data investigations to deliver resolutions of technical, procedural, and operational issues.
- Experience implementing end-to-end business intelligence projects.
- Experience with MicroStrategy architecture & public objects.
- Able to work with admin tools such as Object Manager and Command Manager.
- Data modelling & DWH.
- Use Visual Insight & Dossiers to create customized, interactive dashboards to explore business data.
- Mobile dashboards and mobile apps.
- Professional experience in Agile/Scrum application development using JIRA.
- Perform technical analysis and development/implementation of applications with the necessary customization.
- Analyse and suggest improvements to system productivity, scaling, and monitoring.
- Responsible for periodic deployments through recommended tools and methodologies.
- Adhere to coding best practices; perform self and peer code reviews.
- Provide daily and weekly updates and corresponding status reports to management.
- Be a team player, collaborating with team members and across scrum teams.

Thanks & regards,
Heena Ruchwani
Senior Team Lead - Recruitment
Mobile: +91 8121 080 893
MS Teams: heena.ruchwani@gspann.com

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 22 Lacs

Hyderabad, Pune

Work from Office


Posted 1 month ago

Apply

8.0 - 13.0 years

8 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role: ETL Lead Developer (8+ years)
Location: Hyderabad

Mandatory skills: ETL and data warehouse concepts; AWS (Glue); SQL; Python; Snowflake; CI/CD tools (Jenkins, GitHub)

JD: In this role you will be part of a team working to develop solutions that enable the business to leverage data as an asset at the bank. As a Lead ETL Developer, you will lead teams to develop, maintain, and enhance code, ensuring all IT SDLC processes are documented and practiced, working closely with multiple technology teams across the enterprise. The Lead ETL Developer should have extensive knowledge of data warehousing cloud technologies. If you consider data a strategic asset, evangelize the value of good data and insights, and have a passion for learning and continuous improvement, this role is for you.

Key responsibilities:
- Translate requirements and data mapping documents into a technical design.
- Develop, enhance, and maintain code following best practices and standards.
- Create and execute unit test plans; support regression and system testing efforts.
- Debug and solve issues found during testing and/or production.
- Communicate status, issues, and blockers to the project team.
- Support continuous improvement by identifying and acting on opportunities.

Basic qualifications:
- Bachelor's degree or military experience in a related field (preferably computer science).
- At least 8 years of experience in ETL development within a data warehouse.
- Deep understanding of enterprise data warehousing best practices and standards.
- Strong software engineering experience designing, developing, and operating robust, highly scalable cloud infrastructure services.
- Strong experience with Python/PySpark, DataStage ETL, and SQL development.
- Proven experience in cloud infrastructure projects, with hands-on migration expertise on public clouds such as AWS and Azure, preferably Snowflake.
- Knowledge of cybersecurity organization practices, operations, risk management processes, principles, architectural requirements, engineering, and threats and vulnerabilities, including incident response methodologies.
- Understanding of authentication & authorization services and identity & access management.
- Strong communication and interpersonal skills.
- Strong organization skills and the ability to work independently as well as with a team.

Preferred qualifications:
- AWS Certified Solutions Architect Associate, AWS Certified DevOps Engineer Professional, and/or AWS Certified Solutions Architect Professional.
- Experience defining future-state roadmaps for data warehouse applications.
- Experience leading teams of developers within a project.
- Experience in the financial services (banking) industry.
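The "create and execute unit test plans" responsibility usually means exercising each ETL transformation in isolation, before regression or system testing. A minimal sketch with a hypothetical amount-normalizing transform (the function, its rules, and the rates are invented for illustration):

```python
def normalize_amount(raw, rate_to_usd):
    """Hypothetical transform: parse a string amount with thousands
    separators and convert to USD, rounding to 2 decimals - the kind of
    small, pure unit an ETL test plan targets."""
    amount = float(raw.replace(",", ""))
    return round(amount * rate_to_usd, 2)

# Unit test plan: one assertion per documented behaviour, runnable on
# every commit so regressions surface before a pipeline deploy.
assert normalize_amount("1,250.00", 1.0) == 1250.0   # separator stripped
assert normalize_amount("2,000", 0.012) == 24.0      # rate applied
assert normalize_amount("0", 2.5) == 0.0             # zero stays zero
```

Keeping transforms pure like this (no database or Spark session in the function signature) is what makes the unit tests cheap enough to wire into the CI/CD tools the posting lists.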

Posted 1 month ago

Apply

6.0 - 11.0 years

20 - 30 Lacs

Udaipur

Work from Office

Tactical responsibilities:
- Accountable for the timely completion and quality execution of assigned projects, whether as a lead or a substantial contributor. Performs all required tasks, including:
  - Envisioning and planning, monitoring project progress and team members' contributions, and conducting project meetings,
  - Research (gathering, validating, organizing, and analyzing information),
  - Identifying appropriate templates or approaches for deliverables and projects,
  - Contributing to analytics deliverables (reports, dashboards, data marts, data warehouses) and marketing or training documentation (such as technical specifications, use cases, user guides, technical presentations, and communication for decision makers),
  - Validating deliverables against requirements,
  - Conducting software testing,
  - Assisting and guiding other team members in executing deliverable tasks,
  - Communicating detailed deliverable status to stakeholders, and
  - Ensuring compliance with standards and best practices.
- Holds responsibility for one to three (four at a stretch) projects simultaneously.
- Contributes to pre-sales through high-level requirement understanding, approach design, estimation, and building proposals.

Technical skills:
- Experience successfully managing and contributing to a fair number of software development projects (preferably leading a team or part of one).
- In-depth knowledge of database design, SSIS, SSAS, and reporting tools.
- Fair knowledge of cloud services like SQL Azure, Dataverse, and SQL Data Warehouse.
- Capability to gather solution requirements, build detailed specifications (functional and/or technical), and work with customers/users.
- BE/BTech/MCA or equivalent formal education/training with above-average scores.
- Interest in and broad understanding of advanced topics like data modelling, analytics architecture, relevant business contexts (functional processes, sales, marketing, etc.), and technology trends (competing products and their strengths, new versions and their advantages, etc.).

Managerial:
- Capability to mentor other team members to build work ethic, process awareness, and sensitivity to the Advaiya value proposition.
- Able to train and guide other team members on technology, business applications, and process topics.
- Proven ability to effectively plan and monitor project execution through work breakdown, clear assignments, and active monitoring.
- Understanding and appreciation of management reporting (including concepts like team utilization, quality scores, and project variance).
- Ability to build deep working relationships and mutual trust with team members beyond one's own group and location, as well as with current and prospective clients.

Communication:
- Capability to proactively and clearly communicate about technical, project, and team matters through the appropriate medium.
- Good command of spoken and written (American) English.
- Able to create presentations and documents that explain complex technical or strategic concepts (such as whitepapers, compete guidance, and presentations for decision makers).
- Good presentation skills; can conduct technology workshops.

Posted 1 month ago

Apply

7.0 - 12.0 years

13 - 22 Lacs

Chennai, Bengaluru

Work from Office

Talend developer; Data Warehouse/BI; Data Warehouse implementation; Unit Testing; troubleshooting; ETL (Talend, DataStage); Data Catalog; cloud database (Snowflake); developing Data Marts; Data Warehousing; Operational Data Store; DWH concepts; Query Performance Tuning

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Role & responsibilities
- Coordinate with team members and Paris counterparts, and work independently
- Responsible and accountable for delivering Functional Specifications, wireframe docs, RCAs, Source-to-Target Mappings, Test Strategy documents, and any other BA artifacts demanded by project delivery
- Understand the business requirements and discuss them with business users
- Write mapping documents from user stories
- Follow project documentation standards
- Very good hands-on knowledge of SQL
- Analyze production data and derive KPIs for business users
- Well versed with Jira for project work

Preferred candidate profile
- 5+ years of experience in Java / data-based projects (data warehouse or data lake), preferably in the banking domain
- Able to perform gap analysis / root-cause analysis
- Hands-on business analysis skills, with experience writing functional specs
- Able to convert a business use case into a source-to-target mapping sheet and perform functional validation
- Able to work independently
- Able to debug production failures and provide root-cause solutions
- Knowledge of SQL / RDBMS concepts
- Good analytical/troubleshooting skills to cater to business requirements
- Understanding of the Agile process is an added advantage
- Effective team player with the ability to work autonomously and in a cross-cultural team environment
- Effective verbal and written communication to work closely with all stakeholders
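To illustrate the source-to-target mapping and functional validation work this role describes, here is a minimal sketch. All table names, columns, and the mapping rule are hypothetical examples (not from the posting), and SQLite stands in for the production database purely for illustration.

```python
import sqlite3

# Hypothetical source table and data (names are illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL, status TEXT)")
cur.executemany(
    "INSERT INTO src_orders VALUES (?, ?, ?)",
    [(1, 100.0, "new"), (2, 250.5, "shipped"), (3, 75.25, "new")],
)

# Hypothetical mapping rule from a source-to-target spec: the target keeps
# id and amount, and derives is_shipped = 1 when status = 'shipped', else 0.
cur.execute("""
    CREATE TABLE tgt_orders AS
    SELECT id, amount,
           CASE WHEN status = 'shipped' THEN 1 ELSE 0 END AS is_shipped
    FROM src_orders
""")

def validate(cur):
    """Functional validation: row counts match and no amount was altered."""
    src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    drift = cur.execute("""
        SELECT COUNT(*) FROM src_orders s
        JOIN tgt_orders t ON s.id = t.id
        WHERE s.amount <> t.amount
    """).fetchone()[0]
    return src_count == tgt_count and drift == 0

print(validate(cur))  # prints True when the mapping was applied correctly
```

In practice such checks (row counts, untouched-column comparisons, derived-column spot checks) are run per mapping sheet row, with failures fed back into gap / root-cause analysis.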

Posted 2 months ago

Apply

5.0 - 7.0 years

8 - 18 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Hybrid

Job position: Data Engineer + Power BI (must)
Location: Pune, Mumbai, Bangalore
Experience: 5-7 years
Working mode: Hybrid (3 days a week)
Notice period: Only immediate joiners can apply (candidates serving notice period: last working day no later than 15 June 2025)
PAN card number is mandatory while processing the profile.
Mandatory skills: ETL, Data Warehouse, Azure, Power BI, DAX, and Power Query (Power BI is a must)
Interested candidates, please share your CV at rutuja.s@bwbsol.com / 9850368787

Posted 2 months ago

Apply

3 - 8 years

6 - 16 Lacs

Bengaluru

Work from Office

Hi, greetings from Sun Technology Integrators!
This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com ASAP.
Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food)
Please let me know if any of your friends are looking for a job change; kindly share references.
Please note: Work From Office only (no hybrid or work from home).
Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, AWS S3, EC2
Preferred skills: Any ETL tool
Venue details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara, towards K. R. Puram), Bangalore 560043
Company URL: www.suntechnologies.com
Thanks and regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Featured Companies