3.0 - 7.0 years
0 Lacs
delhi
On-site
The Tableau Developer is responsible for the design, development, and implementation of information delivery solutions. You should have a minimum of 3 to 5 years of experience building and optimizing Tableau dashboards, along with experience in data blending, dual-axis charts, window functions, and filters. Advanced working SQL knowledge is essential, including experience with relational databases, query authoring, and working familiarity with a variety of databases. Hands-on experience with Tabcmd scripting and/or TabMigrate to manage content across various Tableau sites is required, as is deployment knowledge for external servers outside Tableau Online. As a Tableau Developer, you will collaborate with business users and analyze user requirements. You will be responsible for creating Tableau-based BI solutions and the required supporting architecture (e.g., data marts). It is crucial to create functional and technical documentation for Business Intelligence solutions. Additionally, you will provide thought leadership, best practices, and standards required to deliver effective Tableau solutions.
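The window functions mentioned in this posting can be sketched with a minimal SQL example, shown here via Python's sqlite3 (the `sales` table and its columns are hypothetical, used only for illustration):

```python
import sqlite3

# Hypothetical sales data used only to illustrate a window function.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", 1, 100.0), ("North", 2, 150.0),
     ("South", 1, 80.0), ("South", 2, 120.0)],
)

# Running total per region: the kind of calculation a window function
# (or a Tableau table calculation) provides without collapsing rows.
rows = conn.execute("""
    SELECT region, month, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()

for row in rows:
    print(row)
```

The `PARTITION BY` clause keeps each region's running total independent, which is the same idea a dashboard filter or pane-level calculation relies on.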
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description: Business Title: Data Engineer. Years of Experience: min 3 and max up to 7. A Software Engineer is curious and self-driven to build and maintain multi-terabyte operational marketing databases and integrate them with cloud technologies. Our databases typically house millions of individuals and billions of transactions and interact with various web services and cloud-based platforms. Once hired, the qualified candidate will be immersed in the development and maintenance of multiple database solutions to meet global client business objectives. Must-Have Skills: Experience in cloud computing, particularly with one or more of the following platforms: AWS, Azure (preferred), or GCP. Proficiency in Snowflake and Oracle as databases, along with strong SQL skills. Familiarity with ETL tools, with a preference for Informatica. Experience with PySpark and either Python or UNIX shell scripting. Knowledge of workflow orchestration tools such as Tivoli, Tidal, or Stonebranch. A solid understanding of relational and non-relational databases, including when to use each type. Practical knowledge of data structures, databases, data warehousing, data marts, data modeling, and data ingestion and transformation processes. Experience in data warehousing. Strong debugging skills. Familiarity with version control systems (SVN) and project management tools (JIRA). Excellent communication skills. Good-to-Have Skills: Working experience with the Agile framework. Experience in client communication and exposure to client interaction. Cross-functional teamwork internally and with external clients. Working as an IC: understand the design/task and develop the code; perform mid- to complex-level tasks with minimal supervision and documentation; little supervision and guidance from senior resources required. Education Qualification: Bachelor's or Master's degree. Certification (if any): any basic-level certification in AWS / Azure / GCP.
Azure preferred; Snowflake Associate / Core. Shift timing: 12 PM to 9 PM and/or 2 PM to 11 PM, IST. Location: DGS India - Mumbai - Thane Ashar IT Park. Brand: Merkle. Time Type: Full time. Contract Type: Permanent.
Posted 2 days ago
5.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job description: 5+ years of proven work experience as a Data Modeler on data modelling related projects. Understand and translate business needs into data models supporting long-term solutions. Ability to understand data relationships and design data models that reflect these relationships and facilitate efficient ingestion, processing, and consumption of data. Be responsible for the development of conceptual, logical, and physical data models, and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL). Experience working with databases, including OLAP/OLTP-based data modeling. Perform reverse engineering of physical data models from databases and SQL scripts. Analyze data-related system integration challenges and propose appropriate solutions. Experience in market-leading cloud platforms such as Google Cloud Platform (GCP) and Amazon Web Services (AWS). Experience working with third normal form (3NF). Excellent problem-solving and communication skills; experience in interacting with technical and non-technical stakeholders at all levels. Bachelor's degree in computer science, information technology, or equivalent work experience.
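The third-normal-form modeling this posting asks for can be sketched with a minimal two-table schema (the `customers`/`orders` tables and all names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# A denormalized order table would repeat customer name and city on every
# order row. In 3NF, attributes that depend only on the customer move to
# their own table and are referenced by key.
conn.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        city        TEXT NOT NULL
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
        amount      REAL NOT NULL
    );
""")
conn.execute("INSERT INTO customers VALUES (1, 'Asha', 'Pune')")
conn.executemany("INSERT INTO orders VALUES (?, 1, ?)",
                 [(101, 250.0), (102, 99.5)])

# Joining recovers the flat view without storing the city twice.
rows = conn.execute("""
    SELECT o.order_id, c.name, c.city, o.amount
    FROM orders o JOIN customers c USING (customer_id)
    ORDER BY o.order_id
""").fetchall()
print(rows)
```

Updating the customer's city now touches one row instead of every order, which is the anomaly 3NF is designed to remove.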
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
The position is for an Officer / Assistant Manager based in Mumbai. The ideal candidate should hold a B.E. / MCA / B.Tech / M.Sc. (I.T.) qualification and be between 25 and 30 years of age. You should have a minimum of 2-3 years of ETL development experience with strong knowledge of ETL concepts, tools, and data structures. It is essential to be able to analyze and troubleshoot complex data sets and determine data storage needs. Familiarity with data warehousing concepts, in order to build a data warehouse for internal departments of the organization, is required. Your responsibilities will include creating and enhancing data solutions to enable seamless delivery of data, and collecting, parsing, managing, and analyzing large sets of data. You will lead the design of the logical data model, implement the physical database structure, and construct and implement operational data stores and data marts. Designing, developing, automating, and supporting complex applications to extract, transform, and load data will be part of your role. You must ensure data quality during ETL, develop logical and physical data flow models for ETL applications, and have advanced knowledge of SQL, Oracle, Sqoop, and NiFi commands and queries. Current CTC and expected CTC should be clearly mentioned. To apply, please email your resume to careers@cdslindia.com with the position applied for in the subject line.
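An extract-transform-load pass of the kind this role describes, with a quality check applied at ETL time, can be sketched in a few lines (the record layout and the quality rule are hypothetical):

```python
import sqlite3

# Extract: pretend these rows came from a source system export.
source = [
    {"id": 1, "name": " Asha ", "amount": "250.0"},
    {"id": 2, "name": "Ravi",   "amount": "99.5"},
    {"id": 3, "name": "",       "amount": "10.0"},   # fails the quality rule
]

# Transform: trim names, cast amounts, and enforce a simple
# data-quality rule (non-empty name) before loading.
clean = [
    (r["id"], r["name"].strip(), float(r["amount"]))
    for r in source
    if r["name"].strip()
]

# Load: write only the validated rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, name TEXT, amount REAL)")
conn.executemany("INSERT INTO target VALUES (?, ?, ?)", clean)

loaded = conn.execute("SELECT COUNT(*), SUM(amount) FROM target").fetchone()
print(loaded)
```

A production pipeline would log the rejected rows rather than silently dropping them, but the extract/transform/load separation is the same.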
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
Citrin Cooperman offers a dynamic work environment that fosters professional growth and collaboration. We are continuously seeking talented individuals who bring fresh perspectives, a problem-solving mindset, and sharp technical expertise. Our team of collaborative, innovative professionals is ready to support your professional development. At Citrin Cooperman, we offer competitive compensation and benefits, along with the flexibility to manage your personal and professional life to focus on what matters most to you! As a Financial System Data Integration Senior at Citrin Cooperman, you will play a vital role in supporting the design and development of integrations for clients within Workiva's cloud-based information management platform. Working closely with Citrin Cooperman's Integration Manager, you will be responsible for driving project execution, translating strategic target architecture and business needs into executable designs, and technical system solutions. Your contributions will shape the future of how our clients utilize Workiva's platform to achieve success. 
Key responsibilities of the role include:
- Analyzing requirements to identify optimal use of existing software functionalities for automation solutions
- Crafting scalable, flexible, and resilient architectures to address clients' business problems
- Supporting end-to-end projects to ensure alignment with the original design and objectives
- Creating data tables, queries (SQL), ETL logic, and API connections between client source systems and the software platform
- Developing technical documentation and identifying technical risks associated with application development
- Acting as a visionary in data integration and driving connected data solutions for clients
- Providing architectural guidance and recommendations to promote successful technology partner engagements
- Mentoring and training colleagues and clients
- Communicating extensively with clients to manage expectations and report on project status

Required Qualifications:
- Bachelor's degree in Computer Science, IT, Management IS, or similar with a minimum of 4 years of experience, OR at least 7 years of experience without a degree
- Proven ability to lead enterprise-level integration strategy discussions
- Expertise with API connectors in ERP solutions such as SAP, Oracle, NetSuite, etc.
- Intermediate proficiency with Python, SQL, JSON, and/or REST
- Professional experience with database design, ETL tools, multidimensional reporting software, data warehousing, dashboards, and Excel
- Experience in identifying obstacles, managing multiple work streams, and communicating effectively with technical and non-technical stakeholders

Preferred Qualifications:
- Experience with Workiva's platform
- Understanding of accounting activities
- Project management experience and leadership skills
- Participation in business development activities
- Experience in mentoring and training others

At Citrin Cooperman, we are committed to providing exceptional service to clients and acting as positive brand ambassadors.
Join us in driving innovation, shaping the future of data integration, and making a meaningful impact on our clients' success.
Posted 1 week ago
5.0 - 12.0 years
0 Lacs
maharashtra
On-site
Model Risk Management (MRM) is part of Citi's Global Risk Management and is responsible for independent oversight of models across the firm. Citi is seeking a Vice President to join the System Strategy and Oversight Team within the Model Risk Management Inventory & Initiative Management Group. The role requires experience in risk management; SDLC, Waterfall, Iterative, and Agile methodologies; and expertise in project management and governance, along with experience in process reengineering, business architecture, simplification, controls, and UAT. Experience in developing solutions that drive automation of Gen AI / modeling tools or in building reporting frameworks would be a big plus, as would familiarity with the FRB's Supervisory Guidance on MRM (SR 11-7 and SR 15-18). The MRM System Strategy & Oversight (SSO) Lead will be responsible for driving the reengineering of MRMS, the Citi Model Risk Management System, in line with the Model Risk Management Policy and Procedures and the overall model risk system strategy. They will translate policies, procedures, and guidelines into process maps and concrete tasks; identify dependencies, decision points, actors, and opportunities for streamlining; and build system solutions to support them. The role involves collaborating with various stakeholders both within and outside Risk Management to identify, streamline, simplify, and implement model life cycle processes in MRMS. The responsibilities also include authoring business requirements, re-engineering processes and system solutions to drive simplification and automation, liaising with IT partners to build effective system solutions, and partnering with validation and development groups to drive integration of metrics, documentation digitization, and Gen AI POCs with the MRMS target state. The ideal candidate should have 12+ years of working experience, with 5+ years in product development or an equivalent role, and should be familiar with the O&T development cycle as well as with model risk management or similar functions.
Experience in supporting cross-functional projects, working with project management and technology teams on system enhancements, is required. Additionally, knowledge of and experience with process design and database design, plus high proficiency in SQL, are essential. Institutional knowledge of and experience with Citi platforms/applications is preferred. Strong interpersonal skills, project management skills, and experience with Python, R, or other programming languages for implementing POCs are desired. Expert-level knowledge of MS Excel for data analytics (including VBA), MS PowerPoint for executive presentations, MS Word for business documentation, and MS Visio for process flows and swim lanes is also expected. A Bachelor's degree in finance, mathematics, computer science, or a related field is required, with a Master's degree preferred. Working at Citi means joining a family of more than 230,000 dedicated people from around the globe. It offers the opportunity to grow your career, give back to your community, and make a real impact. If you are looking to take the next step in your career, consider applying for this role at Citi today.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
You are a skilled FLEXCUBE Reports Developer with expertise in Qlik Sense, responsible for designing, developing, and maintaining reports and dashboards that provide valuable insights from FLEXCUBE core banking data. Your key responsibilities include designing interactive reports and dashboards, developing data models, customizing reports to meet business requirements, optimizing report performance, integrating data from various sources, ensuring data security, providing user training, and maintaining documentation. Mastery of the FLEXCUBE 14.7 backend tables and data model is essential for this role. You should have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 3 to 7 years of proven experience in developing reports and dashboards using Qlik Sense. Familiarity with FLEXCUBE core banking systems, OLAP cubes, data marts, data warehouses, data modelling, and data visualization concepts, as well as strong SQL skills, is required. Excellent problem-solving, analytical, communication, and collaboration skills are essential. Banking or financial industry experience and Qlik Sense certifications are beneficial. This role offers an opportunity to work with cutting-edge reporting and analytics tools in the banking sector, requiring close collaboration with business stakeholders and contributing to data-driven decision-making. Candidates with a strong background in FLEXCUBE reports development and Qlik Sense are encouraged to apply, as we are committed to providing a collaborative and growth-oriented work environment.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
You will be joining Papigen, a fast-growing global technology services company that focuses on delivering innovative digital solutions through deep industry experience and cutting-edge expertise. The company specializes in technology transformation, enterprise modernization, and dynamic areas such as Cloud, Big Data, Java, React, DevOps, and more. The client-centric approach of Papigen combines consulting, engineering, and data science to help businesses evolve and scale efficiently. Your role as a Senior Data QA Analyst will involve supporting data integration, transformation, and reporting validation for enterprise-scale systems. This position requires close collaboration with data engineers, business analysts, and stakeholders to ensure the quality, accuracy, and reliability of data workflows, particularly in Azure Databricks and ETL pipelines. Key responsibilities include collaborating with Business Analysts and Data Engineers to understand requirements and translating them into test scenarios and test cases. You will need to develop and execute comprehensive test plans and scripts for data validation, as well as log and manage defects using tools like Azure DevOps. Your role will also involve supporting UAT and post-go-live smoke testing. You will be responsible for understanding data architecture and workflows, including ETL processes and data movement. Writing and executing complex SQL queries to validate data accuracy, completeness, and consistency will be crucial. Additionally, ensuring the correctness of data transformations and mappings based on business logic is essential. As a Senior Data QA Analyst, you will validate the structure, metrics, and content of BI reports. Performing cross-checks of report outputs against source systems and ensuring that reports reflect accurate calculations and align with business requirements will be part of your responsibilities.
To be successful in this role, you should have a Bachelor's degree in IT, Computer Science, MIS, or a related field. You should also possess 8+ years of experience in QA, especially in data validation or data warehouse testing. Strong hands-on experience with SQL and data analysis is required, along with proven experience working with Azure Databricks, Python, and PySpark (preferred). Familiarity with data models like data marts, EDW, and operational data stores is also necessary. An excellent understanding of data transformation, mapping logic, and BI validation is crucial, as is experience with test case documentation, defect tracking, and Agile methodologies. Strong verbal and written communication skills are essential, along with the ability to work in a cross-functional environment. Working at Papigen will provide you with the opportunity to work with leading global clients, exposure to modern technology stacks and tools, a supportive and collaborative team environment, and continuous learning and career development opportunities.
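The source-vs-target SQL validation this role centers on can be sketched as a minimal reconciliation (the `source_txn`/`target_txn` tables and the checks are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_txn (id INTEGER, amount REAL);
    CREATE TABLE target_txn (id INTEGER, amount REAL);
    INSERT INTO source_txn VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO target_txn VALUES (1, 10.0), (2, 20.0);
""")

# Completeness check: row counts should match after the load.
src_count = conn.execute("SELECT COUNT(*) FROM source_txn").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM target_txn").fetchone()[0]

# Accuracy check: find source rows that never reached the target.
missing = conn.execute("""
    SELECT s.id FROM source_txn s
    LEFT JOIN target_txn t ON t.id = s.id
    WHERE t.id IS NULL
""").fetchall()

print(src_count, tgt_count, missing)
```

In practice the same anti-join pattern, plus aggregate comparisons (sums, distinct counts), is what a defect logged in Azure DevOps would cite as evidence.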
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Senior Developer BI at our company, your main objective is to determine business requirements and create business intelligence content using Power BI and data analysis to improve decision-making and performance across all business functions. You will be responsible for reporting to various levels of business management. Your key responsibilities will include designing and developing Power BI dashboards, reports, and alerts for stakeholders, as well as identifying and utilizing external sources of information to enhance our own data. You will work on implementing key recommendations, designing data models, and developing data visualizations to help stakeholders understand trends and insights. Additionally, you will be involved in testing and troubleshooting data models, providing production support, and continuously monitoring the BI solution to align with changing business requirements. To excel in this role, you must be fluent in English and have a strong background in BI design, data visualization best practices, and multiple BI tools such as Power BI and Tableau. Proficiency in SQL Server, including SQL objects development, performance tuning, and data analysis, is essential. You should also have experience with Semantic Layer and Data Marts, as well as exceptional SQL skills and the ability to translate complex quantitative analysis into meaningful business insights. As a team-oriented individual with strong communication, planning, and problem-solving skills, you will be expected to handle multiple projects within deadlines and work effectively under pressure. Your role will also involve interacting with internal and external stakeholders, so possessing cultural awareness, flexibility, and sound work ethics is crucial. 
The ideal candidate for this position will have a Bachelor's degree in Computer Science, at least 5 years of experience as a BI Developer and Data Analyst, and familiarity with reporting, visualization, and query languages such as SQL, T-SQL, and DAX. Additionally, any certifications related to Business Intelligence or Microsoft Power BI/Azure will be considered advantageous. This role may require site visits as necessary and the ability to work in Middle East summer weather conditions. You will report directly to the Business Intelligence Manager and will manage multiple tasks with limited resources, making effective time management and resource utilization essential for success.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
Join us as a Data Engineer at Barclays, where you will spearhead the evolution of our infrastructure and deployment pipelines, driving innovation and operational excellence. You will harness cutting-edge technology to build and manage robust, scalable, and secure infrastructure, ensuring seamless delivery of our digital solutions. To be successful as a Data Engineer, you should have hands-on experience in PySpark and strong knowledge of DataFrames, RDDs, and SparkSQL. You should also have hands-on experience in developing, testing, and maintaining applications on AWS Cloud. A strong command of the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena) is essential. Additionally, you should be able to design and implement scalable and efficient data transformation and storage solutions using Snowflake. Experience in data ingestion to Snowflake for different storage formats such as Parquet, Iceberg, JSON, and CSV is required, as is familiarity with using DBT (Data Build Tool) with Snowflake for ELT pipeline development. Advanced SQL and PL/SQL programming skills are a must. Experience in building reusable components using Snowflake and AWS tools and technology is highly valued. Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage, knowledge of orchestration tools such as Apache Airflow or Snowflake Tasks is beneficial, and familiarity with the Ab Initio ETL tool is a plus. Other highly valued skills include the ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components. A good understanding of infrastructure setup and the ability to provide solutions either individually or working with teams are essential, along with knowledge of data marts and data warehousing concepts and good analytical and interpersonal skills.
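The multi-format ingestion described here (Parquet, Iceberg, JSON, CSV into Snowflake) can be sketched with stdlib formats only: normalizing heterogeneous feeds into one row shape before the load (the feeds and field names are hypothetical; a real pipeline would use Snowflake's COPY INTO or Spark readers for Parquet/Iceberg):

```python
import csv
import io
import json

# Hypothetical ingestion: the same transactions arrive as CSV and JSON.
csv_feed = "id,amount\n1,10.5\n2,20.0\n"
json_feed = '[{"id": 3, "amount": 7.5}]'

# Normalize both feeds into one (id, amount) row shape; a loader can
# then treat all sources identically regardless of origin format.
rows = [(int(r["id"]), float(r["amount"]))
        for r in csv.DictReader(io.StringIO(csv_feed))]
rows += [(int(r["id"]), float(r["amount"])) for r in json.loads(json_feed)]

print(rows)
```

Keeping the format-specific parsing at the edge and a single canonical row shape in the middle is what makes such a pipeline reusable across storage formats.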
Implementing a cloud-based enterprise data warehouse across multiple data platforms, along with Snowflake and NoSQL environments, to build a data movement strategy is also important. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. The role is based out of Chennai.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations:
- Meet the needs of stakeholders/customers through specialist advice and support.
- Perform prescribed activities in a timely manner and to a high standard, impacting both the role itself and surrounding roles.
- Likely to have responsibility for specific processes within a team.
- Lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources.
- Demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard.
- Manage own workload, take responsibility for the implementation of systems and processes within own work area, and participate in projects broader than the direct team.
- Execute work requirements as identified in processes and procedures, collaborating with and impacting the work of closely related teams.
- Provide specialist advice and support pertaining to own work area.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver work and areas of responsibility in line with relevant rules, regulations, and codes of conduct.
- Maintain and continually build an understanding of how all teams in the area contribute to the objectives of the broader sub-function, delivering impact on the work of collaborating teams.
- Continually develop awareness of the underlying principles and concepts on which the work within the area of responsibility is based, building upon administrative/operational expertise.
- Make judgements based on practice and previous experience.
- Assess the validity and applicability of previous or similar experiences and evaluate options under circumstances that are not covered by procedures.
- Communicate sensitive or difficult information to customers in areas related specifically to customer advice or day-to-day administrative requirements.
- Build relationships with stakeholders/customers to identify and address their needs.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
The Company: Our beliefs are the foundation for how we conduct business every day. We live each day guided by our core values of Inclusion, Innovation, Collaboration, and Wellness. Together, our values ensure that we work as one global team with our customers at the center of everything we do, and they push us to take care of ourselves, each other, and our communities.

Job Description Summary: What you need to know about the role: A Business Systems Analyst passionate about delivering quality deliverables in a fast-paced environment with an undivided customer focus.

Meet our team: The Finance Technology team consists of a diverse group of talented, driven, hive-minded subject-matter experts who relentlessly work towards enabling best-in-class solutions for our customers to transform current-state solutions. You will work with this team to set up finance solutions, explore avenues to automate, challenge the status quo, and simplify the current state through transformation.

Job Description: Your way to impact. Your day to day:
- Build scalable systems by leading discussions with the business, understanding the requirements from both customer and business, and delivering requirements to the engineering team to guide them in building a robust, scalable solution.
- Have hands-on technical experience to provide support across multiple platforms (GCP, Python, Hadoop, SAP, Teradata, Machine Learning).
- Establish a consistent project management framework and develop processes to deliver high-quality software in rapid iterations for business partners in multiple geographies.
- Participate in a team that designs, develops, troubleshoots, and debugs software programs for databases, applications, tools, etc.
- Experience in balancing production platform stability, feature delivery, and the reduction of technical debt across a broad landscape of technologies.
What do you need to bring:
- You have consistently high standards, and your passion for quality is inherent in everything you do.
- Experience with GCP BigQuery, SQL, and Dataflow.
- 4+ years of relevant experience.
- Data warehouses, data marts, distributed data platforms, and data lakes.
- Data modeling and schema design.
- Reporting/visualization: Looker, Tableau, Power BI.
- Knowledge of statistical and machine learning models.
- Excellent structured-thinking skills, with the ability to break down multi-dimensional problems.
- Ability to navigate ambiguity and work in a fast-moving environment with multiple stakeholders.

To learn more about our culture and community, visit https://about.pypl.com/who-we-are/default.aspx. We know the confidence gap and imposter syndrome can get in the way of meeting spectacular candidates. Please don't hesitate to apply. REQ ID R0115599
Posted 3 weeks ago
6.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Overall 8+ years of full-time, hands-on implementation experience and subject-matter expertise in SAP BW/BI data warehousing, data marts, reporting data, and ETL subsystems for enterprise data warehousing, including at least 4 years of full-time experience on BW on HANA projects and 2 to 4 years of hands-on experience in SAP ABAP development.
Posted 2 months ago
6.0 - 11.0 years
15 - 25 Lacs
Chennai
Work from Office
Hi, wishes from GSN! Pleasure connecting with you! We have been in corporate search services, identifying and bringing in stellar, talented professionals for our reputed IT / non-IT clients in India, and have been successfully meeting the varied needs of our clients for the last 20 years. At present, GSN is hiring ETL - Oracle ODI developers for one of our leading MNC clients. Please find below the details: ******* Looking for SHORT JOINERS ******* WORK LOCATION: CHENNAI. Job Role: ETL ODI Developer. EXPERIENCE: 5+ yrs. CTC Range: 15 LPA to 25 LPA. Work Type: WFO. Mandatory Skills: Hands-on development experience in ETL using ODI 11g/12c is a MUST; proficiency in data migration techniques and data integration; Oracle SQL and PL/SQL programming experience; experience with data warehouses and/or data marts. ******* Looking for SHORT JOINERS ******* We appreciate your valuable references, if any. Thanks & Regards, Sathya K, GSN Consulting. Mob: 8939666794. Mail ID: sathya@gsnhr.net; Web: https://g.co/kgs/UAsF9W
Posted 2 months ago