8.0 - 13.0 years
7 - 13 Lacs
Bengaluru / Bangalore, Karnataka, India
Remote
We are seeking a highly skilled and experienced Data Architect with strong expertise in data modeling and Snowflake to design, develop, and optimize enterprise data architecture. The ideal candidate will play a critical role in shaping data strategy, building scalable models, and ensuring efficient data integration and governance.

Key Responsibilities:
- Design and implement end-to-end data architecture using Snowflake.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data architecture standards, best practices, and policies.
- Collaborate with data engineers, analysts, and business stakeholders to gather requirements and design data solutions.
- Optimize Snowflake performance, including data partitioning, caching, and query tuning.
- Create and manage data dictionaries, metadata, and lineage documentation.
- Ensure data quality, consistency, and security across all data platforms.
- Support data integration from various sources (cloud/on-premises) into Snowflake.

Required Skills and Experience:
- 8+ years of experience in data architecture, data modeling, or similar roles.
- Hands-on expertise with Snowflake, including Snowpipe, Streams, Tasks, and Secure Data Sharing.
- Strong experience with data modeling tools (e.g., Erwin, ER/Studio, dbt).
- Proficiency in SQL, ETL/ELT pipelines, and data warehousing concepts.
- Experience working with structured, semi-structured (JSON, XML), and unstructured data.
- Solid understanding of data governance, data cataloging, and security frameworks.
- Excellent analytical, communication, and stakeholder management skills.

Preferred Qualifications:
- Experience with cloud platforms like AWS, Azure, or GCP.
- Familiarity with data lakehouse architecture and real-time data processing.
- Snowflake certification(s) or relevant cloud certifications.
- Knowledge of Python or scripting for data automation is a plus.
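The posting above asks for experience with semi-structured data (JSON, XML). As a minimal, illustrative sketch only (not code from the posting; the record shape is hypothetical), flattening nested JSON into warehouse-friendly columns before loading might look like:

```python
import json

def flatten(record, prefix=""):
    """Recursively flatten a nested dict into flat, warehouse-friendly columns."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{name}_"))
        else:
            flat[name] = value
    return flat

# Hypothetical semi-structured source record.
raw = json.loads('{"id": 1, "customer": {"name": "Acme", "region": "EMEA"}}')
row = flatten(raw)
# row: {"id": 1, "customer_name": "Acme", "customer_region": "EMEA"}
```

In Snowflake itself this kind of flattening is usually done with the VARIANT type and LATERAL FLATTEN in SQL; the Python version here is just a language-neutral illustration of the idea.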
Posted 3 weeks ago
1.0 - 4.0 years
1 - 4 Lacs
Pune, Maharashtra, India
On-site
Duties & Responsibilities:
- Collaborate with cross-functional teams to understand business requirements and translate them into data integration solutions.
- Develop and maintain ETL/ELT pipelines using modern tools like Informatica IDMC to connect source systems to Snowflake.
- Ensure data accuracy, consistency, and security in all integration workflows.
- Monitor, troubleshoot, and optimize data integration processes to meet performance and scalability goals.
- Support ongoing integration projects, including Salesforce and SAP data pipelines, while adhering to best practices in data governance.
- Document integration designs, workflows, and operational processes for effective knowledge sharing.
- Assist in implementing and improving data quality controls at the start of processes to ensure reliable outcomes.
- Stay informed about the latest developments in integration technologies and contribute to team learning and improvement.

Qualifications:

Required Skills and Experience:
- 5+ years of hands-on experience in data integration, ETL/ELT development, or data engineering.
- Proficiency in SQL and experience working with relational databases such as Snowflake, PostgreSQL, or SQL Server.
- Familiarity with data integration tools such as Fivetran, Informatica Intelligent Data Management Cloud (IDMC), or similar platforms.
- Basic understanding of cloud platforms like AWS, Azure, or GCP.
- Experience working with structured and unstructured data in varying formats (e.g., JSON, XML, CSV).
- Strong problem-solving skills and the ability to troubleshoot data integration issues effectively.
- Excellent verbal and written communication skills, with the ability to document technical solutions clearly.

Preferred Skills and Experience:
- Exposure to integrating business systems such as Salesforce or SAP into data platforms.
- Knowledge of data warehousing concepts and hands-on experience with Snowflake.
- Familiarity with APIs, event-driven pipelines, and automation workflows.
- Understanding of data governance principles and data quality best practices.

Education: Bachelor's degree in Computer Science, Data Engineering, or a related field, or equivalent practical experience.

What We Offer:
- A collaborative and mission-driven work environment at the forefront of EdTech innovation.
- Opportunities for growth, learning, and professional development.
- Competitive salary and benefits package, including support for certifications like Snowflake SnowPro Core and Informatica Cloud certifications.
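The posting above stresses data quality controls at the start of a process. As a minimal sketch, with entirely hypothetical validation rules and record shapes, a quality gate that splits incoming records into loadable rows and a reject queue could look like:

```python
def validate(record):
    """Return a list of rule violations for one incoming record (rules are illustrative)."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    if record.get("amount") is not None and record["amount"] < 0:
        errors.append("negative amount")
    return errors

def quality_gate(records):
    """Split records into valid rows and rejects paired with their violation reasons."""
    valid, rejected = [], []
    for record in records:
        errors = validate(record)
        if errors:
            rejected.append((record, errors))
        else:
            valid.append(record)
    return valid, rejected

valid, rejected = quality_gate([
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": -5.0},
])
```

Rejecting bad records up front, with reasons attached, keeps downstream tables clean and makes failures auditable, which is the intent behind "quality controls at the start of processes."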
Posted 3 weeks ago
10.0 - 15.0 years
25 - 40 Lacs
Hyderabad / Secunderabad, Telangana, India
Remote
Responsibilities:
- Lead and manage an offshore team of data engineers, providing strategic guidance, mentorship, and support to ensure the successful delivery of projects and the development of team members.
- Collaborate closely with onshore stakeholders to understand project requirements, allocate resources efficiently, and ensure alignment with client expectations and project timelines.
- Drive the technical design, implementation, and optimization of data pipelines, ETL processes, and data warehouses, ensuring scalability, performance, and reliability.
- Define and enforce engineering best practices, coding standards, and data quality standards to maintain high-quality deliverables and mitigate project risks.
- Stay abreast of emerging technologies and industry trends in data engineering, and provide recommendations for tooling, process improvements, and skill development.
- Assume a data architect role as needed, leading the design and implementation of data architecture solutions, data modeling, and optimization strategies.
- Demonstrate proficiency in AWS services, including:
  - Cloud data services such as Amazon Redshift, Amazon EMR, and AWS Glue, to design and implement scalable data solutions.
  - Cloud infrastructure services such as AWS EC2 and AWS S3, to optimize data processing and storage.
  - Cloud security best practices, IAM roles, and encryption mechanisms to ensure data privacy and compliance.
  - Managing or implementing cloud data warehouse solutions, including data modeling, schema design, performance tuning, and optimization techniques.
- Demonstrate proficiency in modern data platforms such as Snowflake and Databricks, including:
  - A deep understanding of Snowflake's architecture, capabilities, and best practices for designing and implementing data warehouse solutions.
  - Hands-on experience with Databricks for data engineering, data processing, and machine learning tasks, leveraging Spark clusters for scalable data processing.
  - The ability to optimize Snowflake and Databricks configurations for performance, scalability, and cost-effectiveness.
- Manage the offshore team's performance, including resource allocation, performance evaluations, and professional development, to maximize team productivity and morale.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field; advanced degree preferred.
- 10+ years of experience in data engineering, with a proven track record of leadership and technical expertise in managing complex data projects.
- Proficiency in programming languages such as Python, Java, or Scala, and expertise in SQL and relational databases (e.g., PostgreSQL, MySQL).
- Strong understanding of distributed computing, cloud technologies (e.g., AWS), and big data frameworks (e.g., Hadoop, Spark).
- Experience with data architecture design, data modeling, and optimization techniques.
- Excellent communication, collaboration, and leadership skills, with the ability to effectively manage remote teams and engage with onshore stakeholders.
- Proven ability to adapt to evolving project requirements and effectively prioritize tasks in a fast-paced environment.
Posted 3 weeks ago
0.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Assistant Manager/Manager - Decision Analyst (SAS, R, Python)

This position is responsible for providing reporting & insights on the client's digital banking channel. The digital channel management team is responsible for achieving channel account, balance, and fee revenue growth targets across all business segments and product lines. Additionally, the team is responsible for idea and concept generation, business case development, valuation of new concepts, and business execution. You will need to become an expert in providing data support and analysis to help define strategy and tactics to meet portfolio goals. In this role, you will be partnering with leadership, product teams, and the broader digital team.

Responsibilities:
- Critical thinking skills to come up with the right questions to ask or problems that need to be solved.
- Ability to define the data necessary to build strategy, solve a problem, or make a recommendation.
- Ability to quickly learn about our data environment to source data and query large datasets across multiple databases.
- Overall digital channel performance & insights delivered to key stakeholders and senior leadership.
- Business case and initiative performance tracking delivered to key stakeholders and the broader digital channel management team.
- Ad-hoc analysis & insights needed to achieve channel performance goals.
- Analyze the results of the queries to create meaningful insights.
- Ability to effectively visualize and summarize work product for a variety of audiences.
- Strong presentation, collaboration, and communication skills.
- Ability to simply lay out clear options and recommendations for decision makers.

Qualifications we seek in you!

Minimum Qualifications / Skills:
- Bachelor's degree in business information systems, computer science, mathematical disciplines, statistics, finance, economics, or another technical degree.
- Relevant years of experience in financial services.
- Strong knowledge of and working experience with data manipulation tools such as Tableau and SQL. Experience in SAS, R, or Python to query large databases and manipulate large datasets would be an added advantage.
- Experience presenting analytical findings to a non-technical audience to guide business decision-making.

Preferred Qualifications / Skills:
- Experience in Banking or Finance is preferred.
- Proficiency in Tableau and SQL.
- SAS BASE language certification is a plus.
- Experience with cloud-based analytics services such as Snowflake and AWS.
- Strong attention to detail and an ability to prioritize work in a fast-paced environment.
- Ability to manage a queue of deliverables & requests with minimum supervision.
- Strong communication skills, written and verbal. Excellent interpersonal skills.
- Excellent skills with MS Word, Excel, and PowerPoint.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit . Follow us on Twitter, Facebook, LinkedIn, and YouTube.
Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 3 weeks ago
0.0 years
2 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
We are seeking a skilled and detail-oriented Data Warehouse Engineer to design, build, and maintain scalable data warehouse solutions. You will be responsible for developing efficient data pipelines, integrating diverse data sources, ensuring data accuracy, and enabling high-quality analytics to drive business decisions.

Key Responsibilities:
- Design, develop, and maintain data warehouse architectures and systems.
- Build robust ETL (Extract, Transform, Load) processes for structured and unstructured data sources.
- Optimize data models, database performance, and storage solutions.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Implement data quality checks and ensure data governance best practices.
- Develop and maintain documentation related to data warehouse design, data flow, and processes.
- Monitor system performance and proactively identify areas for improvement.
- Support ad-hoc data requests and reporting needs.
- Stay up to date with emerging data technologies and industry best practices.

Preferred Skills: Technology->ETL & Data Quality->ETL - Others, Technology->Database->Data Modeling, Technology->Data Management - DB->DB2, Technology->Data on Cloud-DataStore->Snowflake
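The ETL responsibilities above can be sketched end to end in a few lines. This is purely illustrative (the source rows, table name, and use of an in-memory sqlite3 database as a stand-in warehouse are all assumptions, not anything from the posting):

```python
import sqlite3

# Hypothetical source rows, e.g. as parsed from a CSV extract (all strings).
source = [
    {"order_id": "1", "amount": "19.99"},
    {"order_id": "2", "amount": "5.00"},
]

def transform(row):
    """Cast string fields from the extract into typed warehouse columns."""
    return (int(row["order_id"]), float(row["amount"]))

# Load: sqlite3 stands in for a real warehouse connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [transform(r) for r in source])

# A downstream analytic query over the loaded data.
total, = conn.execute("SELECT SUM(amount) FROM orders").fetchone()
```

Real pipelines add incremental loading, error handling, and orchestration on top, but the extract-transform-load shape stays the same.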
Posted 3 weeks ago
8.0 - 9.0 years
8 - 9 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Overview: We are looking for a skilled Snowflake Developer with 8+ years of experience in developing and managing data warehouse solutions using Snowflake. The ideal candidate should have expertise in stored procedures, SQL scripting, and DBT development using models, macros, and jobs. The candidate should also have a strong understanding of DWH concepts, along with experience in developing ETL solutions and implementing CI/CD pipelines using Bitbucket, Jenkins, DBT, and Snowflake. Additionally, the candidate should have experience in collaborating with stakeholders to gather requirements, develop logic, and deploy solutions.

In This Role, You Will:
- Manage and maintain the Snowflake platform, ensuring optimal performance and uptime.
- Design and implement Snowflake architecture, considering best practices for scalability, security, and compliance.
- Conduct performance optimization activities to ensure efficient use of resources and credits.
- Oversee governance and compliance practices, enabling the right audit logs and ensuring data security using RBAC, masking, etc.
- Perform POCs to evaluate new features and functionalities.
- Enable and configure new features on the Snowflake platform.
- Develop and implement integration design strategies using AWS services such as S3, Lambda, SQS, and Kinesis.
- Design and implement API-based integrations to ensure seamless data flow between systems.
- Collaborate with cross-functional teams to ensure the successful implementation of Snowflake projects.
- Utilize programming languages, particularly Python, to develop custom solutions and automation scripts.

Here's What You Need:
- Proven experience working with Snowflake and AWS cloud platforms.
- In-depth knowledge of Snowflake architecture, design, and best practices.
- Strong understanding of compliance and governance practices, with the ability to enable and manage audit logs.
- Expertise in performance optimization and credit usage management on the Snowflake platform.
- Experience with AWS services such as S3, Lambda, SQS, and Kinesis.
- Proficient in API-based integrations and data integration strategies.
- Strong programming skills, particularly in Python.
- Excellent collaboration and communication skills, with the ability to work effectively with cross-functional teams.

Experience: 8 - 9 years
Salary: Not Disclosed
Location: Gurugram
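The role above calls for Python automation scripts around the warehouse. As one small, library-free sketch (every name here is hypothetical, and `flaky_query` merely simulates a transient failure), a retry wrapper of the kind often placed around warehouse or API calls:

```python
import time

def with_retry(fn, attempts=3, delay=0.01):
    """Call fn, retrying on any exception, with a fixed delay between attempts."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == attempts:
                raise  # out of attempts: surface the error
            time.sleep(delay)

calls = {"n": 0}

def flaky_query():
    """Hypothetical stand-in for a warehouse call that fails once, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("transient network error")
    return "ok"

result = with_retry(flaky_query)
```

Production versions usually add exponential backoff and retry only on specific transient error types rather than bare `Exception`.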
Posted 3 weeks ago
11.0 - 12.0 years
11 - 12 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Overview: We are looking for a skilled Snowflake Developer with 8+ years of experience in developing and managing data warehouse solutions using Snowflake. The ideal candidate should have expertise in stored procedures, SQL scripting, and DBT development using models, macros, and jobs. The candidate should also have a strong understanding of DWH concepts, along with experience in developing ETL solutions and implementing CI/CD pipelines using Bitbucket, Jenkins, DBT, and Snowflake. Additionally, the candidate should have experience in collaborating with stakeholders to gather requirements, develop logic, and deploy solutions.

In This Role, You Will:
- Manage and maintain the Snowflake platform, ensuring optimal performance and uptime.
- Design and implement Snowflake architecture, considering best practices for scalability, security, and compliance.
- Conduct performance optimization activities to ensure efficient use of resources and credits.
- Oversee governance and compliance practices, enabling the right audit logs and ensuring data security using RBAC, masking, etc.
- Perform POCs to evaluate new features and functionalities.
- Enable and configure new features on the Snowflake platform.
- Develop and implement integration design strategies using AWS services such as S3, Lambda, SQS, and Kinesis.
- Design and implement API-based integrations to ensure seamless data flow between systems.
- Collaborate with cross-functional teams to ensure the successful implementation of Snowflake projects.
- Utilize programming languages, particularly Python, to develop custom solutions and automation scripts.

Here's What You Need:
- Proven experience working with Snowflake and AWS cloud platforms.
- In-depth knowledge of Snowflake architecture, design, and best practices.
- Strong understanding of compliance and governance practices, with the ability to enable and manage audit logs.
- Expertise in performance optimization and credit usage management on the Snowflake platform.
- Experience with AWS services such as S3, Lambda, SQS, and Kinesis.
- Proficient in API-based integrations and data integration strategies.
- Strong programming skills, particularly in Python.
- Excellent collaboration and communication skills, with the ability to work effectively with cross-functional teams.
Posted 3 weeks ago
10.0 - 14.0 years
10 - 14 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
The Deployment Manager is responsible for leading and managing deployment processes that involve the deployment of new releases, patches, data migration, integration, and transformation. This role requires a strong background in deployment management and a solid understanding of database administration, as well as excellent project management and communication skills. He will work with various teams, including IT, operations, and business units, to ensure that deployment activities are aligned with the project objectives and scope. He will also oversee the quality, performance, and security of the deployed solutions, and provide training and support to end-users and stakeholders when required.

Job Responsibilities:
- Lead and manage deployment projects from initiation to closure, ensuring timely and successful implementation of solutions.
- Develop detailed deployment plans, including timelines, milestones, resource allocation, and risk mitigation strategies.
- Coordinate with cross-functional teams, including IT, operations, and business units, to ensure seamless deployment activities and stakeholder satisfaction.
- Monitor and maintain the quality, performance, and security of the deployed solutions, including regular backups, updates, and patching.
- Identify and resolve any issues or challenges that arise during deployment, such as database performance, data quality, or integration errors.
- Maintain clear and effective communication with all stakeholders, providing regular updates on project status, milestones, and any issues that arise.
- Ensure that deployment activities meet quality standards and comply with organizational policies and procedures.
- Prepare and maintain comprehensive deployment documentation, including project plans, status reports, data dictionaries, and post-deployment reviews.
- Provide training and support to end-users and stakeholders to ensure successful adoption of deployed solutions.
- Identify opportunities for process improvements and implement best practices to enhance the efficiency and effectiveness of deployment activities.

Education:
- BE/B.Tech
- Master of Computer Application

Work Experience:
- Cloud Platforms: Azure, AWS, Oracle Cloud
- Proficiency in SQL and experience with relational database management systems (e.g., MySQL, Postgres, Redshift, Snowflake, SQL Server)
- Familiarity with the Agile/Scrum framework
- Strong understanding of CI/CD (Continuous Integration/Continuous Deployment) principles and practices, including experience with CI/CD tools such as Jenkins, GitLab CI, etc.
- Bachelor's degree in Computer Science, Information Technology, Business Administration, or a related field
- Minimum 10 years of experience in project management and IT deployment, or a related role
Posted 3 weeks ago
5.0 - 10.0 years
5 - 10 Lacs
Noida, Uttar Pradesh, India
On-site
This position is part of the technical leadership in the data warehousing and Business Intelligence areas, for someone who can work across multiple project streams and clients to support better business decision making, especially in the Life Sciences/Pharmaceutical domain.

Job Responsibilities:
- Technology Leadership: Lead and guide the team, independently or with little support, to design, implement, and deliver complex cloud data management and BI project assignments.
- Technical Portfolio: Expertise in a range of BI and data hosting technologies like the AWS stack (Redshift, EC2), Snowflake, Spark, Full Stack, Qlik, Tableau, MicroStrategy.
- Project Management: Get accurate briefs from the client and translate them into tasks for team members, with priorities and timeline plans. Must maintain high standards of quality and thoroughness. Should be able to monitor the accuracy and quality of others' work. Ability to think in advance about potential risks and mitigation plans.
- Logical Thinking: Able to think analytically, using a systematic and logical approach to analyze data, problems, and situations. Must be able to guide team members in analysis.
- Handle Client Relationship, P&L: Manage client relationships and client expectations independently. Should be able to deliver results back to the client independently. Should have excellent communication skills.

Education:
- BE/B.Tech
- Master of Computer Application

Work Experience:
- Minimum of 5 years of relevant experience in the Pharma domain.
- Technical: Should have 15 years of hands-on experience in the following tools. Must have working knowledge of at least 2 of the following tools: QlikView, Qlik Sense, Tableau, MicroStrategy, Spotfire. Aware of techniques such as UI design, report modeling, performance tuning, and regression testing. Basic expertise with MS Excel. Advanced expertise with SQL.
- Functional: Should have experience in the following concepts and technologies: Pharma data sources like IMS, Veeva, Symphony, Cegedim, etc.; business processes like alignment, market definition, segmentation, sales crediting, and activity metrics calculation.

Relevant Experience:
- 0-2 years of relevant experience in a large/midsize IT services/Consulting/Analytics company.
- 1-3 years of relevant experience in a large/midsize IT services/Consulting/Analytics company.
- 3-5 years of relevant experience in a large/midsize IT services/Consulting/Analytics company.
Posted 3 weeks ago
4.0 - 7.0 years
4 - 7 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
The Cvent Analytics team is looking to hire a Senior Analyst and is currently accepting applications. The selected candidate will typically focus on providing customer support through data-driven analysis, partnering with the US-based Sales team to drive impactful client conversations using Cvent's powerful sourcing and product data.

In This Role, You Will:
- Collaborate with the Sales and Product teams to diligently monitor and evaluate essential product metrics and performance indicators.
- Partner with the Product team to conduct a thorough analysis of usage patterns and identify trends in product adoption.
- Effectively communicate the data narrative through expert analysis, interpretation, and data visualization, conveying significant insights using PowerPoint or other data visualization tools.
- Analyze and interpret data into comprehensive charts and high-quality graphics, ensuring the effective communication and presentation of analytical insights to internal stakeholders.
- Collaborate with fellow analysts within the team to define and refine customer segmentation, which will serve as a foundation for the support matrix.
- Partner with Product leadership to furnish data-driven recommendations for product enhancements.
- Develop and design scalable market insights and customer insights content suitable for internal office hours, webinars, and industry publications.
- Employ a research-led approach to identify both internal and external factors that influence customer performance.
- Assume full management responsibilities and deliver periodic outputs (repeatable, scalable short analyses for stakeholders), thereby ensuring project success and quality.

Here's What You Need:
- 4-7 years of experience in a product or strategy role in the analytics domain.
- Bachelor's degree (in technology, statistics, sciences, or mathematics) and/or Engineering with a good academic record.
- Strong verbal and written communication skills, with attention to precision of language and the ability to organize information logically.
- Experience working with SQL or Snowflake, Advanced Excel, and any BI visualization tool (mandatory).
- Hands-on experience building PowerPoint decks, with strong storyboarding skills.
- Good presentation skills to deliver insights to a larger audience.
- Excellent project and time management skills; consultative experience and exposure; proven competence in meeting deadlines, multi-tasking under pressure, and managing work under ambiguity.
- Self-driven and able to work with geographically spread teams.
Posted 3 weeks ago
3.0 - 4.0 years
3 - 4 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
At Cvent, we value the diverse perspectives that each individual brings. Whether working with a team of colleagues or with clients, we ensure that we foster a culture that celebrates differences and builds on shared connections.

In This Role, You Will:
- Technology Implementation and Project Management: Collecting and documenting requirements; planning; risk management; sandbox implementation and rigorous UAT testing; sign-off and production deployment.
- Solution Architecting: Convert business requirements into technology designs/solutions. Provide RCAs, what-if simulations, and impact analysis on a need basis.
- Implementation of large system integrations, mergers and acquisitions in SF, and new business systems in multiple locations.
- Ensure that the whole team meets statutory, regulatory, and compliance requirements.

Additional Responsibilities:
- Provide regular status reporting; identify, track, and mitigate key risks, issues, and dependencies, including escalations and liaising with the required stakeholders.
- Ensure appropriate program communications are in place to address and engage all stakeholder groups.
- Ensure that a culture of improvement is in place to identify key complaint drivers and champion remedial work, collaborating with other departments to strengthen the processes, systems, and controls that will mitigate issues.
- Implement and maintain performance improvement projects as agreed by the goals of the department.
- Front-end business continuity planning for the local site.
- Support the local implementation of the global strategy for Service Readiness, delivering suitable go-to-market and commercial models.

Here's What You Need:
- Bachelor's degree in an engineering or business discipline, or equivalent experience.
- Strong know-how of the overarching SFDC solution, including SF metadata, Salesforce Flow, Lightning Components, SF Service Reporting, dashboards, and data migration.
- Salesforce Admin certification; preferred: SF BA, PD1, FinancialForce PSA, CPQ, Gainsight, Snowflake, Sigma.
- Strong skills in complex process analysis, project management, problem solving, and business process design, with a focus on process efficiency.
- Experienced and well-versed in change management methodologies, the System Development Life Cycle, system administration, call-center-specific technologies, messaging systems, and emerging technologies.
- Strong experience with and knowledge of agile project support, constantly evolving the intake process, and willingness to quickly understand and embrace new processes and technologies.
- Track record of working cross-functionally to deliver large-scale, multi-million-dollar change and continuous improvement initiatives with a focus on user/customer experience.
- Exceptional stakeholder management and communication skills, with the ability to form relationships in person and in writing with ease.
- Excellent interpersonal and analytical skills.
- Ability to influence decision-making processes and negotiate win-win situations.
- At ease with introducing processes and driving change in an unstructured environment.
- Open to global work timings.
Posted 3 weeks ago
8.0 - 10.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, India
Remote
JOB DESCRIPTION

Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We're committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.

Pay and Benefits:
- Competitive compensation, including base pay and annual incentive
- Comprehensive health and life insurance and well-being benefits, based on location
- Pension/Retirement benefits
- Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being
- DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee)

The Impact You Will Have in This Role: The Enterprise Intelligence Lead will be responsible for building data pipelines using their deep knowledge of Talend, SQL, and data analysis on the bespoke Snowflake data warehouse for Enterprise Intelligence. This role will be in the Claw Team within Enterprise Data & Corporate Technology (EDCT). The Enterprise Intelligence team maintains the firm's business intelligence tools and data warehouse.

Your Primary Responsibilities:
- Working on and leading engineering- and development-focused projects from start to finish with minimal supervision
- Providing technical and operational support for our customer base as well as other technical areas within the company that utilize Claw
- Risk management functions such as reconciliation of vulnerabilities and security baselines, as well as other risk- and audit-related objectives
- Administrative functions for our tools, such as keeping the tool documentation current and handling service requests
- Participating in user training to increase awareness of Claw
- Ensuring incident, problem, and change tickets are addressed in a timely fashion, as well as escalating technical and managerial issues
- Following DTCC's ITIL process for incident, change, and problem resolution

Qualifications:
- Minimum of 8 years of related experience
- Bachelor's degree preferred, or equivalent experience

Talents Needed for Success:
- Must have experience in Snowflake or SQL
- Minimum of 5 years of related data warehousing work experience
- 5+ years managing data warehouses in a production environment, covering all phases of lifecycle management: planning, design, deployment, upkeep, and retirement
- Strong understanding of star/snowflake schemas and data integration methods and tools
- Moderate to advanced competency with Windows and Unix-like operating system principles
- Developed competencies around essential project management, communication (oral and written), and personal effectiveness
- Working experience with MS Office tools such as Outlook, Excel, PowerPoint, Visio, and Project
- Optimize/tune source streams, queries, and Powerbase dashboards
- Good knowledge of the technical components of Claw (i.e., Snowflake, Talend, PowerBI, PowerShell, Autosys)

ABOUT THE TEAM: IT Architecture and Enterprise Services are responsible for enabling the digital transformation of DTCC.
The group manages complexity of the technology landscape within DTCC and enhances agility, robustness and security of the technology footprint. It does so by serving as the focal point for all technology architectural activities in the organization as well as engineering a portfolio of foundational technology assets to enable our digital transformation.
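The DTCC role above asks for a strong understanding of star/snowflake schemas. A minimal, illustrative sketch of the idea, using an in-memory sqlite3 database as a stand-in for a warehouse (table names and data are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a real warehouse connection

# Dimension table: descriptive attributes about each product.
conn.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "books"), (2, "games")])

# Fact table: numeric measures keyed to the dimension.
conn.execute("CREATE TABLE fact_sales (product_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [(1, 10.0), (1, 5.0), (2, 7.5)])

# A typical star-schema query: aggregate facts grouped by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables; the fact-to-dimension join pattern stays the same.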
Posted 3 weeks ago