
23,992 ETL Jobs - Page 3

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Syniverse is the world's most connected company. Whether we're developing the technology that enables intelligent cars to safely react to traffic changes or freeing travelers to explore by keeping their devices online wherever they go, we believe in leading the world forward. That is why we work with some of the world's most recognized brands: eight of the top 10 banks, four of the top 5 global technology companies, and over 900 communications providers. It is also how we're able to provide our incredible talent with an innovative culture and great benefits.

Who We're Looking For
The Data Engineer I is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems or building new solutions from the ground up. This role will work with developers, architects, product managers, and data analysts on data initiatives and ensure optimal data delivery with good performance and uptime metrics. Your behaviors align strongly with our values.

Some of What You'll Do
Scope of the Role:
- Direct Reports: This is an individual contributor role with no direct reports.

Key Responsibilities:
- Create, enhance, and maintain optimal data pipeline architecture and implementations.
- Analyze data sets to meet functional and non-functional business requirements.
- Identify, design, and implement data process improvements: automating manual processes, optimizing data delivery, etc.
- Build infrastructure and tools to increase data ETL velocity.
- Work with data and analytics experts to implement and enhance analytic product features.
- Provide life-cycle support to the Operations team for existing products, services, and functionality assigned to the Data Engineering team.

Experience, Education, and Certifications:
- Bachelor's degree in Computer Science, Statistics, Informatics, or a related field, or equivalent work experience.
- Software development experience desired.
- Experience in data engineering is desired.
- Experience in building and optimizing big data pipelines, architectures, and data sets.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL databases, such as PostgreSQL, MySQL, etc.
- Experience with stream-processing systems: Flink, KSQL, Spark Streaming, etc.
- Experience with programming languages, such as Java, Scala, Python, etc.
- Experience with cloud data engineering and development, such as AWS, etc.

Additional Requirements:
- Familiar with Agile software design processes and methodologies.
- Good analytic skills related to working with structured and unstructured datasets.
- Knowledge of message queuing, stream processing, and scalable big data stores.
- Ownership/accountability for tasks and projects with on-time, quality deliveries.
- Good verbal and written communication skills.
- Teamwork with independent design and development habits.
- Work with a sense of urgency and a positive attitude.

Why You Should Join Us
Join us as we write a new chapter, guided by world-class leadership. Come be a part of an exciting and growing organization where we offer competitive total compensation, flexible/remote work, and a leadership team committed to fostering an inclusive, collaborative, and transparent organizational culture. At Syniverse, connectedness is at the core of our business. We believe diversity, equity, and inclusion among our employees is crucial to our success as a global company as we seek to recruit, develop, and retain the most talented people who want to help us connect the world.

Know someone at Syniverse? Be sure to have them submit you as a referral prior to applying for this position.
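The posting above lists Kafka and Spark Streaming among the desired tools. As a rough illustration only (not part of the job ad), the sketch below shows what a minimal PySpark Structured Streaming job consuming a Kafka topic can look like; the broker address, topic name, and field names are hypothetical.

```python
# Minimal PySpark Structured Streaming sketch: read events from Kafka and
# write a running count to the console. Requires the spark-sql-kafka package
# on the classpath; broker address and topic name are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count

spark = (
    SparkSession.builder
    .appName("etl-streaming-sketch")
    .getOrCreate()
)

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "device-events")                 # placeholder topic
    .load()
    .selectExpr("CAST(key AS STRING) AS device_id", "CAST(value AS STRING) AS payload")
)

# Count messages per device as a simple stand-in for a real transformation.
counts = events.groupBy(col("device_id")).agg(count("*").alias("events_seen"))

query = (
    counts.writeStream
    .outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```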

Posted 17 hours ago

Apply

2.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Proxsoft Technologies LLC is a US-registered tech consultancy delivering cutting-edge solutions in Power BI, Power Platform, AI automation, and custom ERP reporting. We specialize in serving construction, infrastructure, and enterprise clients with deep expertise in systems like Viewpoint Vista, Spectrum, Procore, Acumatica, and Microsoft Dynamics.

Role Overview:
We are looking for a talented Power BI Developer to join our fast-growing team. You'll work closely with data engineers, ERP analysts, and business users to build interactive, insightful, and scalable dashboards that drive decisions for Fortune 500 clients and fast-scaling businesses.

Key Responsibilities:
- Design, develop, and deploy Power BI dashboards, paginated reports, and embedded analytics using best practices.
- Connect, model, and transform data from SQL Server, Excel, SharePoint, and cloud data sources.
- Collaborate with clients to gather business requirements and translate them into visualizations.
- Build optimized DAX measures, KPIs, bookmarks, drill-throughs, and dynamic visuals.
- Work on data modeling, relationship architecture, and performance tuning.
- Integrate Power BI with Power Automate workflows and Power Apps where needed.
- Document technical requirements, data dictionaries, and end-user guides.

Required Skills:
- Strong in data modeling (star schema, snowflake), ETL, and relational data concepts.
- 2+ years of hands-on experience with Power BI Desktop, Power BI Service, and DAX.
- Proficiency in T-SQL, views, stored procedures, and performance optimization.
- Experience working with ERP datasets (Viewpoint, Acumatica, Procore, etc.) is a huge plus.
- Understanding of row-level security (RLS) and workspace governance.
- Exposure to Power Automate, Power Apps, or SSRS/Paginated Reports is a bonus.

Nice to Have:
- Familiarity with Azure Synapse, Dataflows, Power Query (M).
- Knowledge of embedding Power BI in web apps or portals.
- Microsoft certification in DA-100 / PL-300.
- Experience with construction/engineering clients or financial dashboards.

What We Offer:
- Exposure to real enterprise-grade datasets and ERP integrations.
- Flexible work hours (client projects follow US time zones).
- Opportunity to work on cutting-edge projects using Power Platform + AI.
- Rapid career growth with direct mentorship from senior architects and the CTO.
- Paid tools, learning access, and certifications.
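The required skills above mention star-schema data modeling and ETL. Purely as an illustration (not taken from the posting), the pandas sketch below splits a flat, hypothetical ERP extract into one fact table and two dimension tables, which is the basic shape of a star schema; all column names and data are invented.

```python
# Illustrative star-schema split of a flat extract using pandas.
# All column names and data are hypothetical.
import pandas as pd

flat = pd.DataFrame({
    "invoice_id":   [1001, 1002, 1003],
    "project_name": ["Bridge A", "Bridge A", "Tower B"],
    "vendor_name":  ["Acme Steel", "Beta Concrete", "Acme Steel"],
    "amount":       [25000.0, 13500.0, 48000.0],
})

# Dimension tables: one row per distinct business entity, with surrogate keys.
dim_project = flat[["project_name"]].drop_duplicates().reset_index(drop=True)
dim_project["project_key"] = dim_project.index + 1

dim_vendor = flat[["vendor_name"]].drop_duplicates().reset_index(drop=True)
dim_vendor["vendor_key"] = dim_vendor.index + 1

# Fact table: measures plus foreign keys into the dimensions.
fact_invoice = (
    flat.merge(dim_project, on="project_name")
        .merge(dim_vendor, on="vendor_name")
        [["invoice_id", "project_key", "vendor_key", "amount"]]
)

print(fact_invoice)
```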

Posted 18 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: Microsoft Azure Data Services
Good-to-Have Skills: NA
Minimum Experience Required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Microsoft Azure Data Services.
- Strong understanding of cloud computing principles and architecture.
- Experience with application lifecycle management and deployment strategies.
- Familiarity with data integration and ETL processes.
- Knowledge of security best practices in cloud environments.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 18 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: Microsoft Azure Data Services
Good-to-Have Skills: NA
Minimum Experience Required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also be responsible for maintaining communication with stakeholders to provide updates and gather feedback, ensuring that the applications meet the required specifications and quality standards. Your role will be pivotal in driving the success of the projects you oversee, fostering a collaborative environment, and mentoring team members to enhance their skills and performance.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training sessions to enhance team capabilities and knowledge sharing.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Microsoft Azure Data Services.
- Strong understanding of cloud computing principles and architecture.
- Experience with data integration and ETL processes.
- Familiarity with application development frameworks and methodologies.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 18 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: Microsoft SQL Server, Firewall, EPO
Good-to-Have Skills: NA
Minimum Experience Required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the needs of stakeholders effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Microsoft SQL Server.
- Strong understanding of database design and management.
- Experience with performance tuning and optimization of SQL queries.
- Familiarity with data integration and ETL processes.
- Ability to troubleshoot and resolve database-related issues.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft SQL Server.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 18 hours ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

🔍 Job Title: Treasure Data Engineer (6+ Years Experience)
📍 Location: PAN India (Hybrid as per project needs)
🕒 Experience Required: 6+ Years

📢 We're Hiring!
We are looking for an experienced Treasure Data Engineer with 6+ years of relevant experience in building, maintaining, and optimizing large-scale customer data platforms (CDPs) using Treasure Data. This is an exciting opportunity to work with a leading organization on cutting-edge data engineering and marketing tech solutions.

✅ Required Skills:
- 6+ years of experience in data engineering, with at least 2+ years of hands-on experience with Treasure Data/CDPs
- Strong knowledge of SQL, Python/JavaScript, and data integration best practices
- Experience working with Treasure Workflow, Data Connectors, Segmentations, and Audience Building
- Experience integrating data from various sources like Salesforce, Google Analytics, Adobe, etc.
- Knowledge of ETL pipelines, data quality, and customer data activation
- Familiarity with cloud platforms (AWS/GCP) and marketing automation tools is a plus

🎯 Responsibilities:
- Design and implement workflows, pipelines, and data transformations in Treasure Data
- Collaborate with cross-functional teams to integrate customer data sources
- Optimize performance of existing workflows and queries
- Support end-users in audience building, data analysis, and campaign execution
- Ensure data accuracy, security, and compliance

Posted 19 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: mSQL, SQL Writing, PL/SQL
Education: Graduate
Note: This is a requirement for one of the Workassist Hiring Partners.

Primary Responsibility: Collect, clean, and analyze data from various sources. Assist in creating dashboards, reports, and visualizations.

We are looking for a SQL Developer Intern to join our team remotely. As an intern, you will work with our database team to design, optimize, and maintain databases while gaining hands-on experience in SQL development. This is a great opportunity for someone eager to build a strong foundation in database management and data analysis. This is a remote position.

Responsibilities:
- Write, optimize, and maintain SQL queries, stored procedures, and functions.
- Assist in designing and managing relational databases.
- Perform data extraction, transformation, and loading (ETL) tasks.
- Ensure database integrity, security, and performance.
- Work with developers to integrate databases into applications.
- Support data analysis and reporting by writing complex queries.
- Document database structures, processes, and best practices.

Requirements:
- Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
- Strong understanding of SQL and relational database concepts.
- Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle.
- Ability to write efficient and optimized SQL queries.
- Basic knowledge of indexing, stored procedures, and triggers.
- Understanding of database normalization and design principles.
- Good analytical and problem-solving skills.
- Ability to work independently and in a team in a remote setting.

Company Description:
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
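The internship responsibilities above center on writing SQL queries and basic ETL. The following self-contained Python/sqlite3 sketch is an illustrative example only (not part of the listing): it extracts rows from a hypothetical staging table, applies a small transformation, and loads the result into a target table.

```python
# Tiny extract-transform-load example against an in-memory SQLite database.
# Table and column names are made up for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: pretend this staging table was populated by an upstream system.
cur.execute("CREATE TABLE staging_orders (order_id INTEGER, amount TEXT, country TEXT)")
cur.executemany(
    "INSERT INTO staging_orders VALUES (?, ?, ?)",
    [(1, "120.50", "in"), (2, "75.00", "us"), (3, None, "in")],
)

# Transform: cast amounts to float, normalize country codes, drop bad rows.
rows = cur.execute("SELECT order_id, amount, country FROM staging_orders").fetchall()
clean = [
    (order_id, float(amount), country.upper())
    for order_id, amount, country in rows
    if amount is not None
]

# Load: write the cleaned rows into the target table and index it.
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, country TEXT)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
cur.execute("CREATE INDEX idx_orders_country ON orders(country)")
conn.commit()

print(cur.execute("SELECT country, SUM(amount) FROM orders GROUP BY country").fetchall())
```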

Posted 21 hours ago

Apply

0.0 - 5.0 years

0 - 0 Lacs

Chennai, Tamil Nadu

On-site

Job Description – ODI Developer
Location: Equitas Office, Backside Vikatan Office, 757, Vasan Ave, Anna Salai, Thousand Lights, Chennai, Tamil Nadu 600002
Job Type: Full-Time
Experience: 5+ years

Job Summary:
We are hiring a Lead Data Engineer to architect and lead enterprise data integration initiatives. This role requires deep technical expertise in data engineering and leadership experience. Familiarity with Oracle Data Integrator (ODI) is preferred, especially in environments using the Oracle stack.

Key Responsibilities:
- Architect and oversee the implementation of scalable, reliable data pipelines.
- Define standards and best practices for data integration and ETL development.
- Lead a team of data engineers and mentor junior staff.
- Collaborate with stakeholders to understand business data needs and translate them into technical solutions.
- Ensure adherence to data governance, security, and compliance requirements.

Requirements:
- 5+ years of experience in data engineering, including team leadership roles.
- Deep knowledge of ETL architecture and data integration frameworks.
- Experience with any ETL tool (ODI is mandatory).
- Strong SQL, data modeling, and performance tuning skills.
- Experience with cloud data platforms and modern data architectures.
- Excellent leadership, communication, and stakeholder management skills.
- Knowledge of real-time or near-real-time data streaming (e.g., Kafka).

Job Type: Full-time
Pay: ₹12,817.62 - ₹60,073.88 per month
Benefits: Health insurance, Provident Fund
Experience: 5 years (Preferred)
Location: Chennai, Tamil Nadu (Required)
Work Location: In person

Posted 21 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: mSQL, SQL Writing, PL/SQL
Education: Graduate
Note: This is a requirement for one of the Workassist Hiring Partners.

🎯 Role Overview
This is a remote position.
- Write, optimize, and maintain SQL queries, stored procedures, and functions.
- Assist in designing and managing relational databases.
- Perform data extraction, transformation, and loading (ETL) tasks.
- Ensure database integrity, security, and performance.
- Work with developers to integrate databases into applications.
- Support data analysis and reporting by writing complex queries.
- Document database structures, processes, and best practices.

Company Description:
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 21 hours ago

Apply

6.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Job Title: Data Engineer (GCP)
Employment Type: Full-time
Work Mode: Work from Office
Experience Required: 6+ Years
Location: Ahmedabad / Gurugram
Timing: General

About the Role
We are looking for an experienced Data Engineer to design, build, and optimize data systems. The ideal candidate will have strong expertise in Python, SQL, and cloud platforms, along with a passion for solving complex data challenges.

Key Responsibilities
- Provide business analytics support to management.
- Analyze business results and design data collection studies.
- Build and maintain data pipelines and ETL processes using Python.
- Collaborate with analysts and data scientists to ensure data quality.
- Optimize database performance (indexing, partitioning, query optimization).
- Implement data governance and security measures.
- Monitor and troubleshoot data pipelines, ensuring validation and accuracy.
- Maintain documentation for workflows and processes.

Skills Required
- Proficiency in Python and SQL.
- Experience with relational databases (MySQL, PostgreSQL, SQL Server).
- Knowledge of data modeling, data warehousing, and data architecture.
- Experience with cloud platforms (GCP).
- Proficiency in Google Cloud Platform (BigQuery, GCS).
- Familiarity with version control (Git).

What We Offer
- Competitive salary and industry-standard benefits.
- Opportunity to earn stock options in the near future.
- Career growth in cloud technologies with certifications in GCP.
- A chance to be part of a fast-growing and innovative team.
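Because the role above combines Python ETL with BigQuery, here is a minimal, hedged sketch of one batch ETL step using the google-cloud-bigquery client. It is illustrative only: the project, dataset, table names, and columns are placeholders, and running it requires GCP credentials plus the pandas/pyarrow extras of the BigQuery library.

```python
# Illustrative batch ETL step with the google-cloud-bigquery client.
# Project, dataset, and table names are placeholders.
import pandas as pd
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

# Extract: pull yesterday's raw events from a hypothetical source table.
raw = client.query(
    "SELECT user_id, event_ts, amount "
    "FROM `example-project.raw.events` "
    "WHERE DATE(event_ts) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)"
).to_dataframe()

# Transform: derive a partition column and aggregate per user.
raw["event_date"] = pd.to_datetime(raw["event_ts"]).dt.date
daily = raw.groupby(["event_date", "user_id"], as_index=False)["amount"].sum()

# Load: append into a day-partitioned reporting table.
job_config = bigquery.LoadJobConfig(
    write_disposition="WRITE_APPEND",
    time_partitioning=bigquery.TimePartitioning(field="event_date"),
)
client.load_table_from_dataframe(
    daily, "example-project.reporting.daily_user_amount", job_config=job_config
).result()
```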

Posted 22 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media. Our client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job Title: Python Developer with SQL and ETL
Key Skills: Python with SQL, PySpark, Databricks, ETL
Job Locations: Hyderabad, Pune, Bengaluru
Experience: 6-8 years
Education Qualification: Any graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate

Job Responsibilities:
- The candidate should have 4 or more years of experience in Python development with SQL.
- Understanding of PySpark and Databricks.
- Passionate about ETL development and problem solving.
- Quickly learns new data tools and ideas.
- Proficient in Python with SQL, PySpark, Databricks, and ETL; AWS knowledge would be an added advantage.
- The candidate should be well aware of data ways of working.
- Knowledge of different application development approaches, with an understanding of data backgrounds.
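Since the role above asks for Python with SQL, PySpark, and Databricks experience, here is a small, generic PySpark batch transformation sketch for illustration only; the file paths, column names, and query are hypothetical and not taken from the listing.

```python
# Minimal PySpark batch ETL sketch: read a CSV, clean it with Spark SQL, write Parquet.
# Paths and column names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("batch-etl-sketch").getOrCreate()

orders = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/tmp/raw/orders.csv")  # placeholder input path
)

# Expose the DataFrame to Spark SQL and apply the transformation as a query.
orders.createOrReplaceTempView("orders")
cleaned = spark.sql("""
    SELECT order_id,
           UPPER(country)         AS country,
           CAST(amount AS DOUBLE) AS amount
    FROM orders
    WHERE amount IS NOT NULL
""")

# Load: write the cleaned data partitioned by country.
cleaned.write.mode("overwrite").partitionBy("country").parquet("/tmp/curated/orders")
```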

Posted 22 hours ago

Apply

4.0 - 8.0 years

0 Lacs

Punjab

On-site

The Associate Manager - BBS Analytics will be responsible for building Tableau Analytics Dashboards for multiple Global Internal Financial Controls Metrics. You will work with teams within Bunge Business Services to enable full visibility of Bunge's Internal Financial Controls. Your primary task will be to transform business and process data into actionable insights for business disclosures, decisions, and opportunities using data engineering and visualization tools, with a focus on expertise in visualization tool Tableau and Oracle SQL. You will be responsible for designing and delivering various reports, standard Tableau dashboards, ad hoc reports, templates, scorecards, and metrics to drive insights focused on business issues and priorities. Additionally, you will implement and automate business needs on the Online Business Intelligence tool for real-time Control effectiveness and efficiency analytics. It will be crucial for you to understand all aspects of Bunge's Control Metrics, especially reporting and compliance needs. You will collaborate with various stakeholders both internally and externally, with a strong emphasis on building partnerships and appropriately influencing to gain commitment. In this role, you will drive results through high standards, focus on key priorities, organization, and preparing others for change. Your technical skills should encompass a strong working knowledge of Accounting, ESG, Procurement, Agri contracts, SAP FICO/SD/MM, with business process knowledge of Finance Operations, business intelligence/reporting, data analysis and visualization. Additionally, you should have detailed knowledge and experience in BI, Reporting, Analysis, Data Visualization, and Visual Storytelling. The ability to make complex data science models and statistical inferences information clear and actionable will be essential. You should have extensive understanding of Controls Processes, Performance Metrics, and Governance, with significant experience driving large projects to successful completion. Being an Agile Practitioner and having Design Thinking expertise will be advantageous. Strong communication and presentation skills, collaboration skills, and integrity to hold self and others accountable to deliver against commitments are important attributes for this role. You will lead client engagements and oversee work-streams related to PTP, OTC, RTR. Additionally, you will develop solutions to customer challenges, identify gaps, and areas of improvement for dashboard building. Your responsibilities will include gathering requirements from functional stakeholders, conducting UAT with business users, working with Ops team to deploy the use case in production, and engaging with operations team to streamline and improve technical environment, access provisioning, and reporting processes. Managing engagement economics, project resources, team utilization, and delivering high-quality deliverables will be part of your role. You should have a strong competency in Tableau, Oracle, Python, R, MS Excel & PowerPoint and working knowledge of other enabling tools for a business services command center. Competencies in Data Analytics and Big Data tools and platforms will be beneficial. A relevant experience of 4 to 8 years with a Masters in Business Analytics, Finance, ESG, or Data Science from a premier institute/university will be preferred. Bunge (NYSE: BG) is a world leader in sourcing, processing, and supplying oilseed and grain products and ingredients. 
Founded in 1818, Bunge's expansive network feeds and fuels a growing world, creating sustainable products and opportunities for more than 70,000 farmers and the consumers they serve across the globe. The company is headquartered in St. Louis, Missouri, and has 25,000 employees worldwide who stand behind more than 350 port terminals, oilseed processing plants, grain facilities, and food and ingredient production and packaging facilities around the world.

Posted 23 hours ago

Apply

4.0 - 8.0 years

0 Lacs

Maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your role will involve 4-5 years of ETL testing and data validation. You should be experienced in ETL Testing, Requirement Analysis, Test Planning, Data Validation, and Defect Tracking. Proficiency in SQL is required for writing complex queries to validate data. Additionally, knowledge of ETL tools, experience with data warehousing concepts and methodologies, and strong analytical and problem-solving skills are desirable. Exposure to Agile/Scrum methodologies and experience with AWS, PySpark, Databricks, or any other cloud-based test execution would be beneficial.

Your profile should include experience in ETL Testing, Requirement Analysis, Test Planning, Data Validation, and Defect Tracking. Proficiency in SQL for writing complex queries to validate data is essential.

At Capgemini, you will have the opportunity to make a difference for the world's leading businesses or for society. You will receive the support needed to shape your career in a way that works for you. When the future doesn't look as bright as you'd like, you will have the opportunity to make a change and rewrite it. By joining Capgemini, you become part of a diverse collective of free-thinkers, entrepreneurs, and experts all working together to unleash human energy through technology for an inclusive and sustainable future. Capgemini values its people and offers extensive Learning & Development programs for career growth. The work environment is inclusive, safe, healthy, and flexible to bring out the best in you. You can also take an active role in Corporate Social Responsibility and Sustainability initiatives to make a positive social change and build a better world.

Capgemini is a global business and technology transformation partner with over 55 years of heritage, trusted by clients to unlock the value of technology and address their business needs. The company has a diverse group of 340,000 team members in more than 50 countries. Capgemini delivers end-to-end services and solutions leveraging AI, cloud, data, and deep industry expertise to create tangible impact for enterprises and society. The Group reported 2023 global revenues of €22.5 billion.

Skills required for this role include SQL, ETL, Python, and Scala.
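The role above centers on ETL testing and SQL-based data validation. Below is a generic, illustrative reconciliation check, not taken from the posting: it compares row counts and a column checksum between a hypothetical source and target table, with Python's built-in sqlite3 module standing in for real databases.

```python
# Illustrative source-vs-target reconciliation for ETL testing.
# sqlite3 stands in for the real databases; table names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Set up a pretend source table and the target table an ETL job loaded.
cur.execute("CREATE TABLE src_sales (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_sales (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_sales VALUES (?, ?)", [(1, 10.0), (2, 20.5), (3, 30.0)])
cur.executemany("INSERT INTO tgt_sales VALUES (?, ?)", [(1, 10.0), (2, 20.5), (3, 30.0)])

def profile(table: str) -> tuple:
    """Return (row_count, amount_sum) for a table as a simple fingerprint."""
    count = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    total = cur.execute(f"SELECT ROUND(SUM(amount), 2) FROM {table}").fetchone()[0]
    return count, total

src, tgt = profile("src_sales"), profile("tgt_sales")
for name, s, t in [("row_count", src[0], tgt[0]), ("amount_sum", src[1], tgt[1])]:
    status = "PASS" if s == t else "FAIL"
    print(f"{name}: source={s} target={t} -> {status}")
```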

Posted 23 hours ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Applications Development Senior Programmer Analyst position is an intermediate-level role where you will be responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities.

Your responsibilities will include conducting tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will also be required to monitor and control all phases of the development process, provide user and operational support on applications to business users, and recommend and develop security measures in post-implementation analysis. As the Applications Development Senior Programmer Analyst, you will utilize in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business and system processes, recommend advanced programming solutions, and ensure that essential procedures are followed. Additionally, you will serve as an advisor or coach to new or lower-level analysts, operate with a limited level of direct supervision, and act as a subject matter expert to senior stakeholders and other team members.

To qualify for this role, you should have 8-12 years of relevant experience in systems analysis and programming of software applications, managing and implementing successful projects, and a working knowledge of consulting/project management techniques and methods. You should also have the ability to work under pressure, manage deadlines, and adapt to unexpected changes in expectations or requirements. A Bachelor's degree or equivalent experience is required for this position.

In addition to the general job description, the ideal candidate should have 8 to 12 years of application development experience through the full lifecycle, with expertise in UI architecture patterns such as Micro Frontend and NX. Proficiency in Core Java/J2EE applications, data structures, algorithms, Hadoop, the MapReduce framework, Spark, YARN, and other relevant technologies is essential. Experience with the big data Spark ecosystem, ETL, BI tools, agile environments, test-driven development, and optimizing software solutions for performance and stability is also preferred.

This job description provides an overview of the responsibilities and qualifications for the Applications Development Senior Programmer Analyst role. Other job-related duties may be assigned as required.

Posted 23 hours ago

Apply

5.0 years

0 Lacs

Haryana, India

On-site

Senior Data Engineer (C11) Analytics & Information Management (AIM), Gurugram Excited to grow your career? We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position, you see is right for you, we encourage you to apply! Our people make all the difference in our success. We are seeking a highly experienced and strategic Officer – Sr. Data Engineer for Data/Information Management Team. The ideal candidate will be responsible for development and implementation of data analytics solutions to support key business objectives for Legal Operations as part of COO (Chief Operating Office). This role requires proven track record of implementing optimized data processes/platforms, delivering impactful insights, and fostering a data-driven culture. ------------------------------------------------------ The Data/Information Analyst accomplishes results by contributing significantly to the bank's success by leveraging data engineering & solution design skills within specialized domain. Integrates subject matter and industry expertise within a defined area. Contributes to standards around which others will operate. Requires in-depth understanding of how areas collectively integrate within the sub-function as well as coordinate and contribute to the objectives of the entire function. Requires basic commercial awareness. Developed communication and diplomacy skills are required in order to guide, influence and convince others, in particular colleagues in other areas and occasional external customers. Has responsibility for volume, quality, timeliness and delivery of end results of an area. Responsibilities: Incumbents would be primarily responsible for supporting Business Execution activities Chief Operating Office, implement data engineering solutions to manage banking operations. Establish monitoring routines, scorecards and escalation workflows Oversee the Data Strategy, Smart Automation, Insight Generation, Data Quality and Reporting activities using proven analytical techniques. Responsible for documenting data requirements, data collection / processing / cleaning, which may include Process Automation / Optimization and data visualization techniques. Enable proactive issue detection, escalation workflows, and alignment with firmwide Data Related policies, Implement a governance framework with clear stewardship roles and data quality controls Interface between business and technology partners for digitizing data collection, including performance generation, validation rules for banking operations. Build Data Strategy by identifying all relevant product processors, create Data Lake, Data Pipeline, Governance & Reporting Communicate findings and recommendations to senior management. Stay current with the latest trends and technologies in analytics. Ensure compliance with data governance policies and regulatory requirements. Setup a governance operating framework to enable operationalization of data domains, identify CDEs and Data Quality rules. Align with Citi Data Governance Policies and firmwide Chief Data Office expectations Incumbents work with large and complex data sets (both internal and external data) to evaluate, recommend, and support the implementation of business strategies like Centralized data repository with standardized definitions and scalable data pipes Identifies and compiles data sets using a variety of tools (e.g. 
SQL, Access) to help predict, improve, and measure the success of key business to business outcomes. Implement rule-based Data Quality checks across critical data points. Automate alerts for breaks and publish periodic quality reports Incumbents in this role may often be referred to as Data Analyst. Develop and execute the analytics strategy – Data Ingestion, Reporting / Insights Centralization, Ensure consistency, lineage tracking, and audit readiness across legal reporting Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency, as well as effectively supervise the activity of others and create accountability with those who fail to maintain these standards. Work as a senior member in a team of data engineering professionals, working with them to deliver on organizational priorities Qualifications: 5+ years of experience in Business Transformation Solution Design roles with proficiency in tools/technologies like Python, PySpark, Tableau, MicroStrategy, SQL etc. Strong understanding of Data Transformation – Data Strategy, Data Architecture, Data Tracing & Lineage (ability to trace data lineage from source systems to data warehouse to reports and dashboards), Scalable Data Flow Design and Standardization, Platform Integration, ETL & Smart Automation Conceptual, logical, and physical data modeling expertise. Proficiency in relational and dimensional data modeling techniques. Ability and experience in designing data warehouses, integrated data marts, and optimized reporting schemas that cater to multiple BI tools Database Management & Optimization. Expertise in database performance tuning and optimization for data enrichment and integration, reporting and dashboarding Strong understanding of data platforms / ecosystem, establish a scalable data management framework – data provisioning, process optimization, actionable insights, visualization techniques using Tableau Solution Architect with proven ability to translate complex data flows into automated & optimized solutions. Ability to leverage data analytics tools & techniques for analytics problem solving for organizational needs Experience in Developing and Deploying AI solutions in partnership with Tech and Business Experience with any banking operations (e.g., expense analytics, movement of funds, cash flow management, fraud analytics, ROI). 
Knowledge of regulatory requirements related to data privacy and security Experience in interacting with senior stakeholders across the organization to be able to manage end-to-end conceptualization & implementation of data strategies - standardization data structures, identify and remove redundancies to optimize data feeds AI / Gen AI proficiency and thought leadership in Financial/Business Analysis and/or credit/risk analysis with ability to impact key business drivers via a disciplined analytic process Demonstrate Analytics thought leadership skills & project planning capabilities In-depth understanding of the various financial service business models, expert knowledge of advanced statistical techniques and how to apply the techniques to drive substantial business results Creative problem-solving skills Education: Bachelors/University degree in STEM, Master’s degree preferred This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. ------------------------------------------------------ Time Type :Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Job Level :C11 ------------------------------------------------------ Job Family Group: Decision Management ------------------------------------------------------ Job Family: Data/Information Management ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills MicroStrategy, Python (Programming Language), Structured Query Language (SQL), Tableau (Software). ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
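The posting above calls for rule-based data quality checks on critical data points with automated alerts. As an illustration only (not Citi's actual tooling), the sketch below runs a few invented rules over a pandas DataFrame and flags failures the way a simple alerting hook might; all rules, thresholds, and column names are assumptions.

```python
# Illustrative rule-based data quality checks on a pandas DataFrame.
# The rules, thresholds, and column names are invented for the sketch.
import pandas as pd

df = pd.DataFrame({
    "account_id": [101, 102, 103, 103],
    "balance":    [2500.0, -50.0, None, 1200.0],
    "currency":   ["USD", "USD", "EUR", "USD"],
})

rules = [
    ("account_id is unique",    lambda d: d["account_id"].is_unique),
    ("balance has no nulls",    lambda d: d["balance"].notna().all()),
    ("balance is non-negative", lambda d: (d["balance"].dropna() >= 0).all()),
    ("currency in allowed set", lambda d: d["currency"].isin({"USD", "EUR", "INR"}).all()),
]

failures = []
for name, check in rules:
    passed = bool(check(df))
    print(f"{'PASS' if passed else 'FAIL'}: {name}")
    if not passed:
        failures.append(name)

# In a real pipeline a non-empty failure list would trigger an alert
# (email, ticket, or dashboard flag) rather than a simple print.
if failures:
    print(f"ALERT: {len(failures)} data quality rule(s) failed: {failures}")
```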

Posted 1 day ago

Apply

5.0 years

0 Lacs

Haryana, India

On-site

Excited to grow your career? We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position, you see is right for you, we encourage you to apply! Our people make all the difference in our success. We are seeking a highly experienced and strategic Officer – Sr. Data Engineer for Data/Information Management Team. The ideal candidate will be responsible for development and implementation of data analytics solutions to support key business objectives for Legal Operations as part of COO (Chief Operating Office). This role requires proven track record of implementing optimized data processes/platforms, delivering impactful insights, and fostering a data-driven culture. ------------------------------------------------------ The Data/Information Analyst accomplishes results by contributing significantly to the bank's success by leveraging data engineering & solution design skills within specialized domain. Integrates subject matter and industry expertise within a defined area. Contributes to standards around which others will operate. Requires in-depth understanding of how areas collectively integrate within the sub-function as well as coordinate and contribute to the objectives of the entire function. Requires basic commercial awareness. Developed communication and diplomacy skills are required in order to guide, influence and convince others, in particular colleagues in other areas and occasional external customers. Has responsibility for volume, quality, timeliness and delivery of end results of an area. Responsibilities: Incumbents would be primarily responsible for supporting Business Execution activities Chief Operating Office, implement data engineering solutions to manage banking operations. Establish monitoring routines, scorecards and escalation workflows Oversee the Data Strategy, Smart Automation, Insight Generation, Data Quality and Reporting activities using proven analytical techniques. Responsible for documenting data requirements, data collection / processing / cleaning, which may include Process Automation / Optimization and data visualization techniques. Enable proactive issue detection, escalation workflows, and alignment with firmwide Data Related policies, Implement a governance framework with clear stewardship roles and data quality controls Interface between business and technology partners for digitizing data collection, including performance generation, validation rules for banking operations. Build Data Strategy by identifying all relevant product processors, create Data Lake, Data Pipeline, Governance & Reporting Communicate findings and recommendations to senior management. Stay current with the latest trends and technologies in analytics. Ensure compliance with data governance policies and regulatory requirements. Setup a governance operating framework to enable operationalization of data domains, identify CDEs and Data Quality rules. Align with Citi Data Governance Policies and firmwide Chief Data Office expectations Incumbents work with large and complex data sets (both internal and external data) to evaluate, recommend, and support the implementation of business strategies like Centralized data repository with standardized definitions and scalable data pipes Identifies and compiles data sets using a variety of tools (e.g. SQL, Access) to help predict, improve, and measure the success of key business to business outcomes. 
Implement rule-based Data Quality checks across critical data points. Automate alerts for breaks and publish periodic quality reports Incumbents in this role may often be referred to as Data Analyst. Develop and execute the analytics strategy – Data Ingestion, Reporting / Insights Centralization, Ensure consistency, lineage tracking, and audit readiness across legal reporting Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency, as well as effectively supervise the activity of others and create accountability with those who fail to maintain these standards. Work as a senior member in a team of data engineering professionals, working with them to deliver on organizational priorities Qualifications: 5+ years of experience in Business Transformation Solution Design roles with proficiency in tools/technologies like Python, PySpark, Tableau, MicroStrategy, SQL etc. Strong understanding of Data Transformation – Data Strategy, Data Architecture, Data Tracing & Lineage (ability to trace data lineage from source systems to data warehouse to reports and dashboards), Scalable Data Flow Design and Standardization, Platform Integration, ETL & Smart Automation Conceptual, logical, and physical data modeling expertise. Proficiency in relational and dimensional data modeling techniques. Ability and experience in designing data warehouses, integrated data marts, and optimized reporting schemas that cater to multiple BI tools Database Management & Optimization. Expertise in database performance tuning and optimization for data enrichment and integration, reporting and dashboarding Strong understanding of data platforms / ecosystem, establish a scalable data management framework – data provisioning, process optimization, actionable insights, visualization techniques using Tableau Solution Architect with proven ability to translate complex data flows into automated & optimized solutions. Ability to leverage data analytics tools & techniques for analytics problem solving for organizational needs Experience in Developing and Deploying AI solutions in partnership with Tech and Business Experience with any banking operations (e.g., expense analytics, movement of funds, cash flow management, fraud analytics, ROI). 
Knowledge of regulatory requirements related to data privacy and security Experience in interacting with senior stakeholders across the organization to be able to manage end-to-end conceptualization & implementation of data strategies - standardization data structures, identify and remove redundancies to optimize data feeds AI / Gen AI proficiency and thought leadership in Financial/Business Analysis and/or credit/risk analysis with ability to impact key business drivers via a disciplined analytic process Demonstrate Analytics thought leadership skills & project planning capabilities In-depth understanding of the various financial service business models, expert knowledge of advanced statistical techniques and how to apply the techniques to drive substantial business results Creative problem-solving skills Education: Bachelors/University degree in STEM, Master’s degree preferred This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. ------------------------------------------------------ Time Type :Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Job Level :C11 ------------------------------------------------------ Job Family Group: Decision Management ------------------------------------------------------ Job Family: Data/Information Management ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills MicroStrategy, PySpark, Python (Programming Language), Structured Query Language (SQL). ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 1 day ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

The Finance Data & Insights Team is an agile product team responsible for the development, production, and transformation of Financial data and reporting across Consumer and Community Banking. The vision of the team is to enhance the lives of the people and increase value to the firm by harnessing the power of data and utilizing the best tools to analyze data, generate insights, save time, improve processes & control, and lead the organization in developing skills for the future. The overall product objectives include constructing a data environment that enables cross-business, product, customer-centric decision making and reporting needs across Consumer and Community Banking in a consistent framework, creating an ecosystem of dashboards sourced from authoritative data sources to replace manual management reporting, and eliminating user tools to save time and increase the ability to generate insights through data and dashboards. As a Data Domain Architect in CBB- field performance area, you will play a key role in bringing transformation in how Financial, Operational, and Behavioral Data are reported and analyzed across Consumer Banking and Business Banking within CCB. Your responsibilities will include but are not limited to discovering, sourcing, designing, and delivering data domains into the Databricks powered Data Mart, enabling the Finance function to support their analytical and reporting needs. You will need to understand the needs of Finance and make data discoverable and available for analytical and reporting purposes. Job Responsibilities: - Conduct comprehensive data discoveries, sourcing, and maintenance of Financial and Operational data to support Field Performance Reporting and analytical needs. - Develop detailed Data Requirement documentation aligned with overall data strategies and models. - Collaborate with the Technology team to develop and test data wrangling workflows, ensuring validation of business logic and outcomes. - Perform integration and regression testing of data components, ensuring compliance and control measures. - Proactively identify and resolve issues/challenges, highlighting potential risks to leadership. - Engage closely with end-users and IT during the UAT phase to validate that production results meet business requirements. - Serve as a subject matter expert in relevant areas, providing support and guidance to other team members. Required qualifications, capabilities, and skills: - Bachelor's degree in MIS, Computer Science, Mathematics, Engineering, Statistics, or a related quantitative field. - Over 10 years of experience in financial solutions, data engineering, data science, or business intelligence within the financial services domain. - Proven experience in building data models that accurately represent business requirements and ensure data integrity, with a strong understanding of data governance principles. - Expertise in database queries including SQL and NoSQL, proficient in ETL techniques. - Solid understanding of data warehousing concepts, design principles, reporting development, and testing. - Deep industry or business domain knowledge relevant to the organization, proficiency in tools such as Databricks/Snowflake, Alteryx, Tableau/ThoughtSpot. - Awareness of technologies and frameworks for handling large data volumes and familiarity with analytical tools. - Ability to think beyond raw data, understand the business context, and identify business opportunities within data. 
- Strong written and oral communication skills, ability to engage stakeholders across technology, data, and business functions. - Capacity to solve data-related challenges, anticipate future needs, and meet tight deadlines. - Must be able to work physically in the Bangalore office 4 days a week with the option to work remotely from home 1 day per week. Join us to leverage your expertise and drive data-driven insights and solutions in a dynamic and collaborative environment. If you are passionate about data architecture and eager to make a significant impact, we encourage you not to miss this opportunity.

Posted 1 day ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Are you a driven individual looking to jumpstart your career in General Management? Look no further! AMRR TechSols Pvt Ltd is seeking a talented intern with expertise in MS Office, MS Excel, and strong English proficiency. As a General Management intern, you will have the opportunity to work closely with our leadership team and gain invaluable hands-on experience in various aspects of business operations. Join us and unleash your potential in a dynamic and fast-paced environment!

Responsibilities:
- Assist in creating and analyzing reports using MS Excel.
- Support day-to-day administrative tasks.
- Communicate effectively with team members and clients.
- Coordinate meetings and take detailed meeting minutes.
- Assist in developing and implementing business strategies.
- Conduct research on industry trends and competitors.
- Collaborate with different departments to ensure smooth business operations.

If you are a proactive and ambitious individual with a passion for business management, we want to hear from you! Apply now and take the first step towards a successful career with AMRR TechSols Pvt Ltd.

About Company:
AMRR TechSols Pvt Ltd is a Bengaluru-based technology company founded in June 2022, specializing in delivering scalable, ready-to-integrate development teams for startups, growing businesses, and enterprises. The company offers customized solutions across a wide range of domains, including web and mobile development, AI and MLOps, cloud and DevOps, and ETL and data science. With expertise in MEAN, MERN, FastAPI, React, and Flutter, AMRR builds robust and scalable applications. Their AI and MLOps capabilities span PyTorch, TensorFlow, Scikit-Learn, MLflow, and Kubernetes, enabling advanced machine learning implementations. Leveraging tools like AWS, Azure, GCP, Docker, Kubernetes, and GitHub Actions, the company ensures streamlined deployments and enhanced scalability for its clients.

Posted 1 day ago

Apply

3.0 - 5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

About The Advanced Analytics Team: The central Advanced Analytics (AA) team at the Abbott Established Pharma Division's (EPD) headquarters in Basel helps define and lead the transformation towards becoming a global, data-driven company with the help of data and advanced technologies (e.g., machine learning, deep learning, generative AI, computer vision). To us, Advanced Analytics is an important lever for reaching our business targets, now and in the future; it helps differentiate us from our competition and ensures sustainable revenue growth at optimal margins. The central AA team is therefore an integral part of the Strategy Management Office at EPD, with a very close link and regular interactions with the EPD Senior Leadership Team.

Primary Job Function: With the above in mind, EPD is looking to fill the role of Cloud Engineer, reporting to the Head of AA Product Development. The Cloud Engineer will be responsible for developing applications leveraging AWS services. This role involves leading cloud initiatives, ensuring robust cloud infrastructure, and driving innovation in cloud technologies to support the business's advanced analytics needs.

Core Job Responsibilities:
- Support the development and maintenance of company-wide frameworks and libraries that enable faster, better, and more informed decision-making within the business, creating significant business value from data and analytics.
- Ensure data availability and accessibility for the prioritized Advanced Analytics scope, and maintain stable, scalable, and modular data science pipelines from data exploration to deployment.
- Acquire, ingest, and process data from multiple sources and systems into our cloud platform (AWS), ensuring data integrity and security.
- Collaborate with data scientists to map data fields to hypotheses, and curate, wrangle, and prepare data for advanced analytical models.
- Implement and manage robust security measures to ensure compliant handling and management of data, including access strategies aligned with Information Security, Cyber Security, and Data Privacy principles.
- Develop and deploy smart automation tools based on cloud technologies, aligned with business priorities and needs.
- Oversee the timely delivery of Advanced Analytics solutions in coordination with the rest of the team, per requirements and timelines, ensuring alignment with business goals.
- Collaborate closely with the Data Science team and AI Engineers to understand platform needs and lead the development of solutions that support their work.
- Troubleshoot and resolve issues related to the AWS platform, ensuring minimal downtime and optimal performance.
- Define and document best practices and strategies for application deployment and infrastructure maintenance.
- Drive continuous improvement of the AWS cloud platform by contributing and implementing new ideas and processes.

Supervisory/Management Responsibilities: Direct Reports: None. Indirect Reports: None.

Position Accountability/Scope: The Cloud Engineer is accountable for delivering targeted business impact per initiative in collaboration with key stakeholders. The role carries significant responsibility for the architecture and management of Abbott's strategic cloud platforms and AI/AA programs, enabling faster, better, and more informed decision-making within the business.
Minimum Education: Master's degree in a relevant field (e.g., computer science, electrical engineering).

Minimum Experience/Training Required:
- At least 3-5 years of relevant experience, with a strong track record of building solutions and applications using AWS services.
- Proven ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets.
- Proficiency in multiple programming languages: JavaScript, Python, Scala, PySpark, or Java.
- Extensive knowledge and experience with various database technologies, including distributed processing frameworks, relational databases, MPP databases, and NoSQL data stores.
- Deep understanding of information security principles to ensure compliant handling and management of data.
- Significant experience with cloud platforms, preferably AWS and its ecosystem.
- Advanced knowledge of development in CI/CD (continuous integration and continuous delivery) environments.
- Strong background in data warehousing and ETL tools.
- Proficiency in DevOps practices and tools such as Jenkins, Terraform, etc.
- Proficiency in serverless architecture and services like AWS Lambda.
- Understanding of security best practices and their implementation in cloud environments.
- Ability to understand business objectives and create cloud-based solutions to meet them.
- Results-driven, analytical, and creative thinker.
- Proven ability to work with cross-functional teams and bridge the gap between business and data science.
- Fluency in English is a must; additional languages are a plus.

Additional Technical Skills:
- Experience with front-end frameworks, preferably React JS.
- Knowledge of back-end frameworks like Django, Flask, or Node.js.
- Familiarity with database technologies such as Redshift, MySQL, or DynamoDB.
- Understanding of RESTful API design and development.
- Experience with version control systems like CodeCommit.
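As a hedged illustration of the serverless ingestion work this role mentions (AWS Lambda alongside S3), here is a minimal sketch. The event shape assumes an S3 trigger, and the bucket layout (landing/ and curated/ prefixes) is an assumption, not Abbott's actual architecture.

```python
# Hedged sketch: bucket names, prefixes, and the trigger shape are assumptions.
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Copy newly landed objects into a curated prefix for downstream analytics."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Keep raw data immutable; write a copy under a curated prefix.
        curated_key = key.replace("landing/", "curated/", 1)
        s3.copy_object(
            Bucket=bucket,
            Key=curated_key,
            CopySource={"Bucket": bucket, "Key": key},
            ServerSideEncryption="AES256",  # basic encryption at rest
        )
    return {"statusCode": 200, "body": json.dumps("ingestion step complete")}
```

In a real deployment, access to the buckets would be governed by IAM policies aligned with the information security principles listed above.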

Posted 1 day ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Who we are: Johnson Controls is the global leader for smart, healthy and sustainable buildings. At Johnson Controls, we've been making buildings smarter since 1885, and our capabilities, depth of innovation experience, and global reach have been growing ever since. Today, we offer the world's largest portfolio of building products, technologies, software, and services; we put that portfolio to work to transform the environments where people live, work, learn and play. This is where Johnson Controls comes in, helping drive the outcomes that matter most. Through a full range of systems and digital solutions, we make your buildings smarter. A smarter building is safer, more comfortable, more efficient, and, ultimately, more sustainable. Most important, smarter buildings let you focus more intensely on your unique mission. Better for your people. Better for your bottom line. Better for the planet. We're helping to create a healthy planet with solutions that decrease energy use, reduce waste and make carbon neutrality a reality. Sustainability is a top priority for our company: we have committed to investing 75 percent of new product development R&D in climate-related innovation to develop sustainable products and services, and achieving net zero carbon emissions before 2040 is just one of our commitments to making the world a better place. Please visit and follow Johnson Controls on LinkedIn for recent exciting activities.

Why JCI: https://www.youtube.com/watch?v=nrbigjbpxkg
Asia-Pacific LinkedIn: https://www.linkedin.com/showcase/johnson-controls-asia-pacific/posts/?feedView=all
Career: The Power Behind Your Mission
OpenBlue: This is How a Space Comes Alive

How will you do it?
- Solution Architecture Design: Design scalable and efficient data architectures using Snowflake that meet business needs and best practices.
- Implementation: Lead the deployment of Snowflake solutions, including data ingestion, transformation, and visualization processes.
- Data Governance & Security: Ensure compliance with global data regulations in accordance with the data strategy and cybersecurity initiatives.
- Collaboration: Work closely with data engineers, data scientists, and business stakeholders to gather requirements and provide technical guidance.
- Optimization: Monitor and optimize the performance, storage, and cost of Snowflake environments, implementing best practices for data modeling and querying.
- Integration: Integrate Snowflake with other cloud services and tools (e.g., ETL/ELT tools, BI tools, data lakes) to create seamless data workflows.
- Documentation: Create and maintain documentation for architecture designs, data models, and operational procedures.
- Training and Support: Provide training and support to teams on Snowflake usage and best practices.
- Troubleshooting: Identify and resolve issues related to Snowflake performance, security, and data integrity.
- Stay Updated: Keep abreast of Snowflake updates, new features, and industry trends to continually enhance solutions and methodologies.
- Assist Data Architects in implementing Snowflake-based data warehouse solutions to support advanced analytics and reporting use cases.

What we look for?
Minimum qualifications:
- Bachelor's, Postgraduate, or Master's degree in any stream
- Minimum 5 years of relevant experience as a Solutions Architect, Data Architect, or in a similar role
- Knowledge of the Snowflake data warehouse and understanding of data warehousing concepts, including ELT/ETL processes and data modelling
- Understanding of cloud platforms (AWS, Azure, GCP) and their integration with Snowflake
- Competency in data preparation and/or ETL tools to build and maintain data pipelines and flows
- Strong knowledge of databases, stored procedures (SPs), and optimization of large data sets
- SQL and Power BI/Tableau are mandatory, along with knowledge of a data integration tool
- Excellent communication and collaboration skills
- Strong problem-solving abilities and an analytical mindset
- Ability to work in a fast-paced, dynamic environment

What We Offer: We offer an exciting and challenging position. Joining us, you will become part of a leading global multi-industrial corporation defined by its stimulating work environment and job satisfaction. In addition, we offer outstanding career development opportunities which will stretch your abilities and channel your talents.

Diversity & Inclusion: Our dedication to diversity and inclusion starts with our values. We lead with integrity and purpose, focusing on the future and aligning with our customers' vision for success. Our high-performance culture ensures that we have the best talent, highly engaged and eager to innovate. Our D&I mission elevates each employee's responsibility to contribute to our culture. It's through these contributions that we'll drive the mindsets and behaviors we need to power our customers' missions. You have the power. You have the voice. You have the culture in your hands.
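To make the ELT responsibilities above concrete, here is a minimal sketch using the Snowflake Python connector to land staged files and merge them into a modeled table. The connection details, stage, and table names are placeholders, not Johnson Controls' actual environment.

```python
# Illustrative only: account, warehouse, stage, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # assumption
    user="etl_service_user",   # assumption
    password="********",       # in practice, use a secrets manager
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load raw files from an external stage, then merge into the modeled table --
    # a typical ELT pattern (transform inside Snowflake after loading).
    cur.execute("""
        COPY INTO STAGING.RAW_BUILDING_EVENTS
        FROM @BUILDING_EVENTS_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    cur.execute("""
        MERGE INTO ANALYTICS.MARTS.BUILDING_EVENTS AS tgt
        USING STAGING.RAW_BUILDING_EVENTS AS src
          ON tgt.event_id = src.event_id
        WHEN MATCHED THEN UPDATE SET status = src.status
        WHEN NOT MATCHED THEN INSERT (event_id, status) VALUES (src.event_id, src.status)
    """)
finally:
    conn.close()
```

A production version would schedule this through an orchestration or ETL/ELT tool and source credentials securely rather than hard-coding them.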

Posted 1 day ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description: We are seeking a highly analytical and detail-oriented data analytics expert to join our team in Noida. The ideal candidate will have strong experience working with PostgreSQL, writing complex SQL queries and stored procedures, and creating impactful dashboards and reports using Power BI.

Key Responsibilities:
- Design, write, and optimize complex SQL queries, functions, and procedures using PostgreSQL.
- Analyze large datasets to extract insights and support business decision-making.
- Develop, maintain, and publish dynamic and interactive Power BI dashboards and reports.
- Collaborate with business and technical teams to understand data requirements and deliver analytics solutions.
- Ensure data accuracy, consistency, and performance optimization of analytical queries.
- Create documentation for data models, processes, and reports.

Skills:
- Strong hands-on experience with PostgreSQL (advanced querying, indexing, procedures, performance tuning)
- Proficiency in writing complex SQL for large and relational datasets
- Expertise in Power BI (data modeling, DAX, visualization best practices)
- Ability to translate business needs into data insights
- Good understanding of ETL processes and data pipelines (preferred but not mandatory)
- Experience working in Agile/Scrum teams is a plus

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Strong problem-solving and communication skills
- Experience integrating data from multiple sources

(ref:hirist.tech)
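For context, here is a minimal sketch of the kind of analytical PostgreSQL query this role describes, run from Python via psycopg2. The connection details, table, and column names (monthly_branch_revenue, branch_id, revenue) are hypothetical.

```python
# Minimal sketch; connection details, table, and column names are assumptions.
import psycopg2

conn = psycopg2.connect(
    host="localhost", dbname="analytics", user="report_user", password="********"
)

# A typical analytical query: rank each branch's monthly revenue with a window
# function, then keep only the top three months per branch.
SQL = """
    SELECT branch_id,
           report_month,
           revenue,
           RANK() OVER (PARTITION BY branch_id ORDER BY revenue DESC) AS revenue_rank
    FROM   monthly_branch_revenue
"""

with conn, conn.cursor() as cur:
    cur.execute(SQL)
    for branch_id, report_month, revenue, revenue_rank in cur.fetchall():
        if revenue_rank <= 3:
            print(branch_id, report_month, revenue)

conn.close()
```

The same query could equally be wrapped in a PostgreSQL view or function and surfaced in a Power BI report.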

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Applications Development Technology Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities.

Responsibilities:
- Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and identify and define the system enhancements necessary to deploy new products and process improvements.
- Resolve a variety of high-impact problems and projects through in-depth evaluation of complex business processes, system processes, and industry standards.
- Provide expertise in your area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint.
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation.
- Develop comprehensive knowledge of how areas of the business, such as architecture and infrastructure, integrate to accomplish business goals.
- Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions.
- Serve as an advisor or coach to mid-level developers and analysts, allocating work as necessary.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
- Build and maintain ETL pipelines using Oracle DWH, SQL/PL-SQL, and Big Data (HDFS, Spark) technologies.
- Design robust data models (star/snowflake) and ensure high-performance data architecture.
- Perform in-depth data analysis, profiling, and quality checks to support business needs.
- Collaborate with BI teams to deliver actionable insights via Tableau dashboards.
- Optimize data workflows, ensure scalability, and support cross-functional data initiatives.

Qualifications:
- 6-10 years of relevant experience in an applications development or systems analysis role
- Extensive experience in systems analysis and programming of software applications
- Experience managing and implementing successful projects
- Subject matter expert (SME) in at least one area of applications development
- Ability to adjust priorities quickly as circumstances dictate
- Demonstrated leadership and project management skills
- Consistently demonstrates clear and concise written and verbal communication

Education:
- Bachelor's degree/University degree or equivalent experience
- Master's degree preferred

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
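To illustrate the ETL and star-schema work listed in the responsibilities above (Oracle DWH staging combined with Spark), here is a hedged PySpark sketch. The JDBC URL, credentials, and table names are placeholders rather than actual systems.

```python
# Sketch under assumptions: JDBC URL, credentials, and table names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dwh_fact_build").getOrCreate()

jdbc_opts = {
    "url": "jdbc:oracle:thin:@//dwh-host:1521/ORCLPDB",  # assumption
    "user": "etl_user",
    "password": "********",
    "driver": "oracle.jdbc.OracleDriver",
}

# Pull a staged transaction table and a customer dimension from the Oracle DWH.
txns = spark.read.format("jdbc").options(dbtable="STG.TRANSACTIONS", **jdbc_opts).load()
dim_customer = spark.read.format("jdbc").options(dbtable="DWH.DIM_CUSTOMER", **jdbc_opts).load()

# Conform the fact to the star schema by resolving the customer surrogate key.
fact_txn = (
    txns.join(dim_customer, txns.CUSTOMER_ID == dim_customer.CUSTOMER_ID, "left")
        .select(txns["*"], dim_customer["CUSTOMER_SK"])
)

# Persist to the warehouse layer consumed by Tableau dashboards.
fact_txn.write.mode("overwrite").parquet("hdfs:///dwh/fact_transactions/")
```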

Posted 1 day ago

Apply

3.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Job Description: Data Analyst (BI Developer)

As a Data Analyst with an analytics engineering focus, you will be the bridge between our raw data and our business stakeholders. You won't just build dashboards; you will own the entire analytics workflow, from modeling and transformation to visualization and deep-dive analysis. Using your expertise in SQL, Python, and modern BI tools, you will be responsible for creating and maintaining the trusted datasets that the entire company will rely on for decision-making. You will work closely with our Senior Data Engineer to leverage the data platform, ensuring that the data models you build are robust, reliable, and directly answer the most critical business questions.

Key Responsibilities:
- Data Modeling & Transformation: Use dbt to build, maintain, and document robust, reusable data models. You will own the "T" (Transform) in our ELT pipeline, turning raw data from our data lake into clean, trusted, and analysis-ready datasets.
- Business Intelligence & Dashboarding: Develop, deploy, and maintain insightful and intuitive dashboards using BI tools like Power BI, Tableau, or Metabase. You will be responsible for creating a single source of truth for key business metrics.
- Deep-Dive Analysis: Go beyond dashboards to answer complex business questions. Use your analytical skills in SQL and Python to conduct exploratory analysis, identify trends, and provide actionable recommendations to product, marketing, and leadership teams.
- Stakeholder Collaboration: Partner with business stakeholders to gather requirements, define key performance indicators (KPIs), and ensure your analytical outputs are aligned with their strategic goals.
- Data Quality & Documentation: Work with the Data Engineering team to define data quality tests within the transformation layer. Meticulously document your data models and metrics to foster a culture of data literacy.

Required Skills & Experience (Must-Haves):
- 3+ years of experience in a data analyst, business intelligence, or analytics engineering role.
- Expert-level proficiency in SQL is absolutely essential; you should be comfortable with complex joins, window functions, and query optimization.
- Proven experience with a modern BI platform like Power BI, Tableau, Looker, or Metabase, from data source connection to final dashboard design.
- Hands-on experience with dbt for data modeling and transformation; you should understand dbt's core concepts and workflows.
- Proficiency in Python for data analysis and automation, specifically with libraries like Pandas, NumPy, and Matplotlib.
- Strong analytical and problem-solving skills, with a demonstrated ability to translate business questions into analytical work and analytical work into business insights.
- Excellent communication skills, with the ability to present complex data stories to a non-technical audience.

Preferred Skills & Experience (Nice-to-Haves):
- Experience querying data in a cloud data warehouse or serverless query engine (e.g., AWS Athena, Google BigQuery, Azure Synapse, Snowflake).
- Familiarity with version control using Git.
- Experience working directly with data from NoSQL databases like MongoDB.
- A solid understanding of data engineering concepts (e.g., data lakes, ETL vs. ELT, data ingestion).
- Experience conducting statistical analysis (e.g., A/B testing, regression analysis).

(ref:hirist.tech)
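As a small, hedged example of the Python analysis side of this role, the sketch below computes two illustrative KPIs with pandas from an exported model output. The file name, columns, and KPI definitions are assumptions, not this team's actual metrics.

```python
# Hedged example: the CSV extract, column names, and KPI definitions are assumptions.
import pandas as pd

# Load a modeled dataset (e.g., the output of a dbt model exported for analysis).
orders = pd.read_csv("orders_model.csv", parse_dates=["order_date"])

# Define the KPIs with stakeholders first, then compute them consistently:
# monthly active customers and average order value.
monthly = (
    orders.assign(order_month=orders["order_date"].dt.to_period("M"))
          .groupby("order_month")
          .agg(active_customers=("customer_id", "nunique"),
               avg_order_value=("order_amount", "mean"))
          .reset_index()
)

print(monthly.tail())
```

In practice the heavy transformation would live in dbt models, with Python reserved for exploratory analysis and visualization on top of those trusted datasets.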

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

As an experienced IT professional with over 5 years of experience, you should have a good understanding of analytics tools to effectively analyze data. Your previous roles may have involved working in production deployment and production support teams. You must be familiar with Big Data tools such as Hadoop, Spark, Apache Beam, and Kafka, and your expertise should include object-oriented and functional scripting languages like Python, Java, C++, and Scala. Experience with data warehousing tools like BigQuery, Redshift, Synapse, or Snowflake is essential. You should also be well versed in ETL processes and have a strong understanding of relational and non-relational databases, including MySQL, MS SQL Server, Postgres, MongoDB, and Cassandra. Familiarity with cloud platforms like AWS, GCP, and Azure is also required, along with experience in workflow management using tools like Apache Airflow.

In this role, you will be expected to develop high-performance and scalable solutions on GCP for extracting, transforming, and loading big data. You will design and build production-grade data solutions, from ingestion to consumption, using Java or Python, and optimize data models on GCP with data stores such as BigQuery. You will also handle the deployment process, optimize data pipelines for performance and cost in large-scale data lakes, and write complex queries across large data sets. Collaboration with Data Engineers to identify the right tools for delivering product features is essential, as is researching new use cases for existing data.

Preferred qualifications include awareness of design best practices for OLTP and OLAP systems, participation in teams designing databases and pipelines, exposure to load testing methodologies, debugging pipelines, and handling delta loads in heterogeneous migration projects. Overall, you should be a collaborative team player who interacts effectively with business stakeholders, BAs, and other Data/ML engineers to drive innovation and deliver impactful solutions.
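For illustration, here is a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client, of the kind this role would involve. The project, dataset, table, and column names are placeholders.

```python
# Illustrative sketch: project, dataset, table, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # assumption

# A partition-pruned aggregate over a large event table: filtering on the
# partition column keeps the scan (and therefore the cost) predictable.
sql = """
    SELECT event_date,
           COUNT(*) AS events,
           COUNT(DISTINCT user_id) AS users
    FROM   `my-analytics-project.raw_layer.events`
    WHERE  event_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND CURRENT_DATE()
    GROUP  BY event_date
    ORDER  BY event_date
"""

for row in client.query(sql).result():
    print(row["event_date"], row["events"], row["users"])
```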

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote

About us: The Credit Risk Technology Team is responsible for delivering Counterparty Credit Risk Management software solutions to Citi's Risk organization and to the RWA integrity team for regulatory reporting, which manages Citi's exposure to the financial institutions, governments, and corporates that trade with Citi. The team builds and maintains software used to compute metrics that help mitigate Citi's exposure to counterparty default. These include computation of collateral allocation for portfolios, haircuts for security and cash collateral, collateral concentration levels, wrong-way risk, pre-settlement exposure, exposure at default, and risk-weighted assets, among others.

Technical Requirements:
- Object-oriented design skills and SOLID principles
- Solid knowledge of Core Java and J2EE
- Passion for technology and a self-starter attitude
- Orientation towards disciplined development processes
- Core Java: threading, collections, synchronization, locking, annotations, generics
- Java frameworks such as Spring Core, Spring Batch, Hibernate, web services, and microservices
- Able to write SQL queries and PL/SQL to analyze data
- Good knowledge of design patterns
- UML modeling diagrams
- Application server experience
- Build scripts such as Ant and Maven
- Experience using Eclipse as a development environment
- ETL, ELT, and data warehousing concepts

Experience: 8-12 years
Domain Experience: Banking & Finance (preferred)

Personal Skills: The successful candidate must:
- Work to agreed deadlines as part of the remote development environment
- Manage and deliver with continuously changing requirements
- Have experience working cooperatively in small to medium-sized teams
- Be proactive and self-motivated
- Be passionate about the Java/J2EE technology environment
- Be a good problem solver
- Be able to understand human issues and sentiments and channel them towards better delivery
- Demonstrate good design and coding discipline
- Demonstrate teamwork
- Have good written and verbal communication skills
- Be able to mentor junior team members
- Be able to troubleshoot conflicts and people issues

The candidate will be expected to present documentation as proof of meeting these requirements.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 1 day ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
