0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Work Level: Individual | Core: Responsible | Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: MySQL, SQL Writing, PL/SQL
Education: Graduate
Note: This is a requirement for one of Workassist's Hiring Partners.

🎯 Role Overview
This is a remote position.
- Write, optimize, and maintain SQL queries, stored procedures, and functions.
- Assist in designing and managing relational databases.
- Perform data extraction, transformation, and loading (ETL) tasks.
- Ensure database integrity, security, and performance.
- Work with developers to integrate databases into applications.
- Support data analysis and reporting by writing complex queries.
- Document database structures, processes, and best practices.

Company Description
Workassist is an online recruitment and employment solutions platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of more than 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: there are many more opportunities on the portal besides this one; depending on your skills, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
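For illustration, the stored-procedure work this role describes might look like the following minimal PL/SQL sketch; the table and column names (accounts, account_audit) are hypothetical, not taken from the posting.

```sql
-- Hypothetical sketch: a PL/SQL procedure that archives inactive accounts.
-- All table and column names are illustrative assumptions.
CREATE OR REPLACE PROCEDURE archive_inactive_accounts (
    p_cutoff_date IN DATE
) AS
BEGIN
    -- Copy rows for accounts with no activity since the cutoff date
    INSERT INTO account_audit (account_id, last_activity, archived_on)
    SELECT account_id, last_activity, SYSDATE
    FROM   accounts
    WHERE  last_activity < p_cutoff_date;

    -- Remove the archived rows from the live table
    DELETE FROM accounts
    WHERE  last_activity < p_cutoff_date;

    COMMIT;
END archive_inactive_accounts;
/
```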
Posted 1 week ago
6.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Title: Data Engineer (GCP)
Employment Type: Full-time
Work Mode: Work from Office
Experience Required: 6+ Years
Location: Ahmedabad / Gurugram
Timing: General

About the Role
We are looking for an experienced Data Engineer to design, build, and optimize data systems. The ideal candidate will have strong expertise in Python, SQL, and cloud platforms, along with a passion for solving complex data challenges.

Key Responsibilities
- Provide business analytics support to management.
- Analyze business results and design data collection studies.
- Build and maintain data pipelines and ETL processes using Python.
- Collaborate with analysts and data scientists to ensure data quality.
- Optimize database performance (indexing, partitioning, query optimization).
- Implement data governance and security measures.
- Monitor and troubleshoot data pipelines, ensuring validation and accuracy.
- Maintain documentation for workflows and processes.

Skills Required
- Proficiency in Python and SQL.
- Experience with relational databases (MySQL, PostgreSQL, SQL Server).
- Knowledge of data modeling, data warehousing, and data architecture.
- Experience with cloud platforms, with proficiency in Google Cloud Platform (BigQuery, GCS).
- Familiarity with version control (Git).

What We Offer
- Competitive salary and industry-standard benefits.
- Opportunity to earn stock options in the near future.
- Career growth in cloud technologies with certifications in GCP.
- A chance to be part of a fast-growing and innovative team.
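As a sketch of the BigQuery optimization work such a role typically involves (partitioning and clustering, as named in the responsibilities), consider the following; the dataset, table, and column names are assumptions for illustration only.

```sql
-- Hypothetical sketch: a partitioned, clustered BigQuery table for event data.
-- Dataset and column names are illustrative, not from the posting.
CREATE TABLE IF NOT EXISTS analytics.events (
  event_id   STRING,
  user_id    STRING,
  event_type STRING,
  event_ts   TIMESTAMP
)
PARTITION BY DATE(event_ts)      -- prune scans to the dates queried
CLUSTER BY user_id, event_type;  -- co-locate rows for common filters

-- A query that benefits from partition pruning and clustering:
SELECT event_type, COUNT(*) AS events
FROM   analytics.events
WHERE  DATE(event_ts) BETWEEN DATE '2024-01-01' AND DATE '2024-01-31'
  AND  user_id = 'u-123'
GROUP BY event_type;
```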
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media. Our client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job Title: Python Developer with SQL and ETL
Key Skills: Python with SQL, PySpark, Databricks, ETL
Job Locations: Hyderabad, Pune, Bengaluru
Experience: 6-8 years
Education Qualification: Any graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate

Job responsibilities:
- The candidate should have at least 4 years of experience in Python development with SQL.
- Understanding of PySpark and Databricks.
- Passionate about ETL development and problem solving.
- Quick to learn new data tools and ideas.
- Proficient in Python with SQL, PySpark, Databricks, and ETL; AWS knowledge would be an added advantage.
- The candidate should be well versed in data-driven ways of working.
- Knowledge of different kinds of application development and a solid data background.
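To illustrate the Databricks ETL work named in the skills list, here is a minimal Spark SQL sketch of one load step; the bronze/silver table names and columns are assumptions, not from the posting.

```sql
-- Hypothetical sketch: a Databricks (Spark SQL) ETL step materializing a
-- cleansed Delta table from a raw landing table. All names are illustrative.
CREATE TABLE IF NOT EXISTS silver.orders
USING DELTA AS
SELECT
  CAST(order_id AS BIGINT)          AS order_id,
  TRIM(customer_name)               AS customer_name,
  TO_DATE(order_date, 'yyyy-MM-dd') AS order_date,
  CAST(amount AS DECIMAL(12, 2))    AS amount
FROM bronze.orders_raw
WHERE order_id IS NOT NULL;         -- drop malformed rows during the load
```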
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
punjab
On-site
The Associate Manager - BBS Analytics will be responsible for building Tableau analytics dashboards for multiple global Internal Financial Controls metrics. You will work with teams within Bunge Business Services to enable full visibility of Bunge's Internal Financial Controls. Your primary task will be to transform business and process data into actionable insights for business disclosures, decisions, and opportunities using data engineering and visualization tools, with particular expertise in the visualization tool Tableau and in Oracle SQL.

You will be responsible for designing and delivering various reports, standard Tableau dashboards, ad hoc reports, templates, scorecards, and metrics to drive insights focused on business issues and priorities. Additionally, you will implement and automate business needs on the online business intelligence tool for real-time control effectiveness and efficiency analytics. It will be crucial for you to understand all aspects of Bunge's control metrics, especially reporting and compliance needs. You will collaborate with various stakeholders both internally and externally, with a strong emphasis on building partnerships and appropriately influencing to gain commitment. In this role, you will drive results through high standards, focus on key priorities, organization, and preparing others for change.

Your technical skills should encompass a strong working knowledge of Accounting, ESG, Procurement, Agri contracts, and SAP FICO/SD/MM, with business process knowledge of finance operations, business intelligence/reporting, and data analysis and visualization. Additionally, you should have detailed knowledge and experience in BI, reporting, analysis, data visualization, and visual storytelling. The ability to make complex data science models and statistical inferences clear and actionable will be essential. You should have an extensive understanding of controls processes, performance metrics, and governance, with significant experience driving large projects to successful completion. Being an Agile practitioner and having Design Thinking expertise will be advantageous. Strong communication and presentation skills, collaboration skills, and the integrity to hold yourself and others accountable to deliver against commitments are important attributes for this role.

You will lead client engagements and oversee work-streams related to PTP, OTC, and RTR. Additionally, you will develop solutions to customer challenges and identify gaps and areas of improvement for dashboard building. Your responsibilities will include gathering requirements from functional stakeholders, conducting UAT with business users, working with the Ops team to deploy use cases in production, and engaging with the operations team to streamline and improve the technical environment, access provisioning, and reporting processes. Managing engagement economics, project resources, team utilization, and delivering high-quality deliverables will be part of your role. You should have strong competency in Tableau, Oracle, Python, R, and MS Excel & PowerPoint, and working knowledge of other enabling tools for a business services command center. Competencies in data analytics and Big Data tools and platforms will be beneficial. Relevant experience of 4 to 8 years with a Master's in Business Analytics, Finance, ESG, or Data Science from a premier institute/university is preferred.

Bunge (NYSE: BG) is a world leader in sourcing, processing, and supplying oilseed and grain products and ingredients. Founded in 1818, Bunge's expansive network feeds and fuels a growing world, creating sustainable products and opportunities for more than 70,000 farmers and the consumers they serve across the globe. The company is headquartered in St. Louis, Missouri and has 25,000 employees worldwide who stand behind more than 350 port terminals, oilseed processing plants, grain facilities, and food and ingredient production and packaging facilities around the world.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
maharashtra
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your role will involve 4-5 years of ETL testing and data validation. You should be experienced in ETL testing, requirement analysis, test planning, data validation, and defect tracking. Proficiency in SQL is required for writing complex queries to validate data. Additionally, knowledge of ETL tools, experience with data warehousing concepts and methodologies, and strong analytical and problem-solving skills are desirable. Exposure to Agile/Scrum methodologies and experience in AWS, PySpark, Databricks, or other cloud-based test execution would be beneficial.

Your profile should include experience in ETL testing, requirement analysis, test planning, data validation, and defect tracking, along with proficiency in SQL for writing complex queries to validate data.

At Capgemini, you will have the opportunity to make a difference for the world's leading businesses or for society. You will receive the support needed to shape your career in a way that works for you. When the future doesn't look as bright as you'd like, you will have the opportunity to make a change and rewrite it. By joining Capgemini, you become part of a diverse collective of free-thinkers, entrepreneurs, and experts, all working together to unleash human energy through technology for an inclusive and sustainable future. Capgemini values its people and offers extensive Learning & Development programs for career growth. The work environment is inclusive, safe, healthy, and flexible to bring out the best in you. You can also take an active role in Corporate Social Responsibility and Sustainability initiatives to make a positive social change and build a better world.

Capgemini is a global business and technology transformation partner with over 55 years of heritage, trusted by clients to unlock the value of technology and address their business needs. The company has a diverse group of 340,000 team members in more than 50 countries. Capgemini delivers end-to-end services and solutions leveraging AI, cloud, data, and deep industry expertise to create tangible impact for enterprises and society. The Group reported 2023 global revenues of €22.5 billion.

Skills required for this role include SQL, ETL, Python, and Scala.
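As an illustration of the SQL-based data validation this role centers on, a few reconciliation queries are sketched below; the source and target table names (src.orders, dwh.orders) are assumptions, not from the posting.

```sql
-- Hypothetical sketch: ETL test queries reconciling a target table against
-- its source. All table and column names are illustrative assumptions.

-- 1. Row-count reconciliation between source and target
SELECT
  (SELECT COUNT(*) FROM src.orders) AS source_rows,
  (SELECT COUNT(*) FROM dwh.orders) AS target_rows;

-- 2. Rows present in the source but missing from the target
SELECT s.order_id
FROM   src.orders s
LEFT JOIN dwh.orders t ON t.order_id = s.order_id
WHERE  t.order_id IS NULL;

-- 3. Value-level check: amounts that changed during transformation
SELECT s.order_id, s.amount AS source_amount, t.amount AS target_amount
FROM   src.orders s
JOIN   dwh.orders t ON t.order_id = s.order_id
WHERE  s.amount <> t.amount;
```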
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The Applications Development Senior Programmer Analyst position is an intermediate-level role in which you will be responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities.

Your responsibilities will include conducting tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will also be required to monitor and control all phases of the development process, provide user and operational support on applications to business users, and recommend and develop security measures in post-implementation analysis.

As the Applications Development Senior Programmer Analyst, you will utilize in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business and system processes, recommend advanced programming solutions, and ensure that essential procedures are followed. Additionally, you will serve as an advisor or coach to new or lower-level analysts, operate with a limited level of direct supervision, and act as a subject matter expert to senior stakeholders and other team members.

To qualify for this role, you should have 8-12 years of relevant experience in systems analysis and programming of software applications, experience managing and implementing successful projects, and working knowledge of consulting/project management techniques and methods. You should also have the ability to work under pressure, manage deadlines, and adapt to unexpected changes in expectations or requirements. A Bachelor's degree or equivalent experience is required for this position.

In addition to the general job description, the ideal candidate should have 8 to 12 years of application development experience through the full lifecycle, with expertise in UI architecture patterns such as Micro Frontend and NX. Proficiency in Core Java/J2EE applications, data structures, algorithms, Hadoop, the MapReduce framework, Spark, YARN, and other relevant technologies is essential. Experience with the Big Data Spark ecosystem, ETL, BI tools, agile environments, test-driven development, and optimizing software solutions for performance and stability is also preferred.

This job description provides an overview of the responsibilities and qualifications for the Applications Development Senior Programmer Analyst role. Other job-related duties may be assigned as required.
Posted 1 week ago
5.0 years
0 Lacs
Haryana, India
On-site
Senior Data Engineer (C11) – Analytics & Information Management (AIM), Gurugram

Excited to grow your career? We value our talented employees, and whenever possible we strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position you see is right for you, we encourage you to apply! Our people make all the difference in our success.

We are seeking a highly experienced and strategic Officer – Sr. Data Engineer for the Data/Information Management Team. The ideal candidate will be responsible for the development and implementation of data analytics solutions to support key business objectives for Legal Operations as part of the COO (Chief Operating Office). This role requires a proven track record of implementing optimized data processes/platforms, delivering impactful insights, and fostering a data-driven culture.

------------------------------------------------------

The Data/Information Analyst accomplishes results by contributing significantly to the bank's success by leveraging data engineering and solution design skills within a specialized domain. Integrates subject matter and industry expertise within a defined area. Contributes to standards around which others will operate. Requires in-depth understanding of how areas collectively integrate within the sub-function as well as coordinate and contribute to the objectives of the entire function. Requires basic commercial awareness. Developed communication and diplomacy skills are required in order to guide, influence, and convince others, in particular colleagues in other areas and occasional external customers. Has responsibility for volume, quality, timeliness, and delivery of end results of an area.

Responsibilities:
- Primarily support Business Execution activities of the Chief Operating Office; implement data engineering solutions to manage banking operations. Establish monitoring routines, scorecards, and escalation workflows.
- Oversee Data Strategy, Smart Automation, Insight Generation, Data Quality, and Reporting activities using proven analytical techniques.
- Document data requirements and handle data collection, processing, and cleaning, which may include process automation/optimization and data visualization techniques.
- Enable proactive issue detection, escalation workflows, and alignment with firmwide data-related policies; implement a governance framework with clear stewardship roles and data quality controls.
- Interface between business and technology partners for digitizing data collection, including performance generation and validation rules for banking operations.
- Build the data strategy by identifying all relevant product processors; create the Data Lake, Data Pipeline, Governance & Reporting.
- Communicate findings and recommendations to senior management. Stay current with the latest trends and technologies in analytics. Ensure compliance with data governance policies and regulatory requirements.
- Set up a governance operating framework to enable operationalization of data domains; identify CDEs and Data Quality rules. Align with Citi Data Governance Policies and firmwide Chief Data Office expectations.
- Work with large and complex data sets (both internal and external) to evaluate, recommend, and support the implementation of business strategies, such as a centralized data repository with standardized definitions and scalable data pipes.
- Identify and compile data sets using a variety of tools (e.g., SQL, Access) to help predict, improve, and measure the success of key business-to-business outcomes.
- Implement rule-based Data Quality checks across critical data points; automate alerts for breaks and publish periodic quality reports.
- Develop and execute the analytics strategy: data ingestion, reporting/insights centralization; ensure consistency, lineage tracking, and audit readiness across legal reporting. Incumbents in this role may often be referred to as Data Analyst.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency, as well as effectively supervising the activity of others and creating accountability with those who fail to maintain these standards.
- Work as a senior member of a team of data engineering professionals, delivering on organizational priorities together.

Qualifications:
- 5+ years of experience in Business Transformation Solution Design roles with proficiency in tools/technologies like Python, PySpark, Tableau, MicroStrategy, SQL, etc.
- Strong understanding of data transformation: data strategy, data architecture, data tracing and lineage (the ability to trace data lineage from source systems to the data warehouse to reports and dashboards), scalable data flow design and standardization, platform integration, ETL, and smart automation.
- Conceptual, logical, and physical data modeling expertise; proficiency in relational and dimensional data modeling techniques; ability and experience in designing data warehouses, integrated data marts, and optimized reporting schemas that cater to multiple BI tools.
- Database management and optimization: expertise in database performance tuning and optimization for data enrichment and integration, reporting, and dashboarding.
- Strong understanding of data platforms and ecosystems; ability to establish a scalable data management framework covering data provisioning, process optimization, actionable insights, and visualization techniques using Tableau.
- Solution architect with a proven ability to translate complex data flows into automated and optimized solutions; ability to leverage data analytics tools and techniques for analytics problem solving for organizational needs.
- Experience developing and deploying AI solutions in partnership with Tech and Business.
- Experience with banking operations (e.g., expense analytics, movement of funds, cash flow management, fraud analytics, ROI).
- Knowledge of regulatory requirements related to data privacy and security.
- Experience interacting with senior stakeholders across the organization to manage end-to-end conceptualization and implementation of data strategies: standardizing data structures, identifying and removing redundancies to optimize data feeds.
- AI/Gen AI proficiency and thought leadership in financial/business analysis and/or credit/risk analysis, with the ability to impact key business drivers via a disciplined analytic process.
- Demonstrated analytics thought leadership and project planning capabilities.
- In-depth understanding of various financial services business models; expert knowledge of advanced statistical techniques and how to apply them to drive substantial business results.
- Creative problem-solving skills.

Education: Bachelor's/University degree in STEM; Master's degree preferred.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

------------------------------------------------------
Job Level: C11
------------------------------------------------------
Job Family Group: Decision Management
------------------------------------------------------
Job Family: Data/Information Management
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
MicroStrategy, Python (Programming Language), Structured Query Language (SQL), Tableau (Software).
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
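For illustration, the rule-based data quality checks with automated break alerts described in this posting could be sketched in SQL as below; the table, columns, and freshness threshold are assumptions, not from the posting, and the interval syntax varies slightly by database.

```sql
-- Hypothetical sketch: rule-based data quality checks over a critical table.
-- The table (ops.transactions) and thresholds are illustrative assumptions.
SELECT 'null_account_id' AS rule_name,
       COUNT(*)          AS breaks
FROM   ops.transactions
WHERE  account_id IS NULL

UNION ALL

SELECT 'negative_amount',
       COUNT(*)
FROM   ops.transactions
WHERE  amount < 0

UNION ALL

SELECT 'stale_load',          -- data older than the expected daily refresh
       COUNT(*)
FROM   ops.transactions
WHERE  load_ts < CURRENT_DATE - INTERVAL '1' DAY;
-- Rows where breaks > 0 can feed an automated alert, as the posting describes.
```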
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
The Finance Data & Insights Team is an agile product team responsible for the development, production, and transformation of financial data and reporting across Consumer and Community Banking. The vision of the team is to enhance the lives of our people and increase value to the firm by harnessing the power of data and the best tools to analyze data, generate insights, save time, improve processes and controls, and lead the organization in developing skills for the future. The overall product objectives include constructing a data environment that enables cross-business, product, and customer-centric decision making and reporting across Consumer and Community Banking in a consistent framework, creating an ecosystem of dashboards sourced from authoritative data sources to replace manual management reporting, and eliminating user tools to save time and increase the ability to generate insights through data and dashboards.

As a Data Domain Architect in the CBB Field Performance area, you will play a key role in transforming how financial, operational, and behavioral data are reported and analyzed across Consumer Banking and Business Banking within CCB. Your responsibilities will include, but are not limited to, discovering, sourcing, designing, and delivering data domains into the Databricks-powered Data Mart, enabling the Finance function to support its analytical and reporting needs. You will need to understand the needs of Finance and make data discoverable and available for analytical and reporting purposes.

Job Responsibilities:
- Conduct comprehensive data discovery, sourcing, and maintenance of financial and operational data to support field performance reporting and analytical needs.
- Develop detailed data requirement documentation aligned with overall data strategies and models.
- Collaborate with the Technology team to develop and test data wrangling workflows, ensuring validation of business logic and outcomes.
- Perform integration and regression testing of data components, ensuring compliance and control measures.
- Proactively identify and resolve issues and challenges, highlighting potential risks to leadership.
- Engage closely with end-users and IT during the UAT phase to validate that production results meet business requirements.
- Serve as a subject matter expert in relevant areas, providing support and guidance to other team members.

Required qualifications, capabilities, and skills:
- Bachelor's degree in MIS, Computer Science, Mathematics, Engineering, Statistics, or a related quantitative field.
- Over 10 years of experience in financial solutions, data engineering, data science, or business intelligence within the financial services domain.
- Proven experience building data models that accurately represent business requirements and ensure data integrity, with a strong understanding of data governance principles.
- Expertise in database queries, including SQL and NoSQL, and proficiency in ETL techniques.
- Solid understanding of data warehousing concepts, design principles, reporting development, and testing.
- Deep industry or business domain knowledge relevant to the organization; proficiency in tools such as Databricks/Snowflake, Alteryx, and Tableau/ThoughtSpot.
- Awareness of technologies and frameworks for handling large data volumes and familiarity with analytical tools.
- Ability to think beyond raw data, understand the business context, and identify business opportunities within data.
- Strong written and oral communication skills, with the ability to engage stakeholders across technology, data, and business functions.
- Capacity to solve data-related challenges, anticipate future needs, and meet tight deadlines.
- Must be able to work physically in the Bangalore office 4 days a week, with the option to work remotely from home 1 day per week.

Join us to leverage your expertise and drive data-driven insights and solutions in a dynamic and collaborative environment. If you are passionate about data architecture and eager to make a significant impact, we encourage you not to miss this opportunity.
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Are you a driven individual looking to jumpstart your career in General Management? Look no further! AMRR TechSols Pvt Ltd is seeking a talented intern with expertise in MS-Office, MS-Excel, and strong English proficiency. As a General Management intern, you will have the opportunity to work closely with our leadership team and gain invaluable hands-on experience in various aspects of business operations. Join us and unleash your potential in a dynamic and fast-paced environment!

Responsibilities:
- Assist in creating and analyzing reports using MS-Excel
- Support day-to-day administrative tasks
- Communicate effectively with team members and clients
- Coordinate meetings and take detailed meeting minutes
- Assist in developing and implementing business strategies
- Conduct research on industry trends and competitors
- Collaborate with different departments to ensure smooth business operations

If you are a proactive and ambitious individual with a passion for business management, we want to hear from you! Apply now and take the first step towards a successful career with AMRR TechSols Pvt Ltd.

About Company: AMRR TechSols Pvt Ltd is a Bengaluru-based technology company founded in June 2022, specializing in delivering scalable, ready-to-integrate development teams for startups, growing businesses, and enterprises. The company offers customized solutions across a wide range of domains, including web and mobile development, AI and MLOps, cloud and DevOps, and ETL and data science. With expertise in MEAN, MERN, FastAPI, React, and Flutter, AMRR builds robust and scalable applications. Their AI and MLOps capabilities span PyTorch, TensorFlow, Scikit-Learn, MLflow, and Kubernetes, enabling advanced machine learning implementations. Leveraging tools like AWS, Azure, GCP, Docker, Kubernetes, and GitHub Actions, the company ensures streamlined deployments and enhanced scalability for its clients.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About The Advanced Analytics Team
The central Advanced Analytics team at the Abbott Established Pharma Division's (EPD) headquarters in Basel helps define and lead the transformation towards becoming a global, data-driven company with the help of data and advanced technologies (e.g., Machine Learning, Deep Learning, Generative AI, Computer Vision). To us, Advanced Analytics is an important lever to reach our business targets, now and in the future; it helps us differentiate ourselves from our competition and ensure sustainable revenue growth at optimal margins. Hence the central AA team is an integral part of the Strategy Management Office at EPD, with a very close link and regular interactions with the EPD Senior Leadership Team.

Primary Job Function
With the above requirements in mind, EPD is looking to fill the role of a Cloud Engineer reporting to the Head of AA Product Development. The Cloud Engineer will be responsible for developing applications leveraging AWS services. This role involves leading cloud initiatives, ensuring robust cloud infrastructure, and driving innovation in cloud technologies to support the business's advanced analytics needs.

Core Job Responsibilities
- Support the development and maintenance of company-wide frameworks and libraries that enable faster, better, and more informed decision-making within the business, creating significant business value from data & analytics.
- Ensure data availability and accessibility for the prioritized Advanced Analytics scope, and maintain stable, scalable, and modular data science pipelines from data exploration to deployment.
- Acquire, ingest, and process data from multiple sources and systems into our cloud platform (AWS), ensuring data integrity and security.
- Collaborate with data scientists to map data fields to hypotheses, and curate, wrangle, and prepare data for advanced analytical models.
- Implement and manage robust security measures to ensure compliant handling and management of data, including access strategies aligned with Information Security, Cyber Security, and Data Privacy principles.
- Develop and deploy smart automation tools based on cloud technologies, aligned with business priorities and needs.
- Oversee the timely delivery of Advanced Analytics solutions in coordination with the rest of the team and per requirements and timelines, ensuring alignment with business goals.
- Collaborate closely with the Data Science team and AI Engineers to understand platform needs and lead the development of solutions that support their work.
- Troubleshoot and resolve issues related to the AWS platform, ensuring minimal downtime and optimal performance.
- Define and document best practices and strategies regarding application deployment and infrastructure maintenance.
- Drive continuous improvement of the AWS Cloud platform by contributing and implementing new ideas and processes.

Supervisory/Management Responsibilities
Direct Reports: None. Indirect Reports: None.

Position Accountability/Scope
The Cloud Engineer is accountable for delivering targeted business impact per initiative in collaboration with key stakeholders. This role involves significant responsibility for the architecture and management of Abbott's strategic cloud platforms and AI/AA programs, enabling faster, better, and more informed decision-making within the business.

Minimum Education
Master's degree in a relevant field (e.g., computer science, electrical engineering)

Minimum Experience/Training Required
- At least 3-5 years of relevant experience, with a strong track record of building solutions and applications using AWS services.
- Proven ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets.
- Proficiency in multiple programming languages: JavaScript, Python, Scala, PySpark, or Java.
- Extensive knowledge of and experience with various database technologies, including distributed processing frameworks, relational databases, MPP databases, and NoSQL data stores.
- Deep understanding of Information Security principles to ensure compliant handling and management of data.
- Significant experience with cloud platforms, preferably AWS and its ecosystem.
- Advanced knowledge of development in CI/CD (Continuous Integration and Continuous Delivery) environments.
- Strong background in data warehousing and ETL tools.
- Proficiency in DevOps practices and tools such as Jenkins, Terraform, etc.
- Proficiency in serverless architecture and services like AWS Lambda.
- Understanding of security best practices and their implementation in cloud environments.
- Ability to understand business objectives and create cloud-based solutions to meet them.
- Result-driven, analytical, and creative thinker.
- Proven ability to work with cross-functional teams and bridge the gap between business and data science.
- Fluency in English is a must; additional languages are a plus.

Additional Technical Skills
- Experience with front-end frameworks, preferably React JS.
- Knowledge of back-end frameworks like Django, Flask, or Node.js.
- Familiarity with database technologies such as Redshift, MySQL, or DynamoDB.
- Understanding of RESTful API design and development.
- Experience with version control systems like CodeCommit.
Posted 1 week ago
5.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who we are
Johnson Controls is the global leader for smart, healthy and sustainable buildings. At Johnson Controls, we've been making buildings smarter since 1885, and our capabilities, depth of innovation experience, and global reach have been growing ever since. Today, we offer the world's largest portfolio of building products, technologies, software, and services; we put that portfolio to work to transform the environments where people live, work, learn and play. This is where Johnson Controls comes in, helping drive the outcomes that matter most. Through a full range of systems and digital solutions, we make your buildings smarter. A smarter building is safer, more comfortable, more efficient, and, ultimately, more sustainable. Most important, smarter buildings let you focus more intensely on your unique mission. Better for your people. Better for your bottom line. Better for the planet.

We're helping to create a healthy planet with solutions that decrease energy use, reduce waste and make carbon neutrality a reality. Sustainability is a top priority for our company. We are committed to investing 75 percent of new product development R&D in climate-related innovation to develop sustainable products and services. We take sustainability seriously. Achieving net zero carbon emissions before 2040 is just one of our commitments to making the world a better place.

Please visit and follow Johnson Controls LinkedIn for recent exciting activities.
Why JCI: https://www.youtube.com/watch?v=nrbigjbpxkg
Asia-Pacific LinkedIn: https://www.linkedin.com/showcase/johnson-controls-asia-pacific/posts/?feedView=all
Career: The Power Behind Your Mission
OpenBlue: This is How a Space Comes Alive

How will you do it?
- Solution Architecture Design: Design scalable and efficient data architectures using Snowflake that meet business needs and best practices.
- Implementation: Lead the deployment of Snowflake solutions, including data ingestion, transformation, and visualization processes.
- Data Governance & Security: Ensure compliance with global data regulations in accordance with the data strategy and cybersecurity initiatives.
- Collaboration: Work closely with data engineers, data scientists, and business stakeholders to gather requirements and provide technical guidance.
- Optimization: Monitor and optimize performance, storage, and cost of Snowflake environments, implementing best practices for data modeling and querying.
- Integration: Integrate Snowflake with other cloud services and tools (e.g., ETL/ELT tools, BI tools, data lakes) to create seamless data workflows.
- Documentation: Create and maintain documentation for architecture designs, data models, and operational procedures.
- Training and Support: Provide training and support to teams on Snowflake usage and best practices.
- Troubleshooting: Identify and resolve issues related to Snowflake performance, security, and data integrity.
- Stay Updated: Keep abreast of Snowflake updates, new features, and industry trends to continually enhance solutions and methodologies.
- Assist Data Architects in implementing Snowflake-based data warehouse solutions to support advanced analytics and reporting use cases.

What we look for
- Minimum: Bachelor's/Postgraduate/Master's degree in any stream.
- Minimum 5 years of relevant experience as a Solutions Architect, Data Architect, or in a similar role.
- Knowledge of the Snowflake data warehouse and an understanding of data warehousing concepts, including ELT/ETL processes and data modeling.
- Understanding of cloud platforms (AWS, Azure, GCP) and their integration with Snowflake.
- Competency in data preparation and/or ETL tools to build and maintain data pipelines and flows.
- Strong knowledge of databases, stored procedures (SPs), and optimization of large data sets.
- SQL and Power BI/Tableau are mandatory, along with knowledge of a data integration tool.
- Excellent communication and collaboration skills.
- Strong problem-solving abilities and an analytical mindset.
- Ability to work in a fast-paced, dynamic environment.

What We Offer
We offer an exciting and challenging position. Joining us, you will become part of a leading global multi-industrial corporation defined by its stimulating work environment and job satisfaction. In addition, we offer outstanding career development opportunities which will stretch your abilities and channel your talents.

Diversity & Inclusion
Our dedication to diversity and inclusion starts with our values. We lead with integrity and purpose, focusing on the future and aligning with our customers' vision for success. Our High-Performance Culture ensures that we have the best talent that is highly engaged and eager to innovate. Our D&I mission elevates each employee's responsibility to contribute to our culture. It's through these contributions that we'll drive the mindsets and behaviors we need to power our customers' missions. You have the power. You have the voice. You have the culture in your hands.
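As a rough illustration of the Snowflake ELT work this role covers, consider the sketch below; the warehouse, stage, and table names are assumptions, not from the posting.

```sql
-- Hypothetical sketch: a small Snowflake ELT flow. Warehouse, stage, and
-- table names are illustrative assumptions.
CREATE WAREHOUSE IF NOT EXISTS analytics_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND   = 60          -- suspend after 60 s idle to control cost
  AUTO_RESUME    = TRUE;

-- Load raw staged files, then transform inside Snowflake (ELT)
COPY INTO raw.sales_landing
FROM @raw.sales_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

CREATE OR REPLACE TABLE core.daily_sales AS
SELECT sale_date,
       region,
       SUM(amount) AS total_amount
FROM   raw.sales_landing
GROUP BY sale_date, region;
```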
Posted 1 week ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
We are seeking a highly analytical and detail-oriented data analytics expert to join our team in Noida. The ideal candidate will have strong experience working with PostgreSQL, writing complex SQL queries and stored procedures, and creating impactful dashboards and reports using Power BI.

Key Responsibilities
- Design, write, and optimize complex SQL queries, functions, and procedures using PostgreSQL.
- Analyze large datasets to extract insights and support business decision-making.
- Develop, maintain, and publish dynamic and interactive Power BI dashboards and reports.
- Collaborate with business and technical teams to understand data requirements and deliver analytics solutions.
- Ensure data accuracy, consistency, and performance optimization of analytical queries.
- Create documentation for data models, processes, and reports.

Skills:
- Strong hands-on experience with PostgreSQL (advanced querying, indexing, procedures, performance tuning).
- Proficiency in writing complex SQL for large, relational datasets.
- Expertise in Power BI (data modeling, DAX, visualization best practices).
- Ability to translate business needs into data insights.
- Good understanding of ETL processes and data pipelines (preferred but not mandatory).
- Experience working in Agile/Scrum teams is a plus.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Strong problem-solving and communication skills.
- Experience integrating data from multiple sources.
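To give a flavor of the PostgreSQL work described, here is a minimal sketch of a set-returning function plus a supporting index; all table and column names (orders, order_date, amount) are hypothetical.

```sql
-- Hypothetical sketch: a PostgreSQL function of the kind this role describes.
-- Table and column names are illustrative assumptions.
CREATE OR REPLACE FUNCTION monthly_order_totals(p_month DATE)
RETURNS TABLE (customer_id BIGINT, total NUMERIC) AS $$
    SELECT o.customer_id,
           SUM(o.amount)
    FROM   orders AS o
    -- Sargable date range, so the index below can be used
    WHERE  o.order_date >= date_trunc('month', p_month)
      AND  o.order_date <  date_trunc('month', p_month) + INTERVAL '1 month'
    GROUP BY o.customer_id
    ORDER BY 2 DESC;
$$ LANGUAGE sql STABLE;

-- Supporting index for the range predicate (performance tuning):
CREATE INDEX IF NOT EXISTS idx_orders_order_date ON orders (order_date);

-- Example usage:
-- SELECT * FROM monthly_order_totals(DATE '2024-01-01');
```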
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Technology Lead Analyst is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities.

Responsibilities:
- Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and identify and define necessary system enhancements to deploy new products and process improvements.
- Resolve a variety of high-impact problems and projects through in-depth evaluation of complex business processes, system processes, and industry standards.
- Provide expertise in your area and advanced knowledge of applications programming, and ensure application design adheres to the overall architecture blueprint.
- Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation.
- Develop comprehensive knowledge of how areas of business, such as architecture and infrastructure, integrate to accomplish business goals.
- Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions.
- Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
- Build and maintain ETL pipelines using Oracle DWH, SQL/PLSQL, and Big Data (HDFS, Spark) technologies.
- Design robust data models (star/snowflake) and ensure high-performance data architecture.
- Perform in-depth data analysis, profiling, and quality checks to support business needs.
- Collaborate with BI teams to deliver actionable insights via Tableau dashboards.
- Optimize data workflows, ensure scalability, and support cross-functional data initiatives.

Qualifications:
- 6-10 years of relevant experience in an apps development or systems analysis role.
- Extensive experience in systems analysis and programming of software applications.
- Experience managing and implementing successful projects.
- Subject Matter Expert (SME) in at least one area of applications development.
- Ability to adjust priorities quickly as circumstances dictate.
- Demonstrated leadership and project management skills.
- Consistently demonstrates clear and concise written and verbal communication.

Education:
Bachelor's degree/University degree or equivalent experience; Master's degree preferred.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
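The star/snowflake modeling responsibility above might, as a minimal hedged sketch in Oracle-style DDL, look like the following; the fact and dimension names are illustrative assumptions, not Citi's actual schema.

```sql
-- Hypothetical sketch: a minimal star schema (one fact, two dimensions).
-- All table and column names are illustrative assumptions.
CREATE TABLE dim_customer (
    customer_key  NUMBER PRIMARY KEY,
    customer_name VARCHAR2(200),
    segment       VARCHAR2(50)
);

CREATE TABLE dim_date (
    date_key   NUMBER PRIMARY KEY,   -- e.g. 20240131
    full_date  DATE,
    fiscal_qtr VARCHAR2(6)
);

CREATE TABLE fact_trades (
    trade_id     NUMBER PRIMARY KEY,
    customer_key NUMBER REFERENCES dim_customer (customer_key),
    date_key     NUMBER REFERENCES dim_date (date_key),
    notional     NUMBER(18, 2)       -- additive measure
);
```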
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
3.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Job Description: Data Analyst (BI Developer)

As a Data Analyst with an analytics engineering focus, you will be the bridge between our raw data and our business stakeholders. You won't just build dashboards; you will own the entire analytics workflow, from modeling and transformation to visualization and deep-dive analysis. Using your expertise in SQL, Python, and modern BI tools, you will be responsible for creating and maintaining the trusted datasets that the entire company will rely on for decision-making. You will work closely with our Senior Data Engineer to leverage the data platform, ensuring that the data models you build are robust, reliable, and directly answer the most critical business questions.

Key Responsibilities
- Data Modeling & Transformation: Use dbt to build, maintain, and document robust, reusable data models. You will own the "T" (Transform) in our ELT pipeline, turning raw data from our data lake into clean, trusted, and analysis-ready datasets.
- Business Intelligence & Dashboarding: Develop, deploy, and maintain insightful and intuitive dashboards using BI tools like Power BI, Tableau, or Metabase. You will be responsible for creating a single source of truth for key business metrics.
- Deep-Dive Analysis: Go beyond dashboards to answer complex business questions. Use your analytical skills in SQL and Python to conduct exploratory analysis, identify trends, and provide actionable recommendations to product, marketing, and leadership teams.
- Stakeholder Collaboration: Partner with business stakeholders to gather requirements, define key performance indicators (KPIs), and ensure your analytical outputs are aligned with their strategic goals.
- Data Quality & Documentation: Work with the Data Engineering team to define data quality tests within the transformation layer. Meticulously document your data models and metrics to foster a culture of data literacy.

Required Skills & Experience (Must-Haves)
- 3+ years of experience in a data analyst, business intelligence, or analytics engineering role.
- Expert-level proficiency in SQL is absolutely essential. You should be comfortable with complex joins, window functions, and query optimization.
- Proven experience with a modern BI platform like Power BI, Tableau, Looker, or Metabase, from data source connection to final dashboard design.
- Hands-on experience with dbt for data modeling and transformation. You should understand dbt's core concepts and workflows.
- Proficiency in Python for data analysis and automation, specifically with libraries like Pandas, NumPy, and Matplotlib.
- Strong analytical and problem-solving skills, with a demonstrated ability to translate business questions into analytical work and analytical work into business insights.
- Excellent communication skills, with the ability to present complex data stories to a non-technical audience.

Preferred Skills & Experience (Nice-to-Haves)
- Experience querying data in a cloud data warehouse or serverless query engine (e.g., AWS Athena, Google BigQuery, Azure Synapse, Snowflake).
- Familiarity with version control using Git.
- Experience working directly with data from NoSQL databases like MongoDB.
- A solid understanding of data engineering concepts (e.g., data lakes, ETL vs. ELT, data ingestion).
- Experience conducting statistical analysis (e.g., A/B testing, regression analysis).
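Since the posting explicitly names window functions among the must-have SQL skills, a small sketch of that kind of query follows; the events table and its columns are illustrative assumptions.

```sql
-- Hypothetical sketch: window functions of the kind this role expects.
-- The events table and its columns are illustrative assumptions.
SELECT
    user_id,
    event_date,
    revenue,
    SUM(revenue) OVER (
        PARTITION BY user_id          -- running total per user
        ORDER BY event_date
    )                                   AS running_revenue,
    RANK() OVER (ORDER BY revenue DESC) AS revenue_rank
FROM events
WHERE event_date >= DATE '2024-01-01';
```

In a dbt workflow like the one described, a query of this shape would typically live in a model file and be materialized as a view or table by `dbt run`.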
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Navi Mumbai, Maharashtra
On-site
As an experienced IT professional with over 5 years in the field, you should have a good understanding of analytics tools to analyze data effectively. Your previous roles may have involved working in production deployment and production support teams. You must be familiar with Big Data tools such as Hadoop, Spark, Apache Beam, and Kafka, and with object-oriented and functional programming languages like Python, Java, C++, and Scala. Experience with data warehousing tools like BigQuery, Redshift, Synapse, or Snowflake is essential. You should also be well-versed in ETL processes and have a strong understanding of relational and non-relational databases, including MySQL, MS SQL Server, Postgres, MongoDB, and Cassandra. Familiarity with cloud platforms like AWS, GCP, and Azure is also required, along with experience in workflow management using tools like Apache Airflow. In your role, you will be expected to develop high-performance, scalable solutions on GCP for extracting, transforming, and loading big data. You will design and build production-grade data solutions, from ingestion to consumption, using Java or Python, and optimize data models on GCP with data stores such as BigQuery. Furthermore, you should be capable of handling the deployment process, optimizing data pipelines for performance and cost in large-scale data lakes, and writing complex queries across large data sets. Collaboration with Data Engineers to identify the right tools for delivering product features is essential, as is researching new use cases for existing data. Preferred qualifications include awareness of design best practices for OLTP and OLAP systems, participation in designing the database and pipeline, exposure to load-testing methodologies, debugging pipelines, and handling delta loads in heterogeneous migration projects. Overall, you should be a collaborative team player who interacts effectively with business stakeholders, BAs, and other Data/ML engineers to drive innovation and deliver impactful solutions.
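As one concrete illustration of the BigQuery querying and cost-aware optimization described above, here is a minimal sketch using the google-cloud-bigquery Python client. The project, dataset, and table names are hypothetical, and credentials are assumed to come from the environment:

```python
import datetime

from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical project/table; auth comes from the environment
# (e.g. GOOGLE_APPLICATION_CREDENTIALS).
client = bigquery.Client(project="my-analytics-project")

# A parameterized aggregation over a date-partitioned table: filtering on the
# partition column limits bytes scanned, which is the usual first lever for
# performance and cost in BigQuery.
sql = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-analytics-project.raw_events.clickstream`
    WHERE event_date >= @start_date
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", datetime.date(2024, 1, 1))
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.user_id, row.events)
```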
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
Remote
About us: The Credit Risk Technology Team is responsible for delivering Counterparty Credit Risk Management software solutions to Citi’s Risk organization and the RWA integrity team for regulatory reporting, which manages Citi’s exposure to financial institutions, governments, and corporates that trade with Citi. The team builds and maintains software used to compute metrics that help mitigate Citi’s exposure to counterparty default. These include computation of Collateral Allocation for Portfolios, Haircut for Security and Cash Collateral, Collateral Concentration Levels and Wrong Way Risk, Pre-settlement Exposure, Exposure At Default, and Risk Weighted Assets, amongst others.
Technical Requirements:
Object-oriented design skills and SOLID principles
Solid knowledge of Core Java and J2EE
Passion for technology; a self-starter
Orientation towards disciplined development processes
Core Java: threading, collections, synchronization, locking, annotations, generics
Java frameworks such as Spring Core, Spring Batch, Hibernate, web services, and microservices
Able to write SQL queries and PL/SQL to analyze data
Good knowledge of design patterns
UML modeling diagrams
Application server experience
Build tools such as Ant and Maven
Experience using Eclipse as a development environment
ETL, ELT, and data warehousing concepts
Experience: 8-12 years
Domain Experience: Banking & Finance (preferred)
Personal Skills
The successful candidate must:
Work to agreed deadlines as part of the remote development environment.
Manage and deliver against continuously changing requirements.
Have experience of working co-operatively in small to medium-sized teams.
Be proactive and self-motivated.
Be passionate about the Java/J2EE technology environment.
Be a good problem solver.
Understand human issues and sentiments and channel them towards better delivery.
Demonstrate good design and coding discipline.
Work well in a team.
Have good written and verbal communication skills.
Be able to mentor junior team members.
Be able to troubleshoot conflicts and people issues.
The candidate will be expected to present documentation as proof of meeting these requirements.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
4.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
This is a full-time on-site role for a SQL Developer located in Noida. As a SQL Developer, you will be responsible for database development, ETL (Extract, Transform, Load), database design, analytical work, and data modeling on a day-to-day basis. You should possess a Bachelor's degree or equivalent in Computer Science or a related field, along with 4-10 years of industry experience. Experience working with SQL relational database management systems is essential, and SQL scripting knowledge would be a plus. Your responsibilities will include creating interfaces for upstream/downstream applications and designing, building, testing, deploying, and scheduling integration processes involving third-party systems. In this role, you will design and develop integrations using the Boomi AtomSphere integration platform, Workforce Integration Manager, or similar integration tools. Knowledge of REST APIs, the SOAP framework, XML, and web service design would be beneficial. Strong oral and written communication skills, as well as good customer-interfacing skills, are required for this position. Other responsibilities include implementing software in various environments using Professional Services concepts, following the SDLC process to provide solutions for interfaces, understanding client requirements, preparing design documents, coding, testing, and deploying interfaces, providing User Acceptance Testing support, deploying and releasing to the production environment, and handing off to global support.
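In practice these integrations are configured inside Boomi or Workforce Integration Manager rather than hand-coded, but the underlying interaction is often just XML over REST. Purely as a hypothetical illustration, a minimal Python sketch of pushing one record to a downstream system might look like this (the endpoint, payload schema, and field names are all invented):

```python
import xml.etree.ElementTree as ET

import requests  # pip install requests

# Placeholder endpoint for a hypothetical downstream system.
ENDPOINT = "https://example.com/api/v1/timecards"

def build_payload(employee_id: str, hours: float) -> bytes:
    """Serialize one record as XML (invented schema, for illustration only)."""
    root = ET.Element("timecard")
    ET.SubElement(root, "employeeId").text = employee_id
    ET.SubElement(root, "hours").text = str(hours)
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

def push_timecard(employee_id: str, hours: float) -> None:
    resp = requests.post(
        ENDPOINT,
        data=build_payload(employee_id, hours),
        headers={"Content-Type": "application/xml"},
        timeout=30,
    )
    resp.raise_for_status()  # surface HTTP errors to the scheduler/monitoring

if __name__ == "__main__":
    push_timecard("E1001", 7.5)
```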
Posted 1 week ago
7.0 years
0 Lacs
Greater Kolkata Area
On-site
Who are we looking for? We are looking for a candidate with 7+ years of administrator experience in MongoDB/Cassandra/Snowflake databases. This role is focused on production support, ensuring database performance, availability, and reliability across multiple clusters. The ideal candidate will be responsible for ensuring the availability, performance, and security of our NoSQL database environment. You will provide 24/7 production support, troubleshoot issues, monitor system health, optimize performance, and collaborate with cross-functional teams to maintain a reliable and efficient Snowflake platform.
Technical Skills
Proven experience as a MongoDB/Cassandra/Snowflake database administrator or in a similar role in production support environments.
7+ years of hands-on experience as a MongoDB DBA supporting production environments.
Strong understanding of MongoDB architecture, including replica sets, sharding, and the aggregation framework.
Proficiency in writing and optimizing complex MongoDB queries and indexes.
Experience with backup and recovery solutions (e.g., mongodump, mongorestore, Ops Manager).
Solid knowledge of Linux/Unix systems and scripting (Shell, Python, or similar).
Experience with monitoring tools like Prometheus, Grafana, DataStax OpsCenter, or similar.
Understanding of distributed systems and high-availability concepts.
Proficiency in troubleshooting cluster issues, performance tuning, and capacity planning.
In-depth understanding of data management (e.g., permissions, recovery, security, and monitoring).
Understanding of ETL/ELT tools and data integration patterns.
Strong troubleshooting and problem-solving skills.
Excellent communication and collaboration abilities.
Ability to work in a 24/7 support rotation and handle urgent production issues.
Strong understanding of relational database concepts.
Experience with database design, modeling, and optimization is good to have.
Familiarity with data security best practices and backup procedures.
Responsibilities:
Support & Incident Management: Provide 24/7 support for MongoDB environments, including an on-call rotation. Monitor system health and respond to alerts, incidents, and performance degradation issues. Troubleshoot and resolve production database issues in a timely manner.
Database Administration: Install, configure, and upgrade MongoDB clusters in on-prem or cloud environments. Perform routine maintenance including backups, restores, indexing, and data migration. Monitor and manage replica sets, sharding, and cluster health.
Performance Tuning & Optimization: Analyze query and indexing strategies to improve performance. Tune MongoDB server parameters and JVM settings where applicable. Monitor and optimize disk I/O, memory usage, and CPU utilization.
Security & Compliance: Implement and manage access control, roles, and authentication mechanisms (LDAP, x.509, SCRAM). Ensure encryption, auditing, and compliance with data governance and security policies.
Automation & Monitoring: Create and maintain scripts for automation of routine tasks (e.g., backups, health checks). Set up and maintain monitoring tools (e.g., MongoDB Ops Manager, Prometheus/Grafana, MMS).
Documentation & Collaboration: Maintain documentation on architecture, configurations, procedures, and incident reports. Work closely with application and infrastructure teams to support new releases and changes.
Preferred:
Experience with MongoDB Atlas and other cloud-managed MongoDB services.
MongoDB certification (MongoDB Certified DBA).
Experience with automation tools like Ansible, Terraform, or Puppet.
Understanding of DevOps practices and CI/CD integration.
Familiarity with other NoSQL and RDBMS technologies is a plus.
Education qualification: Any degree
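A minimal sketch of the monitoring side of this role, using the pymongo driver: it checks each replica-set member's state, assuming a replica set is reachable at the (hypothetical) connection string below. A production version would feed an alerting pipeline rather than print:

```python
from pymongo import MongoClient
from pymongo.errors import PyMongoError

# Hypothetical connection string; in production this would come from a
# secrets manager, not source code.
URI = "mongodb://localhost:27017/?replicaSet=rs0"

def check_replica_set(uri: str) -> None:
    """Print each replica-set member's state -- one slice of a health check."""
    client = MongoClient(uri, serverSelectionTimeoutMS=5000)
    try:
        status = client.admin.command("replSetGetStatus")
        for member in status["members"]:
            # stateStr is e.g. PRIMARY, SECONDARY, RECOVERING
            print(f"{member['name']:30s} {member['stateStr']}")
    except PyMongoError as exc:
        # In a real rotation this would page the on-call DBA instead.
        print(f"replica set check failed: {exc}")
    finally:
        client.close()

if __name__ == "__main__":
    check_replica_set(URI)
```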
Posted 1 week ago
6.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Summary
We are seeking a highly experienced and results-driven Senior ETL Developer with over 6 years of professional experience in data integration, transformation, and analytics across enterprise-grade data platforms. This role requires deep expertise in ETL development, strong familiarity with cloud-based data solutions, and the ability to manage large-scale data operations. The candidate should be capable of working across complex data environments, including structured and unstructured datasets, and demonstrate fluency in handling both traditional and modern cloud data ecosystems. The ideal candidate must have strong hands-on experience with ETL tools, advanced SQL and Python scripting, big data processing, and cloud-based data services, particularly within the AWS ecosystem. This position will play a key role in the design, development, and optimization of scalable data pipelines and contribute to enterprise-level data engineering solutions, while supporting analytical and reporting needs in both Application Development (AD) and Application Maintenance Support (AMS) environments.
Key Responsibilities
Design, develop, and maintain efficient and scalable ETL pipelines using modern data tools and platforms, focusing on extraction, transformation, and loading of large datasets from multiple sources.
Work closely with data architects, analysts, and other stakeholders to understand business data requirements and translate them into robust technical ETL solutions.
Implement and optimize data loading, transformation, cleansing, and integration strategies to ensure high performance and quality in downstream applications.
Develop and manage cloud-based data platforms, particularly within the AWS ecosystem, including services such as Amazon S3, EMR, MSK, and SageMaker.
Collaborate with cross-functional teams to integrate data from various databases such as Snowflake, Oracle, Amazon RDS (Aurora, Postgres), DB2, SQL Server, and Cassandra.
Employ scripting languages like SQL, PL/SQL, Python, and Unix shell commands to automate data transformations and monitoring processes.
Leverage big data technologies such as Apache Spark and Sqoop to handle large-scale data workloads and enhance data processing capabilities.
Support and contribute to data modeling initiatives using tools like Erwin and Oracle Data Modeler; exposure to Archimate will be considered an advantage.
Work with scheduling and orchestration tools including Autosys, SFTP, and preferably Apache Airflow to manage ETL workflows efficiently (a minimal DAG sketch follows this posting).
Troubleshoot and resolve data inconsistencies, data load failures, and performance issues across the data pipeline and cloud infrastructure.
Follow best practices in data governance, metadata management, version control, and data quality frameworks to ensure compliance and consistency.
Maintain documentation of ETL processes, data flows, and integration points for knowledge sharing and auditing purposes.
Participate in code reviews, knowledge transfer sessions, and mentoring of junior developers in ETL practices and cloud integrations.
Stay up to date with evolving technologies and trends in data engineering, cloud services, and big data to proactively propose improvements.
Technical Skills:
ETL Tools: Experience with Talend is preferred (especially in AD and AMS functions), although it may be phased out in the future.
Databases: Expertise in Snowflake, Oracle, Amazon RDS (Aurora, Postgres), DB2, SQL Server, and Cassandra.
Big Data & Cloud: Hands-on with Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, and Apache Spark.
Scripting: Strong skills in SQL, PL/SQL, and Python; knowledge of the Unix command line is essential; R programming is optional but considered a plus.
Scheduling Tools: Working knowledge of Autosys, SFTP, and preferably Apache Airflow (training can be provided).
Data Modeling Tools: Proficiency in Erwin and Oracle Data Modeler; familiarity with Archimate is preferred.
Notes:
Power BI knowledge is relevant only in shared AD roles and is not required for dedicated ETL and AWS roles or AMS responsibilities.
The role requires strong communication skills to collaborate with technical and non-technical stakeholders, as well as a proactive mindset to identify and resolve data challenges.
Must demonstrate the ability to adapt in fast-paced and changing environments while maintaining attention to detail and delivery quality.
Exposure to enterprise data warehouse modernization, cloud migration projects, or real-time streaming data pipelines is considered highly advantageous.
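Since the posting lists Apache Airflow for orchestration, here is a minimal DAG sketch wiring together the extract-transform-load stages described above. It assumes Airflow 2.4+ (for the schedule parameter); the DAG id, cron schedule, and task bodies are placeholders:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; in a real pipeline these would call extract /
# transform / load logic against the source and target systems.
def extract():
    print("pulling raw data from source systems")

def transform():
    print("applying cleansing and business rules")

def load():
    print("writing curated data to the warehouse")

with DAG(
    dag_id="nightly_sales_etl",              # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",                    # nightly at 02:00
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract, then transform, then load.
    extract_task >> transform_task >> load_task
```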
Posted 1 week ago
3.0 - 23.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
Relay Human Cloud is a young and dynamic company dedicated to assisting some of the top US-based companies in expanding their teams internationally. With a truly global presence spanning the US, India, Honduras, and Mexico, Relay is focused on facilitating connections with the best international talent. The company's core areas of expertise include Accounting & Finance, Administration, Operations, Space Planning, Leasing, Data Science, Data Search, Machine Learning, and Artificial Intelligence. Operating from offices in Ahmedabad and Vadodara, Relay India is committed to delivering high-quality operations with a focus on cutting-edge technologies.
We are currently seeking a talented and dedicated Yardi Report Developer with a robust background in YSR reporting to join our team. In this role, you will work closely with our US-based clients to design, develop, and maintain custom reports and data visualization solutions within the Yardi property management software. Your contributions will be instrumental in providing accurate insights to support decision-making and optimize property management operations.
Key Responsibilities:
- Develop and maintain custom YSR reports within the Yardi Voyager property management software.
- Collaborate with business stakeholders to understand their reporting and data visualization requirements.
- Design and develop dynamic, interactive reports and dashboards that deliver valuable insights.
- Troubleshoot and address any issues related to report performance or data accuracy.
- Create and update documentation for YSR reports and processes for future reference.
- Keep abreast of Yardi software updates and new features, implementing them as necessary.
- Assist in data extraction, transformation, and loading (ETL) processes to meet reporting needs.
- Conduct ad-hoc data analysis and reporting tasks as requested by management.
- Provide training and support to end users on YSR reporting capabilities and best practices.
Qualifications:
- Proficiency in English is essential due to direct interaction with US-based clients.
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- Extensive experience (2-3 years) with Yardi property management software and expertise in YSR reporting.
- Strong understanding of SQL, data modeling, and data warehousing concepts.
- Proficiency in report development tools and technologies like Yardi Voyager, YSR, SSRS, Power BI, or similar.
- Excellent problem-solving and analytical skills.
- Detail-oriented with a focus on ensuring data accuracy and report quality.
- Self-motivated and capable of working independently or collaboratively within a team.
Preferred Qualifications:
- Previous experience in the real estate or property management industry.
- Familiarity with ETL tools and processes.
- Knowledge of data visualization best practices.
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Join us as an "Assistant VP" at Barclays, where you will be involved in functional design, data, end-to-end process and controls, delivery, and functional testing. You'll spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.
To Be Successful In This Role, You Should Have
Experience developing dashboards in SAP Analytics Cloud and Tableau; primary experience with SAP Analytics Cloud and SAP-related toolsets is preferred.
The ability to develop process workflows and manage ETL tools like SAP BW, Alteryx, etc.
The ability to provide design solutions for internal reporting problem statements and business requirements, delivering quickly through tactical solutions while connecting to the strategic roadmap.
The ability to act as a business analyst supporting the function, thinking strategically and delivering MI views that enable analytics and support quick decision-making.
The ability to support the business on an agile basis in delivering requirements, which is critical in a DevOps model.
The ability to build innovative dashboards on a sprint basis with a key focus on controls and governance structure.
The ability to visually enhance an analytical view from the legacy Excel/PPT model.
Adherence to all IR controls, developing and implementing robust control mechanisms in all processes managed.
Some Other Highly Valued Skills May Include
Knowledge of business intelligence platforms, primarily SAP Analytics Cloud, and the ability to work with data management tools.
Project management/scrum master capabilities to drive prioritization.
Experience designing MI dashboards and insights.
Broad business and industry knowledge and experience.
You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills.
This role will be based out of Chennai.
Purpose of the role
To develop business capabilities for Finance through key stages of functional design, data, end-to-end process and controls, delivery, and functional testing.
Accountabilities
Functional Design: leveraging best-practice concepts and, in collaboration with line SMEs, supporting options analysis and recommendations as part of decision making.
Data Analysis/Modelling/Governance: designing the conceptual data model underpinning all phases of the processes, and governance requirements in accordance with GDMS standards and principles.
End-to-End Process & Controls: developing target process and controls design/documentation and operational runbooks, and aligning these components with organizational and role/service model design definitions.
Delivery/Implementation Support: updating design/functional requirements throughout the development cycle, and resolving RAIDs related to functional requirements and business processes. Project management for change programmes that have limited technology investment.
Functional Testing: developing scripts and data to test alignment to requirement definitions, ahead of user testing cycles.
Assistant Vice President Expectations
To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions.
Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function.
Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others.
OR, for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes.
Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues.
Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda.
Take ownership for managing risk and strengthening controls in relation to the work done.
Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function.
Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy.
Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively.
Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience.
Influence or convince stakeholders to achieve outcomes.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
You will be joining our team as a Looker Enterprise Dashboarding Specialist. Your main responsibility will be to design, develop, and optimize Looker dashboards to extract actionable insights from complex datasets. To excel in this role, you should have a solid understanding of LookML, data modeling, SQL, and data visualization best practices. You will collaborate with data analysts, engineers, and business stakeholders to create impactful reports and dashboards. Your key responsibilities will include designing, developing, and maintaining Looker dashboards and reports to support business decision-making. You will also be tasked with building and optimizing LookML models, explores, and views to ensure efficient data querying. Collaborating with data engineering teams to enhance data pipelines and model performance will be essential. Working closely with business stakeholders to comprehend reporting needs and convert them into scalable Looker solutions is also a crucial part of your role. Implementing best practices for data visualization to ensure clear and effective storytelling will be a key aspect. Furthermore, optimizing dashboard performance, developing and maintaining data governance standards for Looker usage, and conducting training sessions for internal teams to enhance self-service analytics adoption will fall under your responsibilities. Staying abreast of Looker updates, new features, and industry best practices is also expected. To qualify for this position, you should have 3-5 years of experience in data visualization, business intelligence, or analytics. Strong expertise in Looker, LookML, and SQL is a must. Experience in data modeling, familiarity with BigQuery or other cloud data warehouses, understanding of data governance, security, and role-based access control in Looker, ability to optimize dashboards for performance and usability, strong problem-solving and analytical skills with attention to detail, and excellent communication and stakeholder management skills are necessary. Preferred qualifications include experience working with ETL pipelines and data transformation processes, familiarity with Python or other scripting languages for data automation, exposure to Google Cloud Platform (GCP) and data engineering concepts, and certifications in Looker, Google Cloud, or related BI tools.,
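For a sense of how such dashboard work can be automated, here is a minimal sketch using the official looker-sdk Python package, assuming its API 4.0 client (init40/run_look); the look id is hypothetical and credentials are read from environment variables or a looker.ini file:

```python
import looker_sdk  # pip install looker-sdk

# A minimal sketch, assuming the looker-sdk package's API 4.0 client.
# Credentials come from LOOKERSDK_BASE_URL / LOOKERSDK_CLIENT_ID /
# LOOKERSDK_CLIENT_SECRET environment variables or a looker.ini file.
sdk = looker_sdk.init40()

# Pull a saved Look's results as CSV -- e.g. to validate a dashboard tile
# against the warehouse, or to automate a recurring extract. The look_id
# is hypothetical.
csv_data = sdk.run_look(look_id="42", result_format="csv")
print(csv_data[:500])
```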
Posted 1 week ago
5.0 - 12.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
You should have 5-12 years of experience in Big Data and related technologies. Your expertise should include a deep understanding of distributed computing principles and strong knowledge of Apache Spark, along with proficiency in Python programming. You should have experience with Hadoop v2, MapReduce, HDFS, and Sqoop, and with building stream-processing systems using technologies such as Apache Storm and Spark Streaming. You should have a good understanding of Big Data querying tools like Hive and Impala, as well as experience integrating data from various sources such as RDBMS, ERP systems, and files. Knowledge of SQL queries, joins, stored procedures, and relational schemas is essential. Experience with NoSQL databases like HBase, Cassandra, and MongoDB, along with ETL techniques and frameworks, is also expected. The role requires performance tuning of Spark jobs, experience with Azure Databricks, and the ability to lead a team efficiently. Designing and implementing Big Data solutions, as well as following Agile methodology, are key aspects of this position.
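As a small illustration of the Spark skills listed above, here is a PySpark sketch of a typical join-and-aggregate job. The file paths and column names are hypothetical; on real volumes, this is where the performance tuning the role mentions (partitioning, broadcast hints, caching) would come in:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# A minimal local job; running it requires pyspark (pip install pyspark).
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Hypothetical input files with headers: orders(customer_id, amount),
# customers(customer_id, region).
orders = spark.read.option("header", True).csv("data/orders.csv")
customers = spark.read.option("header", True).csv("data/customers.csv")

# Revenue per customer region -- a typical join + aggregation that on real
# volumes would be tuned with partitioning, broadcast joins, and caching.
revenue = (
    orders.join(customers, on="customer_id", how="inner")
          .groupBy("region")
          .agg(F.sum(F.col("amount").cast("double")).alias("total_revenue"))
          .orderBy(F.desc("total_revenue"))
)
revenue.show()
spark.stop()
```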
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
Haryana
On-site
As a Data Science Engineer Intern at V-Patrol AI, a dynamic and forward-thinking cybersecurity organization, you will play a crucial role in developing and implementing cutting-edge machine learning and deep learning models. Your primary focus will be on creating scalable data pipelines and generating valuable insights in real time to counter cyber threats effectively. Your responsibilities will include designing and executing machine learning and deep learning models tailored for cybersecurity applications. Additionally, you will construct and oversee data pipelines for both structured and unstructured data sources, such as network logs and threat feeds. Integrating APIs for model deployment and ensuring seamless real-time data flow will also be a key aspect of your role. Collaboration with software engineers, analysts, and stakeholders to support data-informed decision-making processes is essential. Monitoring model performance and optimizing models for production environments will be part of your routine tasks. Furthermore, you will be responsible for conveying your findings through informative dashboards, reports, and visualizations. To excel in this role, you should hold a Bachelor's or Master's degree in data science, computer science, statistics, or a related field. Proficiency in Python, pandas, scikit-learn, and TensorFlow/PyTorch is necessary. Hands-on experience with REST APIs, FastAPI/Flask, and data preprocessing techniques is crucial. Familiarity with various ML/DL models, such as XGBoost, LSTMs, and Transformers, is expected. Exposure to cloud platforms such as AWS/GCP, ETL tools, Docker/Kubernetes, etc., would be advantageous. While not mandatory, prior experience in cybersecurity, particularly in areas like threat detection and incident response, would be beneficial. In addition to the required skills and experience, expertise in adversarial machine learning and natural language processing (NLP) will be considered a significant advantage. A GitHub profile or portfolio showcasing real-world projects in data science or cybersecurity is strongly preferred. This position is an internship that requires your presence at the designated work location.
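To ground the machine-learning side of the internship, here is a minimal scikit-learn sketch of a threat classifier. The data is synthesized with make_classification purely for illustration; in the role itself, features would be engineered from network logs and threat feeds:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Stand-in data: 20 synthetic features, with "threat" as the rare class,
# which mirrors the imbalance typical of security telemetry.
X, y = make_classification(
    n_samples=2000, n_features=20, n_informative=8,
    weights=[0.9, 0.1],
    random_state=42,
)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42
)

# class_weight="balanced" compensates for the rarity of malicious events.
model = RandomForestClassifier(
    n_estimators=200, class_weight="balanced", random_state=42
)
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test),
                            target_names=["benign", "threat"]))
```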
Posted 1 week ago