56.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Join our transformation within the RMG Data Engineering team in Hyderabad and you will have the opportunity to work with a collaborative and dynamic network of technologists. Our teams play a pivotal role in implementing data products, creating impactful visualizations, and delivering seamless data solutions to downstream systems. At Macquarie, our advantage is bringing together diverse people and empowering them to shape all kinds of possibilities. We are a global financial services group operating in 31 markets and with 56 years of unbroken profitability. You'll be part of a friendly and supportive team where everyone - no matter what role - contributes ideas and drives outcomes.

What role will you play?
In this role, you will apply your expertise in big data technologies and DevOps practices to design, develop, deploy, and support data assets throughout their lifecycle. You'll establish templates, methods, and standards while managing deadlines, solving technical challenges, and improving processes. A growth mindset, passion for learning, and adaptability to innovative technologies will be essential to your success.

What You Offer
- Hands-on experience building, implementing, and enhancing enterprise-scale data platforms.
- Proficiency in big data, with expertise in Spark, Python, Hive, SQL, Presto, storage formats such as Parquet, and orchestration tools such as Apache Airflow.
- Knowledge of cloud environments (preferably AWS), with an understanding of EC2, S3, Linux, Docker, and Kubernetes.
- ETL tools: proficiency in Talend, Apache Airflow, dbt, Informatica, and AWS Glue.
- Data warehousing: experience with Amazon Redshift and Athena.
- Kafka development engineering: experience developing and managing streaming data pipelines using Apache Kafka.

We love hearing from anyone inspired to build a better future with us. If you're excited about the role or about working at Macquarie, we encourage you to apply.
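The core skill named above - building reporting data assets with SQL over engines like Spark SQL, Hive, or Presto - comes down to aggregating raw records into curated tables. A minimal sketch, using sqlite3 as a stand-in engine (the table name and columns are invented for illustration; the same GROUP BY pattern applies in the listed engines):

```python
# Illustrative only: "trades" and its columns are hypothetical, not from
# the posting. sqlite3 stands in for Spark SQL / Hive / Presto, which
# support the same aggregation pattern on Parquet-backed tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (trade_date TEXT, desk TEXT, notional REAL);
INSERT INTO trades VALUES
  ('2024-01-02', 'rates',  100.0),
  ('2024-01-02', 'rates',   50.0),
  ('2024-01-02', 'credit',  75.0);
""")

# A typical data-asset build: roll raw records up into a reporting table.
rows = conn.execute("""
    SELECT trade_date, desk, SUM(notional) AS total_notional
    FROM trades
    GROUP BY trade_date, desk
    ORDER BY desk
""").fetchall()

print(rows)  # [('2024-01-02', 'credit', 75.0), ('2024-01-02', 'rates', 150.0)]
```

In a production pipeline the same statement would typically run inside an Airflow-scheduled task, writing the result back to a curated Parquet dataset.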
What We Offer
At Macquarie, you're empowered to shape a career that's rewarding in all the ways that matter most to you. Macquarie employees can access a wide range of benefits which, depending on eligibility criteria, include:
- 1 wellbeing leave day per year
- 26 weeks' paid maternity leave or 20 weeks' paid parental leave for primary caregivers, along with 12 days of paid transition leave upon return to work, and 6 weeks' paid leave for secondary caregivers
- Company-subsidised childcare services
- 2 days of paid volunteer leave and donation matching
- Benefits to support your physical, mental and financial wellbeing, including comprehensive medical and life insurance cover, the option to join a parental medical insurance plan, and virtual medical consultations extended to family members
- Access to our Employee Assistance Program, a robust behavioural health network with counselling and coaching services
- Access to a wide range of learning and development opportunities, including reimbursement for professional membership or subscription
- Hybrid and flexible working arrangements, dependent on role
- Reimbursement for work-from-home equipment

About Technology
Technology enables every aspect of Macquarie, for our people, our customers and our communities. We're a global team that is passionate about accelerating the digital enterprise, connecting people and data, building platforms and applications, and designing tomorrow's technology solutions.

Our commitment to diversity, equity and inclusion
We are committed to fostering a diverse, equitable and inclusive workplace. We encourage people from all backgrounds to apply and welcome all identities, including race, ethnicity, cultural identity, nationality, gender (including gender identity or expression), age, sexual orientation, marital or partnership status, parental, caregiving or family status, neurodiversity, religion or belief, disability, or socio-economic background.
We welcome further discussions on how you can feel included and belong at Macquarie as you progress through our recruitment process. Our aim is to provide reasonable adjustments to individuals who may need support during the recruitment process and through working arrangements. If you require additional assistance, please let us know in the application process.
Posted 3 days ago
6.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview
PepsiCo Data BI & Integration Platforms is seeking a mid-level TIBCO Messaging (EMS, BusinessWorks) platform technology leader, responsible for overseeing the deployment and maintenance of on-premises and cloud infrastructure (AWS/Azure) for its North America PepsiCo Foods/Beverages business. The ideal candidate will have hands-on experience managing and maintaining TIBCO EMS (Enterprise Message Service) and TIBCO BusinessWorks (BW) platforms, ensuring system stability, security, and optimal performance, including Infrastructure as Code (IaC), platform provisioning and administration, network design, security principles, and automation.

Responsibilities
TIBCO platform administration
- Install, configure, upgrade, and maintain TIBCO EMS servers and BW environments.
- Deploy and manage TIBCO applications, including BW projects and integrations.
- Monitor system health and performance, identify and resolve issues, and ensure smooth operation of business processes.
- Tune system parameters, optimize resource utilization, and ensure the efficient operation of applications.
- Collaborate with development, QA, and other teams to resolve technical issues and ensure seamless integration of applications.
- Develop scripts and automate tasks for administration and maintenance purposes.
- Configure and manage adapters for seamless integration with various systems.
- Develop and manage Hawk rulebases for monitoring BW engines, adapters, and log files.

Cloud Infrastructure & Automation
- Implement and support TIBCO application migration, modernization, and transformation projects, leveraging cloud-native technologies and methodologies.
- Implement TIBCO cloud infrastructure policies, standards, and best practices, ensuring the cloud environment adheres to security and regulatory requirements.
- Design, deploy, and optimize cloud-based TIBCO infrastructure using Azure/AWS services that meet the performance, availability, scalability, and reliability needs of our applications and services.
- Drive troubleshooting of TIBCO cloud infrastructure issues, ensuring timely resolution and root-cause analysis by partnering with the global cloud center of excellence, enterprise application teams, and PepsiCo premium cloud partners (Microsoft, AWS, TIBCO).
- Establish and maintain effective communication and collaboration with internal and external stakeholders, including business leaders, developers, customers, and vendors.
- Develop Infrastructure as Code (IaC) to automate provisioning and management of cloud resources.
- Write and maintain scripts for automation and deployment using PowerShell, Python, or the Azure/AWS CLI.
- Work with stakeholders to document architectures, configurations, and best practices.
- Apply knowledge of cloud security principles around data protection, identity and access management (IAM), compliance and regulation, threat detection and prevention, and disaster recovery and business continuity.

Qualifications
- Bachelor's degree in computer science.
- At least 6 to 8 years of experience in IT cloud infrastructure, architecture, and operations, including security, with at least 4 years in a technical leadership role.
- Thorough knowledge of TIBCO EMS, BW, and related components (e.g., Adapters, Hawk).
- Strong understanding of Unix/Linux operating systems, as TIBCO products often run on these platforms.
- Proficiency in enterprise messaging concepts, including queues, topics, and messages.
- Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps.
- Strong expertise in Azure/AWS messaging technologies, real-time data ingestion, data warehouses, serverless ETL, DevOps, Kubernetes, virtual machines, and monitoring and security tools.
- Strong expertise in Azure/AWS networking and security fundamentals, including network endpoints and network security groups, firewalls, external/internal DNS, load balancers, virtual networks, and subnets.
- Proficiency in scripting and automation tools such as PowerShell, Python, Terraform, and Ansible.
- Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences.
- Certifications in Azure/AWS platform administration, networking, and security are preferred.
- TIBCO Certified Professional certifications (e.g., TIBCO EMS Administrator) are desirable.
- Strong self-organization, time management, and prioritization skills.
- A high level of attention to detail, excellent follow-through, and reliability.
- Strong collaboration, teamwork, and relationship-building skills across multiple levels and functions in the organization.
- Ability to listen, establish rapport, and build credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams.
- Strategic thinker focused on business-value results that utilize technical solutions.
- Strong communication skills in writing, speaking, and presenting.
- Able to work effectively in a multi-tasking environment.
- Fluent in English.
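The administration scripting named above (Hawk rulebases watching engines and log files, plus Python automation) boils down to rule-driven monitoring. A minimal sketch, with an invented log format and threshold, of the kind of check an administrator might script in Python alongside Hawk:

```python
# Illustrative only: the log lines and threshold are hypothetical, and
# this plain-Python scan merely mimics what a Hawk rulebase rule does
# natively (match a condition in a log, raise an alert on breach).
import re

LOG_LINES = [
    "2024-05-01 09:00:01 INFO  EMS server started",
    "2024-05-01 09:05:12 ERROR Queue 'orders.in' pending count 12000",
    "2024-05-01 09:05:40 ERROR Connection refused from 10.0.0.7",
    "2024-05-01 09:06:02 WARN  Slow consumer on topic 'prices'",
]
ERROR_THRESHOLD = 1  # alert if more than this many ERROR lines appear

# Rule: collect ERROR-level lines and flag a breach of the threshold.
error_lines = [ln for ln in LOG_LINES if re.search(r"\bERROR\b", ln)]
alert = len(error_lines) > ERROR_THRESHOLD

print(f"errors={len(error_lines)} alert={alert}")
```

In practice such a script would tail the live EMS log and hand the alert to a paging or ticketing system rather than print it.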
Posted 3 days ago
6.0 years
0 Lacs
India
On-site
About YipitData:
YipitData is the leading market research and analytics firm for the disruptive economy and recently raised up to $475M from The Carlyle Group at a valuation over $1B. We analyze billions of alternative data points every day to provide accurate, detailed insights on ridesharing, e-commerce marketplaces, payments, and more. Our on-demand insights team uses proprietary technology to identify, license, clean, and analyze the data many of the world's largest investment funds and corporations depend on. For three years and counting, we have been recognized as one of Inc's Best Workplaces. We are a fast-growing technology company backed by The Carlyle Group and Norwest Venture Partners. Our offices are located in NYC, Austin, Miami, Denver, Mountain View, Seattle, Hong Kong, Shanghai, Beijing, Guangzhou, and Singapore. We cultivate a people-centric culture focused on mastery, ownership, and transparency.

Why You Should Apply NOW:
- You'll be working with many strategic engineering leaders within the company.
- You'll report directly to the Director of Data Engineering.
- You will help build out our Data Engineering team presence in India.
- You will work with a global team.
- You'll be challenged with a lot of big data problems.

About The Role:
We are seeking a highly skilled Senior Data Engineer to join our dynamic Data Engineering team. The ideal candidate possesses 6-8 years of data engineering experience, a solid understanding of Spark and SQL, and data pipeline experience. Hired individuals will play a crucial role in building out our data engineering team to support our strategic pipelines and optimize them for reliability, efficiency, and performance. Additionally, Data Engineering serves as the gold standard for all other YipitData analyst teams, building and maintaining the core pipelines and tooling that power our products.
This high-impact, high-visibility team is instrumental to the success of our rapidly growing business. This is a unique opportunity to be the first hire on this team, with the potential to build and lead the team as its responsibilities expand. This is a hybrid opportunity based in India. During training and onboarding, we will expect several hours of overlap with US working hours. Afterward, standard IST working hours are permitted, with the exception of 1-2 days per week when you will join meetings with the US team.

As Our Senior Data Engineer You Will:
- Report directly to the Senior Manager of Data Engineering, who will provide significant, hands-on training on cutting-edge data tools and techniques.
- Build and maintain end-to-end data pipelines.
- Help set best practices for our data modeling and pipeline builds.
- Create documentation, architecture diagrams, and other training materials.
- Become an expert at solving complex data pipeline issues using PySpark and SQL.
- Collaborate with stakeholders to incorporate business logic into our central pipelines.
- Deeply learn Databricks, Spark, and other ETL tooling developed internally.

You Are Likely To Succeed If:
- You hold a Bachelor's or Master's degree in Computer Science, STEM, or a related technical discipline.
- You have 6+ years of experience as a Data Engineer or in other technical functions.
- You are excited about solving data challenges and learning new skills.
- You have a great understanding of working with data and building data pipelines.
- You are comfortable working with large-scale datasets using PySpark, Delta, and Databricks.
- You understand business needs and the rationale behind data transformations, to ensure alignment with organizational goals and data strategy.
- You are eager to constantly learn new technologies.
- You are a self-starter who enjoys working collaboratively with stakeholders.
- You have exceptional verbal and written communication skills.
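Pipeline reliability of the kind described above usually rests on idempotent loads: re-running a batch must not duplicate or corrupt data. Delta Lake expresses this with MERGE; a minimal sketch of the same upsert pattern, with invented table and column names and sqlite3 standing in for the warehouse:

```python
# Illustrative only: "orders" and its columns are hypothetical. The
# ON CONFLICT upsert here plays the role Delta Lake's MERGE plays in a
# Databricks pipeline: the same batch can be safely replayed.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT PRIMARY KEY, status TEXT)")

def load_batch(batch):
    # Idempotent load: re-running a batch updates in place, never duplicates.
    conn.executemany(
        """INSERT INTO orders (order_id, status) VALUES (?, ?)
           ON CONFLICT(order_id) DO UPDATE SET status = excluded.status""",
        batch,
    )

load_batch([("o1", "placed"), ("o2", "placed")])
load_batch([("o1", "shipped"), ("o2", "placed")])  # retry carrying an update

rows = conn.execute(
    "SELECT order_id, status FROM orders ORDER BY order_id").fetchall()
print(rows)  # [('o1', 'shipped'), ('o2', 'placed')]
```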
Nice to have: Experience with Airflow, dbt, Snowflake, or equivalent.

What We Offer:
Our compensation package includes comprehensive benefits, perks, and a competitive salary:
- We care about your personal life, and we mean it. We offer vacation time, parental leave, team events, learning reimbursement, and more!
- Your growth at YipitData is determined by the impact that you are making, not by tenure, unnecessary facetime, or office politics. Everyone at YipitData is empowered to learn, self-improve, and master their skills in an environment focused on ownership, respect, and trust.

We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal-opportunity employer.

Job Applicant Privacy Notice
Posted 3 days ago
0 years
0 Lacs
Vishakhapatnam, Andhra Pradesh, India
On-site
Company Description
RASAPOORNA FOODS PRIVATE LIMITED is a company based in Visakhapatnam, Andhra Pradesh, India. We specialize in providing high-quality food products and services to our clients. Our company operates from its headquarters at HARI PRIYA HEAVEN, KRM COLONY, Seethammadhara, Visakhapatnam. We are dedicated to maintaining a high standard of excellence in our offerings and operations.

Role Description
We are seeking a full-time Power BI Consultant to join our team in Visakhapatnam. This on-site role involves designing, developing, and maintaining Power BI dashboards and reports. You will be responsible for data modeling, creating ETL processes, and supporting data warehousing initiatives. The role includes analyzing business requirements, creating data visualizations, and providing insights to support decision-making.

Qualifications
- Strong analytical skills
- Experience with Extract, Transform, Load (ETL) processes
- Proficiency in creating dashboards using Power BI
- Expertise in data modeling and data warehousing
- Excellent problem-solving and communication skills
- Ability to work independently and collaborate with cross-functional teams
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Experience in the food industry is a plus
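The data modeling and warehousing work behind a Power BI dashboard typically follows a star schema: a fact table of transactions joined to dimension tables. A minimal sketch with an invented schema, using sqlite3 to stand in for the warehouse a dashboard would query:

```python
# Illustrative only: the schema is invented for this sketch. It shows the
# dimensional (star-schema) layout a BI report usually sits on: a sales
# fact table keyed to a product dimension, queried for a report measure.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'snacks'), (2, 'beverages');
INSERT INTO fact_sales VALUES (1, 10, 500.0), (1, 5, 250.0), (2, 8, 320.0);
""")

# The kind of measure a dashboard tile surfaces: revenue by category.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.category ORDER BY p.category
""").fetchall()
print(rows)  # [('beverages', 320.0), ('snacks', 750.0)]
```

In Power BI the same shape appears as relationships between tables, with the aggregation expressed as a DAX measure instead of SQL.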
Posted 3 days ago
3.0 - 6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About SpurQLabs:
SpurQLabs is an independent software testing and test automation company with a mission to help our clients build exceptional-quality products at speed. We specialize in test automation, performance testing, API testing, and CI/CD enablement across industries including life sciences, pharmaceuticals, and regulated environments.

Job Summary:
We are seeking a detail-oriented ETL Test Engineer with strong Python, SQL, and AWS skills to validate data pipelines in compliance with GxP and FDA regulations. This role involves test planning, test execution, automated validation, and ensuring high-quality, audit-ready data workflows.

Key Responsibilities:
- Develop and execute test plans, test cases, and test scripts to validate ETL processes, data migrations, and transformations according to GxP and industry standards.
- Conduct functional, integration, and regression testing across various data sources and targets to ensure accurate extraction, transformation, and loading.
- Collaborate with data engineers, business analysts, and stakeholders to understand data mappings, business logic, and compliance needs.
- Build and maintain automated ETL test suites using Python and testing frameworks for continuous validation of data pipelines.
- Perform data profiling and quality assessments, identify discrepancies, and work with stakeholders to resolve integrity issues.
- Document and report test outcomes, validation findings, and defects using defined templates and issue-tracking tools.
- Participate in validation planning, execution, and documentation aligned with regulatory guidelines, GxP, FDA, and company SOPs.
- Ensure traceability, auditability, and data integrity across all validation activities.
- Stay current on industry trends, compliance updates, and best practices in ETL testing and data validation.
- Contribute to process improvement and knowledge sharing within the team.
Technical Skills:
Mandatory:
- Python: for automation of ETL validation
- SQL: strong skills for data querying and validation
- AWS cloud services: especially S3 and Databricks
- Snowflake: hands-on experience with the cloud data warehouse

Nice to Have:
- Experience with automated ETL testing frameworks
- Familiarity with data compliance frameworks (GxP, FDA, Part 11)
- Exposure to validation documentation tools and issue-tracking systems

Qualifications:
- Bachelor's degree in Computer Science, Engineering, Life Sciences, or a related field
- 3 to 6 years of hands-on experience in ETL testing in a regulated or data-intensive environment
- Experience in GxP-compliant environments is strongly preferred
- Strong communication, analytical, and problem-solving skills
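The automated ETL validation described above usually reduces to a suite of SQL checks run after each load: row-count reconciliation, null checks, and referential integrity. A minimal sketch with invented tables and rules, using sqlite3 so it runs anywhere (a real suite would target the warehouse and wrap these in pytest):

```python
# Illustrative only: tables, columns, and rules are hypothetical. Sketches
# the kind of post-load checks a GxP validation suite automates.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_subjects (subject_id TEXT, site TEXT);
CREATE TABLE tgt_subjects (subject_id TEXT, site TEXT);
CREATE TABLE sites (site TEXT);
INSERT INTO src_subjects VALUES ('S001', 'IN01'), ('S002', 'IN02');
INSERT INTO tgt_subjects VALUES ('S001', 'IN01'), ('S002', 'IN02');
INSERT INTO sites VALUES ('IN01'), ('IN02');
""")

def check(sql, expected):
    # Each rule is a single scalar query compared to its expected value,
    # which keeps the result easy to log for an audit trail.
    return conn.execute(sql).fetchone()[0] == expected

results = {
    "row_counts_match": check(
        "SELECT (SELECT COUNT(*) FROM src_subjects)"
        " = (SELECT COUNT(*) FROM tgt_subjects)", 1),
    "no_null_ids": check(
        "SELECT COUNT(*) FROM tgt_subjects WHERE subject_id IS NULL", 0),
    "sites_valid": check(
        "SELECT COUNT(*) FROM tgt_subjects"
        " WHERE site NOT IN (SELECT site FROM sites)", 0),
}
print(results)
```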
Posted 3 days ago
8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
You are as unique as your background, experience and point of view. Here, you'll be encouraged, empowered and challenged to be your best self. You'll work with dynamic colleagues - experts in their fields - who are eager to share their knowledge with you. Your leaders will inspire and help you reach your potential and soar to new heights. Every day, you'll have new and exciting opportunities to make life brighter for our Clients - who are at the heart of everything we do. Discover how you can make a difference in the lives of individuals, families and communities around the world.

Job Description:
Sun Life - Data Modeler Role
The Data Modeler will work on the design and implementation of new data structures to support project teams delivering on ETL, data warehouse design, management of the enterprise data model, maintenance of the data, and enterprise data integration approaches.

Technical Responsibilities
- Build and maintain standards-based data models to report disparate data sets in a reliable, consistent and interpretable manner.
- Gather, distil and harmonize data requirements, and design coherent conceptual, logical and physical data models and associated physical feed formats to support these data flows.
- Articulate business requirements and build source-to-target mappings with complex ETL transformations.
- Write complex SQL statements and profile source data to validate data transformations.
- Contribute to requirement analysis and database design, covering both transactional and dimensional data modelling.
- Normalize or de-normalize data structures, and introduce hierarchies and inheritance wherever required in existing or new data models.
- Develop and implement data warehouse projects independently.
- Work with data consumers and data suppliers to understand detailed requirements and to propose standardized data models.
- Contribute to improving the Data Management data models.
- Be an influencer: present and facilitate discussions to understand business requirements, and develop dimensional data models based on these capabilities and industry best practices.

Requirements
- Extensive practical experience in information technology and software development projects, with at least 8 years of experience in designing operational data stores and data warehouses.
- Extensive experience in data modelling tools such as Erwin or SAP PowerDesigner.
- Strong understanding of ETL and data warehouse concepts, processes and best practices.
- Proficient in data modelling, including conceptual, logical and physical data modelling for both OLTP and OLAP.
- Ability to write complex SQL for data transformations and data profiling in source and target systems.
- Basic understanding of SQL vs. NoSQL databases.
- A combination of solid business knowledge and technical expertise, with strong communication skills.
- Excellent analytical and logical thinking.
- Good verbal and written communication skills, and the ability to work independently as well as in a team environment, providing structure in ambiguous situations.

Good to have
- Understanding of the insurance domain
- Basic understanding of the AWS cloud
- Good understanding of Master Data Management, Data Quality and Data Governance
- Basic understanding of data visualization tools like SAS VA and Tableau
- Good understanding of implementing and architecting data solutions using Informatica, SQL Server or Oracle

Job Category: Advanced Analytics
Posting End Date: 16/09/2025
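The profiling skill the requirements above call for (profiling source data before building source-to-target mappings) has a standard shape: row counts, distinct counts on candidate keys, and null rates per column. A minimal sketch with an invented source table, using sqlite3 in place of the source system:

```python
# Illustrative only: "src_policy" and its columns are hypothetical.
# Sketches the profiling query a modeller runs before mapping a source:
# total rows, distinct key count (is policy_no a candidate key?), and
# null counts per column.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_policy (policy_no TEXT, holder TEXT, premium REAL);
INSERT INTO src_policy VALUES
  ('P1', 'A', 100.0), ('P2', 'B', NULL), ('P3', NULL, 120.0);
""")

profile = conn.execute("""
    SELECT COUNT(*)                  AS n_rows,
           COUNT(DISTINCT policy_no) AS distinct_policy_no,
           SUM(holder IS NULL)       AS null_holder,
           SUM(premium IS NULL)      AS null_premium
    FROM src_policy
""").fetchone()
print(profile)  # (3, 3, 1, 1)
```

Here distinct_policy_no equals the row count, supporting policy_no as the primary key of the target model, while the null counts flag columns needing default or NOT NULL decisions.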
Posted 3 days ago
3.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Hello! You've landed on this page, which means you're interested in working with us. Let's take a sneak peek at what it's like to work at Innovaccer.

Growth Strategy Team at Innovaccer
Innovaccer is forming a new strategic advisory team that will support healthcare organizations in better understanding their opportunities and levers for maximizing outcomes, particularly in, but not limited to, value-based care arrangements and population health initiatives. This role requires a "full stack" approach to analytics, covering all parts of the analytics value chain, including data ETL and manipulation, analysis, reporting, visualizations, insights, and final deliverable creation. The ideal candidate will possess a player/coach mentality as this team matures, with the willingness and ability to roll up their sleeves and contribute in the early days, then grow in responsibility as we scale. This candidate will be comfortable diving into both structured and unstructured data, creating robust financial models and business cases, producing compelling visualizations and collateral, and leading the narrative on data-driven storytelling.

About The Role
We are looking for a Senior Manager - Advisory Services, a key role within the Advisory Services team at Innovaccer. This individual will be responsible for delivering key customer analytics (e.g., ROI models), performance analytics, and slide presentations to support multiple client pursuits and engagements. The ideal candidate has a strong desire to learn about the US healthcare system, is organized and structured, has excellent written and verbal communication skills, and is a fast learner. The role requires both analytical skills and creativity to articulate and communicate complex messages about healthcare and technology to a wide-ranging audience. You will be aligned with a Managing Director/Director in the US who will provide you direction on day-to-day work and help you learn about the company and the industry.
A Day in the Life
- Under the direction of Advisory Services leaders, engage with prospect organizations on intended business outcomes and request data assets to model potential scenarios.
- Own, digest, and interpret data in a variety of forms, from aggregated metrics in spreadsheets to unstructured formats to raw, transactional forms like medical claims.
- Own and execute the entire analytics lifecycle, leveraging data in all its available forms to produce cogent and compelling business cases, financial models, presentations, and other executive-ready final deliverables.
- Synthesize insights to inform strategic direction, roadmap creation, and opportunities.
- Couple Innovaccer's technology platform - including data, software and workflow applications, analytics, and AI - with identified insights and opportunities to create prescriptive recommendations that maximize value creation and outcomes.
- Develop findings and insights for senior leadership of prospects and clients, and for Innovaccer stakeholders, in a clear and compelling manner.
- Stay up-to-date with the latest analytics technologies and methodologies to enhance capabilities.
- Build compelling presentations, including client sales and engagement delivery decks, case studies, talk tracks, and visuals.
- Research and analyze high-priority strategic clients, industry best practices, and market intelligence, including industry mapping, customer profiling, competitive insights, and deep dives into select solution opportunities.
- Co-develop and maintain a standardized value-lever framework, segment-based pitch decks, and customer case studies for use across multiple advisory pursuits and engagements.
- Provide analytics thought partnership and data support on the design, execution, and measurement of impactful advisory services strategy initiatives.
- Collaborate across Advisory Services, Growth Strategy, Marketing, Sales, Product, and Customer Success teams and business leaders to address business questions that can be answered effectively through data-driven modeling and insights.
- Develop slide presentations for quarterly and annual reporting.
- Structure, manage, and write responses to RFPs.

What You Need
- Degree from a Tier 1 college, with a relevant degree in Finance, Economics, Statistics, Business, or Marketing.
- 3-5 years of professional experience, including experience in management consulting and/or go-to-market in a technology/software/SaaS company.
- Strong technical aptitude and fantastic storytelling skills, with a great track record of working across sales, marketing, and technology teams.
- Ability to identify, source, and include data elements to drive analytical models and outputs.
- Experience creating Excel models (identifying inputs, key considerations/variables, and relevant outputs) and PowerPoint presentations.
- Familiarity with leveraging AI tools (e.g., generative AI, AI-enhanced research tools, AI-based data analysis platforms) to enhance productivity, accelerate research, generate insights, and support creative problem-solving.
- Proactive, decisive, independent thinker, good at problem solving and conducting industry research.
- Experience making slide presentations for internal and external audiences that articulate key takeaways.
- Creative problem solver with the ability to back up ideas with requisite fact-based arguments.
- Comfortable working with multiple data sources, in both structured and unstructured formats, to frame a business opportunity and develop a structured path forward.
- Strong proficiency in Excel and PowerPoint or G-Suite.
- Willing to work in a fast-paced environment under tight deadlines.
- Strong written and verbal communication skills, as well as the ability to manage cross-functional stakeholders.
- Experience with analytics and financial modeling.
- US healthcare experience and/or a strong willingness and interest to learn this space. Specific areas of interest include:
  - Understanding of payer/provider/patient dynamics
  - Provider data strategy and architecture
  - Provider advanced analytics, AI, NLP
  - Patient experience and engagement
  - Population health and care management
  - Utilization and cost management
  - Risk and quality management
  - Population health management
  - Risk models
  - Value-based care
  - Social determinants of health

We offer competitive benefits to set you up for success in and outside of work.

Here's What We Offer
- Generous leave benefits: enjoy generous leave benefits of up to 40 days
- Parental leave: experience one of the industry's best parental leave policies to spend time with your new addition
- Sabbatical leave policy: want to focus on skill development, pursue an academic career, or just take a break?
We've got you covered.
- Health insurance: we offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury
- Pet-friendly office*: spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. (*Noida office only)
- Creche facility for children*: say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. (*India offices)

Where And How We Work
Our Noida office is situated in a posh tech space, equipped with various amenities to support our work environment. Here, we follow a five-day work schedule, allowing us to efficiently carry out our tasks and collaborate effectively within our team.

Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered.

Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details.
Posted 3 days ago
8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Business Intelligence - Manager
Location: Mumbai

About Us
StayVista is India's largest villa hospitality brand and has redefined group getaways. Our handpicked luxury villas are present in every famous holiday destination across the country. We curate unique experiences paired with top-notch hospitality, creating unforgettable stays. Here, you will be a part of our passionate team, dedicated to crafting exceptional getaways and curating one-of-a-kind homes. We are a close-knit tribe, united by a shared love for travel and on a mission to become the most loved hospitality brand in India.

Why Work With Us?
At StayVista, you're part of a community where your ideas and growth matter. We're a fast-growing team that values continuous improvement. With our skill upgrade programs, you'll keep learning and evolving, just like we do. And hey, when you're ready for a break, our villa discounts make it easy to enjoy the luxury you help create.

Your Role
As a Manager - Business Intelligence, you will lead data-driven decision-making by transforming complex datasets into strategic insights. You will optimize data pipelines, automate workflows, and integrate AI-powered solutions to enhance efficiency. Your expertise in database management, statistical analysis, and visualization will support business growth, while collaboration with leadership and cross-functional teams will drive impactful analytics strategies.

About You
- 8+ years of experience in Business Intelligence, Revenue Management, or Data Analytics, with a strong ability to turn data into actionable insights.
- Bachelor's or Master's degree in Business Analytics, Data Science, Computer Science, or a related field.
- Skilled in designing, developing, and implementing end-to-end BI solutions to improve decision-making.
- Proficient in ETL processes using SQL, Python, and R, ensuring accurate and efficient data handling.
Experienced in Google Looker Studio, Apache Superset, Power BI, and Tableau to create clear, real-time dashboards and reports. Develop, document, and support ETL mappings, database structures, and BI reports. Develop ETL using tools such as Pentaho/Talend or as per project requirements. Participate in the UAT process and ensure quick resolution of any UAT issue or data issue. Manage different environments and be responsible for proper deployment of reports/ETLs in all client environments. Interact with Business and Product teams to understand and finalize the functional requirements. Responsible for timely deliverables and quality. Skilled at analyzing industry trends and competitor data to develop effective pricing and revenue strategies. Demonstrated understanding of data warehouse concepts, ETL concepts, ETL loading strategy, data archiving, data reconciliation, ETL error handling, error logging mechanisms, and standards and best practices. Cross-functional Collaboration: Partner with Product, Marketing, Finance, and Operations to translate business requirements into analytical solutions. Key Metrics: what you will drive and achieve: Data-Driven Decision Making & Business Impact. Revenue Growth & Cost Optimization. Cross-Functional Collaboration & Leadership Impact. BI & Analytics Efficiency and AI Automation Integration.
Posted 3 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Experience: 7-9 yrs. Must have experience in AWS services such as S3, Lambda, Airflow, Glue, Athena, Lake Formation, Step Functions, etc. Experience in programming in Java and Python. Experience performing data analysis (NOT data science) on AWS platforms. Nice To Have Experience in Big Data technologies (Teradata, Snowflake, Spark, Redshift, Kafka, etc.) Experience with data management processes on AWS is a huge plus. Experience in implementing complex ETL transformations on AWS using Glue. Familiarity with relational database environments (Oracle, Teradata, etc.), leveraging databases, tables/views, stored procedures, agent jobs, etc.
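The Glue/ETL skills this posting asks for can be illustrated with a minimal, pure-Python stand-in for a Glue-style field-mapping transform (this is a sketch, not the AWS Glue API; the field names and casts are hypothetical, not from the posting):

```python
# A pure-Python sketch of an ApplyMapping-style ETL step:
# rename fields, cast types, and drop records that fail validation.
# Field names here are illustrative, not from the job posting.

def apply_mapping(records, mapping):
    """mapping: list of (source_field, target_field, cast) tuples."""
    out = []
    for rec in records:
        row = {}
        try:
            for src, tgt, cast in mapping:
                row[tgt] = cast(rec[src])
        except (KeyError, ValueError):
            continue  # skip malformed records, as an ETL job typically would
        else:
            out.append(row)
    return out

raw = [
    {"id": "1", "amt": "10.5"},
    {"id": "2", "amt": "oops"},  # bad cast -> record dropped
]
clean = apply_mapping(raw, [("id", "order_id", int), ("amt", "amount", float)])
print(clean)  # [{'order_id': 1, 'amount': 10.5}]
```

In a real Glue job the same mapping would be expressed against a DynamicFrame, but the rename/cast/drop logic is the same idea.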
Posted 3 days ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Where Data Does More. Join the Snowflake team. We are looking for people who have a strong background in data science and cloud architecture to join our AI/ML Workload Services team to create exciting new offerings and capabilities for our customers! This team within the Professional Services group will be working with customers using Snowflake to expand their use of the Data Cloud to bring data science pipelines from ideation to deployment, and beyond, using Snowflake's features and its extensive partner ecosystem. The role will be highly technical and hands-on, where you will be designing solutions based on requirements and coordinating with customer teams and, where needed, Systems Integrators. AS A SOLUTIONS ARCHITECT - AI/ML AT SNOWFLAKE, YOU WILL: Be a technical expert on all aspects of Snowflake in relation to the AI/ML workload Build and deploy ML pipelines using Snowflake features and/or Snowflake ecosystem partner tools based on customer requirements Work hands-on where needed using SQL, Python, Java and/or Scala to build POCs that demonstrate implementation techniques and best practices on Snowflake technology within the Data Science workload Follow best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own Maintain a deep understanding of competitive and complementary technologies and vendors within the AI/ML space, and how to position Snowflake in relation to them Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments Provide guidance on how to resolve customer-specific technical challenges Help other members of the Professional Services team develop their expertise Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing OUR IDEAL SOLUTION ARCHITECT - AI/ML WILL HAVE: Minimum 10 years' experience working with
customers in a pre-sales or post-sales technical role Skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos Thorough understanding of the complete Data Science life-cycle including feature engineering, model development, model deployment and model management. Strong understanding of MLOps, coupled with technologies and methodologies for deploying and monitoring models Experience and understanding of at least one public cloud platform (AWS, Azure or GCP) Experience with at least one Data Science tool such as AWS Sagemaker, AzureML, Dataiku, Datarobot, H2O, or Jupyter Notebooks Hands-on scripting experience with SQL and at least one of the following: Python, Java or Scala. Experience with libraries such as Pandas, PyTorch, TensorFlow, SciKit-Learn or similar University degree in computer science, engineering, mathematics or related fields, or equivalent experience BONUS POINTS FOR HAVING: Experience with Databricks/Apache Spark Experience implementing data pipelines using ETL tools Experience working in a Data Science role Proven success in enterprise software Vertical expertise in a core vertical such as FSI, Retail, Manufacturing, etc. Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
Posted 3 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Syniverse is the world’s most connected company. Whether we’re developing the technology that enables intelligent cars to safely react to traffic changes or freeing travelers to explore by keeping their devices online wherever they go, we believe in leading the world forward. Which is why we work with some of the world’s most recognized brands. Eight of the top 10 banks. Four of the top 5 global technology companies. Over 900 communications providers. And it’s how we’re able to provide our incredible talent with an innovative culture and great benefits. Who We're Looking For The Data Engineer I is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems or building new solutions from the ground up. This role will work with developers, architects, product managers and data analysts on data initiatives and ensure optimal data delivery with good performance and uptime metrics. Your behaviors align strongly with our values. Some Of What You'll Do Scope of the Role: Direct Reports: This is an individual contributor role with no direct reports Key Responsibilities Create, enhance, and maintain optimal data pipeline architecture and implementations. Analyze data sets to meet functional / non-functional business requirements. Identify, design, and implement data process improvements: automating processes, optimizing data delivery, etc. Build infrastructure and tools to increase data ETL velocity. Work with data and analytics experts to implement and enhance analytic product features. Provide life-cycle support to the Operations team for existing products, services, and functionality assigned to the Data Engineering team. Experience, Education, And Certifications Bachelor’s degree in Computer Science, Statistics, Informatics or related field or equivalent work experience. Software development experience is desired. Experience in data engineering is desired.
Experience in building and optimizing big data pipelines, architectures, and data sets: Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL databases, such as PostgreSQL, MySQL, etc. Experience with stream-processing systems: Flink, KSQL, Spark-Streaming, etc. Experience with programming languages, such as Java, Scala, Python, etc. Experience with cloud data engineering and development, such as AWS, etc. Additional Requirements Familiar with Agile software design processes and methodologies. Good analytic skills related to working with structured and unstructured datasets. Knowledge of message queuing, stream processing and scalable big data stores. Ownership/accountability for tasks/projects with on time and quality deliveries. Good verbal and written communication skills. Teamwork with independent design and development habits. Work with a sense of urgency and positive attitude. Why You Should Join Us Join us as we write a new chapter, guided by world-class leadership. Come be a part of an exciting and growing organization where we offer a competitive total compensation, flexible/remote work and with a leadership team committed to fostering an inclusive, collaborative, and transparent organizational culture. At Syniverse connectedness is at the core of our business. We believe diversity, equity, and inclusion among our employees is crucial to our success as a global company as we seek to recruit, develop, and retain the most talented people who want to help us connect the world. Know someone at Syniverse? Be sure to have them submit you as a referral prior to applying for this position.
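The stream-processing systems this posting names (Flink, KSQL, Spark-Streaming) all build on windowed aggregation over keyed events. As a minimal, pure-Python sketch of the core idea (the event data and window size are illustrative, not from the posting):

```python
# A tumbling-window count: each event is assigned to a fixed, non-overlapping
# time window, and counts are kept per (window, key) - the same aggregation a
# stream processor such as Flink or Spark-Streaming performs continuously.
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """events: iterable of (timestamp_secs, key); returns {(window_start, key): count}."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # floor to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(1, "sms"), (3, "sms"), (7, "voice"), (11, "sms")]
print(tumbling_window_counts(events, 5))
# {(0, 'sms'): 2, (5, 'voice'): 1, (10, 'sms'): 1}
```

Real engines add what this sketch omits: out-of-order handling via watermarks, state checkpointing, and incremental emission rather than a final dict.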
Posted 3 days ago
2.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Proxsoft Technologies LLC is a US-registered tech consultancy delivering cutting-edge solutions in Power BI, Power Platform, AI automation, and custom ERP reporting. We specialize in serving construction, infrastructure, and enterprise clients with deep expertise in systems like Viewpoint Vista, Spectrum, Procore, Acumatica, and Microsoft Dynamics. Role Overview: We are looking for a talented Power BI Developer to join our fast-growing team. You'll work closely with data engineers, ERP analysts, and business users to build interactive, insightful, and scalable dashboards that drive decisions for Fortune 500 clients and fast-scaling businesses. Key Responsibilities: Using best practices, design, develop, and deploy Power BI dashboards, paginated reports, and embedded analytics. Connect, model, and transform data from SQL Server, Excel, SharePoint, and cloud data sources. Collaborate with clients to gather business requirements and translate them into visualizations. Build optimized DAX measures, KPIs, bookmarks, drill-throughs, and dynamic visuals. Work on data modeling, relationship architecture, and performance tuning. Integrate Power BI with Power Automate workflows and Power Apps where needed. Document technical requirements, data dictionaries, and end-user guides. Required Skills: Strong in data modeling (star schema, snowflake), ETL, and relational data concepts. 2+ years of hands-on experience with Power BI Desktop, Power BI Service, and DAX. Proficiency in T-SQL, views, stored procedures, and performance optimization. Experience working with ERP datasets (Viewpoint, Acumatica, Procore, etc.) is a huge plus. Understanding of row-level security (RLS) and workspace governance. Exposure to Power Automate, Power Apps, or SSRS / Paginated Reports is a bonus. Nice to Have: Familiarity with Azure Synapse, Dataflows, Power Query (M). Knowledge of embedding Power BI in web apps or portals. Microsoft certification in DA-100 / PL-300.
Experience with construction / engineering clients or financial dashboards. What We Offer: Exposure to real enterprise-grade datasets and ERP integrations. Flexible work hours (client projects follow US time zones). Opportunity to work on cutting-edge projects using Power Platform + AI. Rapid career growth with direct mentorship from senior architects and CTO. Paid tools, learning access, and certifications.
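The star-schema modeling this role requires boils down to a fact table keyed into dimension tables, with measures aggregated per dimension attribute. A minimal sketch in plain Python (the tables, column names, and values are hypothetical; in Power BI this would be a model relationship plus a DAX SUM measure sliced by a dimension column):

```python
# Star schema in miniature: one dimension table, one fact table, and a
# "measure" computed by looking each fact row up in the dimension.
from collections import defaultdict

dim_project = {1: {"client": "Acme"}, 2: {"client": "Globex"}}  # dimension
fact_costs = [                                                  # fact table
    {"project_id": 1, "cost": 1200.0},
    {"project_id": 1, "cost": 300.0},
    {"project_id": 2, "cost": 450.0},
]

def total_cost_by_client(facts, dim):
    """Sum the fact measure grouped by a dimension attribute."""
    totals = defaultdict(float)
    for row in facts:
        client = dim[row["project_id"]]["client"]  # star-schema key lookup
        totals[client] += row["cost"]
    return dict(totals)

print(total_cost_by_client(fact_costs, dim_project))
# {'Acme': 1500.0, 'Globex': 450.0}
```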
Posted 3 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Microsoft Azure Data Services Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate training and development opportunities for team members to enhance their skills. - Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft Azure Data Services. - Strong understanding of cloud computing principles and architecture. - Experience with application lifecycle management and deployment strategies. - Familiarity with data integration and ETL processes. - Knowledge of security best practices in cloud environments. Additional Information: - The candidate should have minimum 5 years of experience in Microsoft Azure Data Services. - This position is based at our Pune office. - A 15 years full time education is required.
Posted 3 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Microsoft Azure Data Services Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also be responsible for maintaining communication with stakeholders to provide updates and gather feedback, ensuring that the applications meet the required specifications and quality standards. Your role will be pivotal in driving the success of the projects you oversee, fostering a collaborative environment, and mentoring team members to enhance their skills and performance. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate training sessions to enhance team capabilities and knowledge sharing. - Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft Azure Data Services. - Strong understanding of cloud computing principles and architecture. - Experience with data integration and ETL processes. - Familiarity with application development frameworks and methodologies. - Ability to troubleshoot and resolve technical issues efficiently. Additional Information: - The candidate should have minimum 5 years of experience in Microsoft Azure Data Services.
- This position is based in Pune. - A 15 years full time education is required.
Posted 3 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Microsoft SQL Server, Firewall, EPO Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the needs of stakeholders effectively. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft SQL Server. - Strong understanding of database design and management. - Experience with performance tuning and optimization of SQL queries. - Familiarity with data integration and ETL processes. - Ability to troubleshoot and resolve database-related issues. Additional Information: - The candidate should have minimum 5 years of experience in Microsoft SQL Server. - This position is based in Pune. - A 15 years full time education is required.
Posted 3 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
🔍 Job Title: Treasure Data Engineer (6+ Years Experience) 📍 Location: PAN India (Hybrid as per project needs) 🕒 Experience Required: 6+ Years 📢 We’re Hiring! We are looking for an experienced Treasure Data Engineer with 6+ years of relevant experience in building, maintaining, and optimizing large-scale customer data platforms (CDPs) using Treasure Data. This is an exciting opportunity to work with a leading organization on cutting-edge data engineering and marketing tech solutions. ✅ Required Skills: 6+ years of experience in data engineering, with at least 2+ years of hands-on experience with Treasure Data/CDPs Strong knowledge of SQL, Python/JavaScript, and data integration best practices Experience working with Treasure Workflow, Data Connectors, Segmentations, and Audience Building Experience integrating data from various sources like Salesforce, Google Analytics, Adobe, etc. Knowledge of ETL pipelines, data quality, and customer data activation Familiarity with cloud platforms (AWS/GCP) and marketing automation tools is a plus 🎯 Responsibilities: Design and implement workflows, pipelines, and data transformations in Treasure Data Collaborate with cross-functional teams to integrate customer data sources Optimize performance of existing workflows and queries Support end-users in audience building, data analysis, and campaign execution Ensure data accuracy, security, and compliance
Posted 3 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Work Level : Individual Core : Responsible Leadership : Team Alignment Industry Type : Information Technology Function : Database Administrator Key Skills : mSQL, SQL Writing, PLSQL Education : Graduate Note: This is a requirement for one of the Workassist Hiring Partners. Primary Responsibility: Collect, clean, and analyze data from various sources. Assist in creating dashboards, reports, and visualizations. We are looking for a SQL Developer Intern to join our team remotely. As an intern, you will work with our database team to design, optimize, and maintain databases while gaining hands-on experience in SQL development. This is a great opportunity for someone eager to build a strong foundation in database management and data analysis. Responsibilities Write, optimize, and maintain SQL queries, stored procedures, and functions. Assist in designing and managing relational databases. Perform data extraction, transformation, and loading (ETL) tasks. Ensure database integrity, security, and performance. Work with developers to integrate databases into applications. Support data analysis and reporting by writing complex queries. Document database structures, processes, and best practices. Requirements Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field. Strong understanding of SQL and relational database concepts. Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle. Ability to write efficient and optimized SQL queries. Basic knowledge of indexing, stored procedures, and triggers. Understanding of database normalization and design principles. Good analytical and problem-solving skills. Ability to work independently and in a team in a remote setting. Company Description Workassist is an online recruitment and employment solution platform based in Lucknow, India.
We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on the skills, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
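The intern responsibilities above (write queries, perform ETL tasks, feed reporting) can be sketched end-to-end with Python's built-in sqlite3 module standing in for a production database; the table and column names are illustrative, not from the posting:

```python
# Extract-transform-load in miniature: raw rows go in, an aggregated
# reporting table comes out - the kind of summary a dashboard query reads.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100.0), ("north", 50.0), ("south", 75.0)],
)

# Transform + load: aggregate into a summary table in one SQL statement.
conn.execute(
    """CREATE TABLE sales_summary AS
       SELECT region, SUM(amount) AS total
       FROM sales GROUP BY region ORDER BY region"""
)
print(conn.execute("SELECT * FROM sales_summary").fetchall())
# [('north', 150.0), ('south', 75.0)]
```

The same pattern scales up directly to MySQL/PostgreSQL/SQL Server with a driver swap.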
Posted 3 days ago
0.0 - 5.0 years
0 - 0 Lacs
Chennai, Tamil Nadu
On-site
Job Description – ODI Developer Location: Equitas Office, Backside Vikatan Office, 757, Vasan Ave, Anna Salai, Thousand Lights, Chennai, Tamil Nadu 600002 Job Type: Full-Time Experience: 5+ years Job Summary: We are hiring a Lead Data Engineer to architect and lead enterprise data integration initiatives. This role requires deep technical expertise in data engineering and leadership experience. Familiarity with Oracle Data Integrator (ODI) is preferred, especially in environments using the Oracle stack. Key Responsibilities: Architect and oversee the implementation of scalable, reliable data pipelines. Define standards and best practices for data integration and ETL development. Lead a team of data engineers and mentor junior staff. Collaborate with stakeholders to understand business data needs and translate them into technical solutions. Ensure adherence to data governance, security, and compliance requirements. Requirements: 5+ years of experience in data engineering, including team leadership roles. Deep knowledge of ETL architecture and data integration frameworks. Experience with any ETL tool (ODI is mandatory). Strong SQL, data modeling, and performance tuning skills. Experience with cloud data platforms and modern data architectures. Excellent leadership, communication, and stakeholder management skills. Knowledge of real-time or near-real-time data streaming (e.g., Kafka). Job Type: Full-time Pay: ₹12,817.62 - ₹60,073.88 per month Benefits: Health insurance Provident Fund Experience: 5 years (Preferred) Location: Chennai, Tamil Nadu (Required) Work Location: In person
Posted 3 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Work Level : Individual Core : Responsible Leadership : Team Alignment Industry Type : Information Technology Function : Database Administrator Key Skills : mSQL, SQL Writing, PLSQL Education : Graduate Note: This is a requirement for one of the Workassist Hiring Partners. 🎯 Role Overview Write, optimize, and maintain SQL queries, stored procedures, and functions. Assist in designing and managing relational databases. Perform data extraction, transformation, and loading (ETL) tasks. Ensure database integrity, security, and performance. Work with developers to integrate databases into applications. Support data analysis and reporting by writing complex queries. Document database structures, processes, and best practices. Company Description Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on the skills, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 3 days ago
6.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Title: Data Engineer (GCP) Employment Type: Full-time Work Mode: Work from Office Experience Required: 6+ Years Location: Ahmedabad / Gurugram Timing: General About the Role We are looking for an experienced Data Engineer to design, build, and optimize data systems. The ideal candidate will have strong expertise in Python, SQL, and cloud platforms, along with a passion for solving complex data challenges. Key Responsibilities Provide business analytics support to management. Analyze business results and design data collection studies. Build and maintain data pipelines and ETL processes using Python. Collaborate with analysts and data scientists to ensure data quality. Optimize database performance (indexing, partitioning, query optimization). Implement data governance and security measures. Monitor and troubleshoot data pipelines, ensuring validation and accuracy. Maintain documentation for workflows and processes. Skills Required Proficiency in Python and SQL. Experience with relational databases (MySQL, PostgreSQL, SQL Server). Knowledge of data modeling, data warehousing, and data architecture. Experience with cloud platforms (GCP). Proficiency in Google Cloud Platform (BigQuery, GCS). Familiarity with version control (Git). What We Offer Competitive salary and industry-standard benefits. Opportunity to earn stock options in the near future. Career growth in cloud technologies with certifications in GCP. A chance to be part of a fast-growing and innovative team.
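The "indexing, partitioning, query optimization" responsibility above is easy to see concretely: most relational databases expose the query plan, and adding an index changes a full scan into an index search. A small sketch using Python's built-in sqlite3 as a stand-in for the posting's production databases (table and column names are illustrative):

```python
# Show an index changing the query plan: EXPLAIN QUERY PLAN reports a table
# scan before the index exists and an index search afterwards.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(i % 10, i) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable detail in column 3
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT * FROM events WHERE user_id = 3"
before = plan(q)  # expect a scan of the whole table
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(q)   # expect a search using idx_events_user

print(before)
print(after)
```

MySQL/PostgreSQL offer the same workflow via `EXPLAIN`; BigQuery takes a different route (partitioning and clustering rather than user-defined indexes), but the habit of reading the plan carries over.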
Posted 3 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Client: Our Client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with a revenue of $1.8B and 35,000+ associates worldwide, it is a digital engineering and IT services company helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media. Our Client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia. Job Title : PYTHON DEVELOPER WITH SQL AND ETL Key Skills : Python with SQL, PySpark, Databricks, ETL. Job Locations : Hyderabad, Pune, Bengaluru Experience : 6-8 years Education Qualification : Any Graduation. Work Mode : Hybrid. Employment Type : Contract. Notice Period : Immediate Job responsibilities: The candidate should have 4+ years of experience in Python development with SQL. Understanding of PySpark and Databricks. Passionate about ETL development and problem solving. Quickly learn new data tools and ideas. Proficient in skills - Python with SQL, PySpark, Databricks, ETL; AWS knowledge would be an added advantage. The candidate should be well aware of Data ways of working. Knowledge of different application development and an understanding of data backgrounds.
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
punjab
On-site
The Associate Manager - BBS Analytics will be responsible for building Tableau Analytics Dashboards for multiple Global Internal Financial Controls Metrics. You will work with teams within Bunge Business Services to enable full visibility of Bunge's Internal Financial Controls. Your primary task will be to transform business and process data into actionable insights for business disclosures, decisions, and opportunities using data engineering and visualization tools, with a focus on expertise in visualization tool Tableau and Oracle SQL. You will be responsible for designing and delivering various reports, standard Tableau dashboards, ad hoc reports, templates, scorecards, and metrics to drive insights focused on business issues and priorities. Additionally, you will implement and automate business needs on the Online Business Intelligence tool for real-time Control effectiveness and efficiency analytics. It will be crucial for you to understand all aspects of Bunge's Control Metrics, especially reporting and compliance needs. You will collaborate with various stakeholders both internally and externally, with a strong emphasis on building partnerships and appropriately influencing to gain commitment. In this role, you will drive results through high standards, focus on key priorities, organization, and preparing others for change. Your technical skills should encompass a strong working knowledge of Accounting, ESG, Procurement, Agri contracts, SAP FICO/SD/MM, with business process knowledge of Finance Operations, business intelligence/reporting, data analysis and visualization. Additionally, you should have detailed knowledge and experience in BI, Reporting, Analysis, Data Visualization, and Visual Storytelling. The ability to make complex data science models and statistical inferences information clear and actionable will be essential. 
You should have an extensive understanding of controls processes, performance metrics, and governance, with significant experience driving large projects to successful completion. Being an Agile practitioner and having Design Thinking expertise will be advantageous. Strong communication and presentation skills, collaboration skills, and the integrity to hold yourself and others accountable for delivering against commitments are important attributes for this role.

You will lead client engagements and oversee workstreams related to PTP, OTC, and RTR. Additionally, you will develop solutions to customer challenges and identify gaps and areas of improvement for dashboard building. Your responsibilities will include gathering requirements from functional stakeholders, conducting UAT with business users, working with the Ops team to deploy use cases in production, and engaging with the operations team to streamline and improve the technical environment, access provisioning, and reporting processes. Managing engagement economics, project resources, team utilization, and the delivery of high-quality deliverables will also be part of your role. You should have strong competency in Tableau, Oracle, Python, R, MS Excel, and PowerPoint, along with a working knowledge of other enabling tools for a business services command center. Competencies in data analytics and big data tools and platforms will be beneficial. Relevant experience of 4 to 8 years with a Master's in Business Analytics, Finance, ESG, or Data Science from a premier institute/university is preferred.

Bunge (NYSE: BG) is a world leader in sourcing, processing, and supplying oilseed and grain products and ingredients. Founded in 1818, Bunge's expansive network feeds and fuels a growing world, creating sustainable products and opportunities for more than 70,000 farmers and the consumers they serve across the globe. The company is headquartered in St. Louis, Missouri and has 25,000 employees worldwide who stand behind more than 350 port terminals, oilseed processing plants, grain facilities, and food and ingredient production and packaging facilities around the world.
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
Maharashtra
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

This role requires 4-5 years of experience in ETL testing and data validation. You should be experienced in ETL testing, requirement analysis, test planning, data validation, and defect tracking. Proficiency in SQL is required for writing complex queries to validate data. Additionally, knowledge of ETL tools, experience with data warehousing concepts and methodologies, and strong analytical and problem-solving skills are desirable. Exposure to Agile/Scrum methodologies and experience with cloud-based test execution in AWS, PySpark, Databricks, or similar platforms would be beneficial.

At Capgemini, you will have the opportunity to make a difference for the world's leading businesses or for society. You will receive the support needed to shape your career in a way that works for you. When the future doesn't look as bright as you'd like, you will have the opportunity to make a change and rewrite it. By joining Capgemini, you become part of a diverse collective of free-thinkers, entrepreneurs, and experts all working together to unleash human energy through technology for an inclusive and sustainable future. Capgemini values its people and offers extensive Learning & Development programs for career growth. The work environment is inclusive, safe, healthy, and flexible to bring out the best in you. You can also take an active role in Corporate Social Responsibility and Sustainability initiatives to make a positive social change and build a better world.

Capgemini is a global business and technology transformation partner with over 55 years of heritage, trusted by clients to unlock the value of technology and address their business needs. The company has a diverse group of 340,000 team members in more than 50 countries. Capgemini delivers end-to-end services and solutions leveraging AI, cloud, data, and deep industry expertise to create tangible impact for enterprises and society. The Group reported 2023 global revenues of €22.5 billion. Skills required for this role include SQL, ETL, Python, and Scala.
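The kind of SQL-based data validation this role describes, comparing a loaded target table against its source, can be sketched as follows. This is a minimal illustration, not an actual test from the role: table and column names are hypothetical, and an in-memory SQLite database stands in for the warehouse.

```python
import sqlite3

def validate_load(conn, source_table, target_table, key_col):
    """Run basic ETL validation checks: row counts, duplicate keys, null keys."""
    cur = conn.cursor()
    results = {}
    # Row-count reconciliation between source and target.
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    results["row_count_match"] = src_count == tgt_count
    # Duplicate-key check on the target.
    dupes = cur.execute(
        f"SELECT COUNT(*) FROM (SELECT {key_col} FROM {target_table} "
        f"GROUP BY {key_col} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    results["no_duplicate_keys"] = dupes == 0
    # Null-key check on the target.
    nulls = cur.execute(
        f"SELECT COUNT(*) FROM {target_table} WHERE {key_col} IS NULL"
    ).fetchone()[0]
    results["no_null_keys"] = nulls == 0
    return results

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_orders (order_id INTEGER, amount REAL);
        CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
        INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0), (2, 20.0);
    """)
    print(validate_load(conn, "src_orders", "stg_orders", "order_id"))
```

In practice the same queries would run against the actual warehouse (Redshift, Databricks, etc.) and the results would feed a defect-tracking workflow rather than a print statement.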
Posted 3 days ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The Applications Development Senior Programmer Analyst position is an intermediate-level role in which you will participate in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities. Your responsibilities will include conducting tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will also be required to monitor and control all phases of the development process, provide user and operational support on applications to business users, and recommend and develop security measures in post-implementation analysis. As the Applications Development Senior Programmer Analyst, you will utilize in-depth specialty knowledge of applications development to analyze complex problems and issues, evaluate business and system processes, recommend advanced programming solutions, and ensure that essential procedures are followed. Additionally, you will serve as an advisor or coach to new or lower-level analysts, operate with a limited level of direct supervision, and act as a subject matter expert to senior stakeholders and other team members. To qualify for this role, you should have 8-12 years of relevant experience in systems analysis and programming of software applications, experience managing and implementing successful projects, and working knowledge of consulting/project management techniques and methods. You should also have the ability to work under pressure, manage deadlines, and adapt to unexpected changes in expectations or requirements. A Bachelor's degree or equivalent experience is required for this position.
In addition to the general job description, the ideal candidate should have 8 to 12 years of application development experience through the full lifecycle, with expertise in UI architecture patterns such as Micro Frontend and NX. Proficiency in Core Java/J2EE application development, data structures, algorithms, Hadoop, the MapReduce framework, Spark, YARN, and other relevant technologies is essential. Experience with the big data Spark ecosystem, ETL, BI tools, agile environments, test-driven development, and optimizing software solutions for performance and stability is also preferred. This job description provides an overview of the responsibilities and qualifications for the Applications Development Senior Programmer Analyst role. Other job-related duties may be assigned as required.
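The MapReduce framework named in the qualifications above splits a computation into a map phase (emit key-value pairs), a shuffle (group by key), and a reduce phase (aggregate each group). A toy, single-process word-count sketch of that model, illustrative only, since real workloads run distributed on Hadoop or Spark, might look like:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Map: emit a (word, 1) pair for each word in a line."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    """Shuffle: group emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

def word_count(lines):
    return reduce_phase(shuffle(chain.from_iterable(map_phase(l) for l in lines)))

print(word_count(["big data spark", "spark yarn spark"]))
# {'big': 1, 'data': 1, 'spark': 3, 'yarn': 1}
```

The same three-stage structure underlies Spark's `map`/`groupByKey`/`reduceByKey` operations, which is why MapReduce fundamentals remain a common interview topic for such roles.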
Posted 3 days ago
5.0 years
0 Lacs
Haryana, India
On-site
Senior Data Engineer (C11) Analytics & Information Management (AIM), Gurugram Excited to grow your career? We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position you see is right for you, we encourage you to apply! Our people make all the difference in our success. We are seeking a highly experienced and strategic Officer – Sr. Data Engineer for the Data/Information Management team. The ideal candidate will be responsible for the development and implementation of data analytics solutions to support key business objectives for Legal Operations as part of the Chief Operating Office (COO). This role requires a proven track record of implementing optimized data processes/platforms, delivering impactful insights, and fostering a data-driven culture. ------------------------------------------------------ The Data/Information Analyst accomplishes results by contributing significantly to the bank's success, leveraging data engineering and solution design skills within a specialized domain. Integrates subject matter and industry expertise within a defined area. Contributes to standards around which others will operate. Requires an in-depth understanding of how areas collectively integrate within the sub-function, as well as coordination with and contribution to the objectives of the entire function. Requires basic commercial awareness. Developed communication and diplomacy skills are required in order to guide, influence, and convince others, in particular colleagues in other areas and occasional external customers. Has responsibility for the volume, quality, timeliness, and delivery of end results of an area. Responsibilities: Incumbents will be primarily responsible for supporting Business Execution activities of the Chief Operating Office and implementing data engineering solutions to manage banking operations.
- Establish monitoring routines, scorecards, and escalation workflows.
- Oversee the Data Strategy, Smart Automation, Insight Generation, Data Quality, and Reporting activities using proven analytical techniques.
- Document data requirements and handle data collection, processing, and cleaning, which may include process automation/optimization and data visualization techniques.
- Enable proactive issue detection, escalation workflows, and alignment with firmwide data-related policies; implement a governance framework with clear stewardship roles and data quality controls.
- Interface between business and technology partners to digitize data collection, including performance generation and validation rules for banking operations.
- Build the data strategy by identifying all relevant product processors; create the data lake, data pipelines, governance, and reporting.
- Communicate findings and recommendations to senior management.
- Stay current with the latest trends and technologies in analytics.
- Ensure compliance with data governance policies and regulatory requirements.
- Set up a governance operating framework to operationalize data domains and identify CDEs and data quality rules, in line with Citi Data Governance Policies and firmwide Chief Data Office expectations.
- Work with large and complex data sets (both internal and external) to evaluate, recommend, and support the implementation of business strategies, such as a centralized data repository with standardized definitions and scalable data pipelines.
- Identify and compile data sets using a variety of tools (e.g. SQL, Access) to help predict, improve, and measure the success of key business outcomes.
- Implement rule-based data quality checks across critical data points; automate alerts for breaks and publish periodic quality reports.

Incumbents in this role may often be referred to as Data Analyst.
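The rule-based data quality checks and break alerts described above might be sketched as follows. This is a minimal illustration under assumed inputs: the record fields (`amount`, `currency`) and rules are hypothetical, not Citi's actual critical data elements, and the alert here is just a print.

```python
def run_quality_checks(records, rules):
    """Apply each named rule function to each record; return a list of breaks."""
    breaks = []
    for i, record in enumerate(records):
        for rule_name, rule in rules.items():
            if not rule(record):
                breaks.append({"row": i, "rule": rule_name})
    return breaks

def alert_on_breaks(breaks, notify=print):
    """Raise an alert (here: print) whenever any rule is broken."""
    if breaks:
        notify(f"DQ ALERT: {len(breaks)} break(s) detected: {breaks}")
    return bool(breaks)

# Hypothetical rules for a transactions feed's critical data points.
rules = {
    "amount_positive": lambda r: r.get("amount", 0) > 0,
    "currency_present": lambda r: bool(r.get("currency")),
}

records = [
    {"amount": 120.5, "currency": "USD"},
    {"amount": -3.0, "currency": "USD"},   # breaks amount_positive
    {"amount": 40.0, "currency": ""},      # breaks currency_present
]
breaks = run_quality_checks(records, rules)
alert_on_breaks(breaks)
```

In a production pipeline the `notify` hook would post to an escalation workflow (email, ticketing, dashboard) and the accumulated breaks would feed the periodic quality reports mentioned above.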
- Develop and execute the analytics strategy: data ingestion and the centralization of reporting and insights; ensure consistency, lineage tracking, and audit readiness across legal reporting.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency, as well as effectively supervising the activity of others and creating accountability with those who fail to maintain these standards.
- Work as a senior member of a team of data engineering professionals, collaborating to deliver on organizational priorities.

Qualifications:

- 5+ years of experience in Business Transformation Solution Design roles, with proficiency in tools/technologies such as Python, PySpark, Tableau, MicroStrategy, and SQL.
- Strong understanding of data transformation: data strategy, data architecture, data tracing and lineage (the ability to trace data lineage from source systems to the data warehouse to reports and dashboards), scalable data flow design and standardization, platform integration, ETL, and smart automation.
- Conceptual, logical, and physical data modeling expertise, with proficiency in relational and dimensional data modeling techniques.
- Ability and experience in designing data warehouses, integrated data marts, and optimized reporting schemas that cater to multiple BI tools.
- Database management and optimization.
- Expertise in database performance tuning and optimization for data enrichment and integration, reporting, and dashboarding.
- Strong understanding of data platforms and ecosystems, with the ability to establish a scalable data management framework: data provisioning, process optimization, actionable insights, and visualization techniques using Tableau.
- Solution architect with a proven ability to translate complex data flows into automated, optimized solutions; ability to leverage data analytics tools and techniques to solve analytics problems for organizational needs.
- Experience developing and deploying AI solutions in partnership with Tech and Business.
- Experience with banking operations (e.g., expense analytics, movement of funds, cash flow management, fraud analytics, ROI).
- Knowledge of regulatory requirements related to data privacy and security.
- Experience interacting with senior stakeholders across the organization to manage end-to-end conceptualization and implementation of data strategies: standardizing data structures and identifying and removing redundancies to optimize data feeds.
- AI/Gen AI proficiency and thought leadership in financial/business analysis and/or credit/risk analysis, with the ability to impact key business drivers via a disciplined analytic process.
- Demonstrated analytics thought leadership and project planning capabilities.
- In-depth understanding of the various financial services business models; expert knowledge of advanced statistical techniques and how to apply them to drive substantial business results.
- Creative problem-solving skills.

Education: Bachelor's/University degree in STEM; Master's degree preferred.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above.
------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Job Level: C11 ------------------------------------------------------ Job Family Group: Decision Management ------------------------------------------------------ Job Family: Data/Information Management ------------------------------------------------------ Other Relevant Skills MicroStrategy, Python (Programming Language), Structured Query Language (SQL), Tableau (Software). ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 3 days ago