7.0 - 10.0 years
9 - 13 Lacs
Pune
Work from Office
Job Requirements

Why work for us
Alkegen brings together two of the world's leading specialty materials companies to create one new, innovation-driven leader focused on battery technologies, filtration media, and specialty insulation and sealing materials. Through global reach and breakthrough inventions, we are delivering products that enable the world to breathe easier, live greener, and go further than ever before. With over 60 manufacturing facilities and a global workforce of over 9,000 of the industry's most experienced talent, including insulation and filtration experts, Alkegen is uniquely positioned to help customers impact the environment in meaningful ways. Alkegen offers a range of dynamic career opportunities with a global reach. From production operators to engineers, technicians to specialists, sales to leadership, we are always looking for top talent ready to bring their best. Come grow with us!

UI/UX & Visualization Leadership
- Design and deliver intuitive, interactive, and visually compelling dashboards and analytics solutions.
- Champion user-centered design principles and best practices across the data analytics lifecycle.
- Ensure consistency, usability, and accessibility in all visual outputs, with a strong focus on enriching the user experience.

Stakeholder Engagement
- Collaborate with business stakeholders to gather requirements, define KPIs, and translate them into actionable visualizations.
- Present insights and design concepts effectively to both technical and non-technical audiences.

Business Intelligence & Data Tools
- Develop and optimize dashboards using Tableau, Power BI, and SQL.
- Ensure performance, scalability, and maintainability of BI solutions.

Project & Prioritization Management
- Manage multiple projects with competing priorities, ensuring timely delivery and alignment with business goals.
- Apply Agile/Scrum methodologies and CI/CD practices to streamline development workflows.
Data Architecture & Engineering Collaboration
- Work closely with data engineers and architects to align UI/UX design with data models and pipelines.
- Collaborate with the data operations team to develop the required data marts, and with the ERP and CRM teams to collect input mappings.
- Understand and apply concepts of Azure Data Factory (ADF), data modeling, and data architecture.

Storytelling & Business Acumen
- Translate complex data into compelling narratives that drive strategic decisions.
- Demonstrate strong business acumen and a deep understanding of enterprise KPIs and metrics.

People Management & Leadership
- Lead, mentor, and inspire a team of UI/UX designers and data visualization specialists.
- Foster a collaborative, innovative, and high-performance team culture.
- Conduct performance reviews, provide constructive feedback, and support career development.
- Coordinate with cross-functional teams to ensure alignment and effective communication.

Required Skills & Qualifications:
- 7-10 years of experience in UI/UX design within data analytics or business intelligence environments.
- Proficiency in Tableau, Power BI, and SQL.
- Strong understanding of CI/CD, DevOps, and Scrum methodologies.
- Experience with Azure Data Factory (ADF) and modern data architecture principles.
- Expertise in data modeling, ETL processes, and data visualization best practices.
- Excellent communication, storytelling, and stakeholder management skills.
- Demonstrated experience in leading teams, managing performance, and fostering talent development.

Preferred Qualifications:
- Certification in Tableau, Power BI, or Azure Data Services.
- Experience with cloud-based analytics platforms (Azure, AWS, GCP).
- Familiarity with Figma, Adobe XD, or other UI/UX design tools.

At Alkegen, we strive every day to help people, ALL PEOPLE, breathe easier, live greener, and go further than ever before. We believe that diversity and inclusion are central to this mission and to our impact.
Our diverse and inclusive culture drives our growth and innovation, and we nurture it by actively embracing our differences and using our varied perspectives to solve the complex challenges facing our changing and diverse world. Employment selection and related decisions are made without regard to sex, race, ethnicity, nation of origin, religion, color, gender identity and expression, age, disability, education, opinions, culture, languages spoken, veteran status, or any other protected class.
Posted 1 week ago
4.0 - 9.0 years
0 - 0 Lacs
Hyderabad, Chennai
Hybrid
Job Description: Design, develop, and maintain data pipelines and ETL processes using AWS and Snowflake. Implement data transformation workflows using DBT (Data Build Tool). Write efficient, reusable, and reliable code in Python. Optimize and tune data solutions for performance and scalability. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Ensure data quality and integrity through rigorous testing and validation. Stay updated with the latest industry trends and technologies in data engineering.

Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven experience as a Data Engineer or similar role. Strong proficiency in AWS and Snowflake. Expertise in DBT and Python programming. Experience with data modeling, ETL processes, and data warehousing. Familiarity with cloud platforms and services. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities.
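As a rough illustration of the transform-and-validate workflow this role describes, the sketch below deduplicates and checks a batch of records in plain Python. The record fields and checks are invented for the example; in practice the transform would typically be a DBT model running inside Snowflake rather than application code.

```python
def transform(records):
    """Deduplicate raw records by id, keeping the latest version per id,
    and normalize the amount field to a float."""
    latest = {}
    for rec in records:
        rid = rec["id"]
        # Keep the record with the highest updated_at per id.
        if rid not in latest or rec["updated_at"] > latest[rid]["updated_at"]:
            latest[rid] = rec
    return [
        {"id": r["id"], "amount": float(r["amount"]), "updated_at": r["updated_at"]}
        for r in latest.values()
    ]

def validate(rows):
    """Basic data-quality checks: unique keys and non-negative amounts."""
    ids = [r["id"] for r in rows]
    assert len(ids) == len(set(ids)), "duplicate keys after transform"
    assert all(r["amount"] >= 0 for r in rows), "negative amount found"
    return rows

raw = [
    {"id": 1, "amount": "10.5", "updated_at": "2024-01-01"},
    {"id": 1, "amount": "12.0", "updated_at": "2024-01-02"},  # newer duplicate
    {"id": 2, "amount": "3.0", "updated_at": "2024-01-01"},
]
clean = validate(transform(raw))
```

Running the validation step on every load, rather than only at development time, is what turns "rigorous testing" into an ongoing data-quality gate.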
Posted 1 week ago
7.0 - 12.0 years
4 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
LWC Developer/MHE/Hemachandran

We are seeking a highly skilled and experienced Senior Salesforce Developer with deep expertise in Lightning Web Components (LWC). The ideal candidate will have strong hands-on experience in developing scalable Salesforce solutions, leveraging Apex, LWC, and modern Salesforce best practices. You will work closely with business analysts, architects, and stakeholders to deliver high-impact CRM capabilities that drive business value.

Responsibilities:
- Design, develop, test, and deploy high-quality LWC-based solutions on the Salesforce platform.
- Build reusable components and front-end libraries using LWC.
- Write Apex classes, triggers, and test classes to support business logic and integrations.
- Integrate Salesforce with external systems using REST/SOAP APIs and middleware platforms (e.g., MuleSoft).
- Collaborate with functional consultants, QA engineers, and admins to understand business requirements.
- Participate in code reviews, design discussions, and agile ceremonies.
- Optimize application performance and ensure security best practices.
- Stay up to date with the latest Salesforce releases and features.

Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience as a Salesforce Developer.
- 3+ years specifically working with Lightning Web Components (LWC).
- Strong proficiency in Apex, SOQL, SOSL, Visualforce, and JavaScript.
- Experience with the Salesforce Data Model, Security Model, and Governor Limits.
- Hands-on experience with CI/CD tools (e.g., Git, Jenkins, SFDX).
- Experience integrating Salesforce with third-party platforms using APIs.
- Salesforce Platform Developer I certification (Developer II preferred).

Apex, Integration, LWC
Posted 1 week ago
3.0 - 5.0 years
4 - 5 Lacs
Mumbai
Work from Office
Jul 3, 2025
Location: Mumbai
Designation: Analyst
Entity: Deloitte Touche Tohmatsu India LLP

Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose, and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The team
Deloitte helps organizations prevent cyberattacks and protect valuable assets. We believe in being secure, vigilant, and resilient, not only by looking at how to prevent and respond to attacks, but at how to manage cyber risk in a way that allows you to unleash new opportunities. Embed cyber risk at the start of strategy development for more effective management of information and technology risks.

Your work profile
Develop the strategic narrative to drive the direction, development, and standards for a persona-based cyber-risk reporting and visualization capability for a large organization, with the objective of making leadership (C-level) aware of the cyber security posture so they can take informed decisions. Likewise, cater to personas such as Audit, IT, and cyber operations teams. Interlock with various stakeholders to understand requirements, and translate them into storytelling-based visual representations using reports and dashboards. Guide and direct the delivery of a cohesive, end-to-end visual experience on dashboards, including simplifying quantifiable cyber-security information for various technical and non-technical personas. Understand the perspectives of users and embed principles of visual design in the dashboard development process, with a focus on persona-specific insight requirements and the actions users take based on those insights.
Collaborate with users, the delivery team, and key stakeholders to influence the design and adoption of dashboards in the organization. The ideal candidate has 3-5 years of proven experience in visual design, design strategy, experience strategy, design thinking, and human-centered design for enterprise-wide reporting solutions.

Context & Main Purpose of Role
Build a UI/UX strategy based on data-storytelling principles and support the Cyber Risk reporting program through the development of interactive, contextualized Power BI dashboards that convey cyber-risk posture to C-level executives. Maintain rigorous focus on adherence to the design principles defined by the client's Power BI guidelines while bringing creative, simple, and intuitive visuals to dashboards to communicate cyber risk to a non-technical audience. Be part of a dynamic Cyber Risk reporting team and collaborate with data engineers, Power BI developers, and cyber-risk SMEs to create impact through user-centric dashboard design.

Required Qualifications, capabilities, and skills:
- Bachelor's degree with a curriculum including business, mathematics, or UX/UI design with storytelling, or equivalent working experience.
- Portfolio of work demonstrating effective visual communication of quantitative risk information, specifically visually appealing dashboards.
- Experience working with data analytics teams and a strong understanding of the common challenges in measuring and communicating cyber risk to non-technical leadership.
- Experience working on fast-paced, cross-functional teams in demanding business environments.
- Practical experience with tools for business intelligence, quantitative graphical analysis, and UX/UI design (e.g., Power BI, Figma).
- Excellent communication skills, with the ability to share ideas concisely, clearly, and accurately.
- Experience building dashboards, including dashboards for C-level stakeholders.
- Developing visual reports, dashboards, and KPI scorecards.
- Knowledge of connecting multiple data sources, importing data, and transforming data for business intelligence.
- Excellent analytical thinking for translating data into informative visuals and reports.
- Willingness to understand cyber-security concepts, which helps improve user satisfaction with the dashboards.
- Understanding of the fundamentals of data preparation and data modeling for visualization.
- Ability to capture reporting requirements from various partners, architect the solution/report, understand and analyze the source data, and deliver reports in a timely manner.
- Strong expertise in crafting intuitive and interactive reports and dashboards for data-driven decisions.
- Proficiency in Microsoft Power BI, including Power BI Desktop, Power BI Service, and Power Query.
- Strong understanding of DAX (Data Analysis Expressions) and its application in data modeling.
- Familiarity with other Microsoft tools such as Excel, Azure, and SharePoint is a plus.
- Experience with Agile/Scrum methodologies is advantageous.

Job Responsibilities:
- Collaborate with stakeholders, including developers, data engineers, and SMEs, to understand and align on business objectives and data requirements.
- Connect to various data sources and ensure data integrity, accuracy, and consistency.
- Optimize Power BI solutions for performance and usability.
- Create and maintain Power BI data models, including measures, calculated columns, and DAX expressions.
- Develop compelling data-driven narratives that effectively communicate insights and recommendations to various audiences, such as senior executives, departmental leaders, and managers.
- Provide guidance and support to other Power BI developers in creating visually appealing and accessible data visualizations.
- Apply visual design principles to ensure a positive user experience in the presentation of quantitative information.
- Conduct business and data analysis to uncover actionable insights.
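As a minimal illustration of the data-preparation step behind such a dashboard, the sketch below joins a hypothetical fact table of findings to an asset dimension and computes a per-business-unit KPI. All field names are invented; in Power BI this logic would typically live in Power Query or a DAX measure rather than Python.

```python
# Hypothetical fact and dimension tables for a cyber-risk scorecard.
findings = [  # fact table: one row per open finding
    {"asset_id": "a1", "severity": 9.1},
    {"asset_id": "a2", "severity": 4.0},
    {"asset_id": "a3", "severity": 7.5},
]
assets = {  # dimension: asset -> owning business unit
    "a1": "Payments", "a2": "Payments", "a3": "HR",
}

def risk_by_unit(findings, assets):
    """Average finding severity per business unit, the kind of figure a
    C-level scorecard visual would plot."""
    totals = {}
    for f in findings:
        unit = assets[f["asset_id"]]
        s, n = totals.get(unit, (0.0, 0))
        totals[unit] = (s + f["severity"], n + 1)
    return {unit: round(s / n, 2) for unit, (s, n) in totals.items()}

summary = risk_by_unit(findings, assets)
```

Keeping facts and dimensions separate like this, instead of flattening everything into one table, is the modeling habit that makes the resulting visuals easy to slice by persona.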
Ensure compliance with all applicable design principles.

How you'll grow
Connect for impact: Our exceptional team of professionals across the globe is solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters.
Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up-/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.
Everyone's welcome; entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.
Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research: know some background about the organisation and the business area you're applying to.
Check out recruiting tips from Deloitte professionals.

*Caution against fraudulent job offers*: We would like to advise career aspirants to exercise caution against fraudulent job offers or unscrupulous practices. At Deloitte, ethics and integrity are fundamental and not negotiable. We do not charge any fee or seek any deposits, advance, or money from any career aspirant in relation to our recruitment process. We have not authorized any party or person to collect any money from career aspirants in any form whatsoever for promises of getting jobs in Deloitte or for being considered against roles in Deloitte. We follow a professional recruitment process, provide a fair opportunity to eligible applicants, and consider candidates only on merit. No one other than an authorized official of Deloitte is permitted to offer or confirm any job offer from Deloitte. In this regard, you may refer to the more detailed advisory on our website at: https://www2.deloitte.com/in/en/careers/
Posted 1 week ago
7.0 - 13.0 years
20 - 25 Lacs
Hyderabad, Bengaluru
Work from Office
Scope: Core responsibilities include upgrading and migrating customers from their existing product version to the latest version on a different platform and infrastructure; being part of the SCPO to next-gen cognitive product migration/implementation and support; and supporting SCPO implementations, working closely with consulting and partners and collaborating with PD and support.

What you'll do:
- Work on customer implementations (net-new customers) and support partners during implementation, taking care of all cloud- and SCPO-related issues and helping resolve them by working closely with PD, support, customers, and partners.
- Work on customer upgrades and migrations to SaaS (SCPO/next-gen planning products) by understanding the technical design of the existing version (batch, integration, metrics, infrastructure, sizing, etc.) and migrating them to the latest supported version of SCPO/planning solutions.

What we are looking for:
- Hands-on experience in SQL, PL/SQL, and Oracle.
- Hands-on experience in scripting (batch, Perl, or Python) and an understanding of performance tuning.
- Domain knowledge of the Supply Chain Planning & Optimization suite of Blue Yonder products (at least 2+ years).
- Preferably, implementation or operational support experience on the Supply Chain Planning & Optimization suite of Blue Yonder products.
- Understanding of a job scheduler (Control-M batch scheduling preferred) and of batch processing design and SLAs.
- Ability to estimate the workload of in-scope implementation/migration activities, assess business needs that fall outside the standard solution, and assess integration complexity and architecture.
- Conduct technical analysis sessions with customers and/or the project team to understand their functional and technical needs, and develop technical solutions to meet those needs.
- Design and develop platform workflows/user interfaces, integration scripts, data mapping, data modeling, and loading of data.
- Understanding of cloud and SaaS applications, and experience using JIRA and ServiceNow/Salesforce.
- Excellent communication skills, required to navigate discussions and troubleshooting with internal and external stakeholders.

Our current technical environments / technical stack & products:
- OS: Unix (RHEL), Windows 2008 R2 and above.
- Supply chain products: SCPO (Demand & FF, ESP, IO, S&OP).
- RDBMS: Oracle 11g and above; SQL Server 2012 and above; Snowflake.
- BA tools: Cognos.
- Interfaces: SFTP, AS2, DMS, JDA Connect.
- Scripting: Perl, Shell, or Python is desirable.
- Scheduling tools: Control-M knowledge is preferred.

If you want to know the heart of a company, take a look at their values. Ours unite us. They are what drive our success and the success of our customers. Does your heart beat like ours? Find out here: Core Values
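The batch-processing and SLA concerns mentioned above can be sketched as a small wrapper that times each job step and flags breaches; the step name and SLA threshold below are invented, and a real deployment would let a scheduler such as Control-M enforce these limits.

```python
import time

def run_step(name, fn, sla_seconds):
    """Run one batch step, record its duration, and flag an SLA breach.
    Returns a small record a monitoring job could aggregate."""
    start = time.monotonic()
    fn()  # the actual batch work (load, integration script, etc.)
    elapsed = time.monotonic() - start
    return {
        "step": name,
        "seconds": round(elapsed, 3),
        "sla_breached": elapsed > sla_seconds,
    }

# Hypothetical step: a stand-in for a forecast load that finishes quickly.
result = run_step("load_forecast", lambda: time.sleep(0.01), sla_seconds=5.0)
```

Emitting one such record per step makes SLA reporting a simple aggregation instead of log archaeology.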
Posted 1 week ago
17.0 - 19.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Req ID: 318209

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking Salesforce Data Cloud Agentforce Solution Architects to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Job Duties: Salesforce Data Cloud Agentforce
Experience in designing, developing, and implementing AI-powered conversational experiences within the Salesforce platform, using Agentforce capabilities to create automated customer interactions across various channels, often requiring strong technical skills in Salesforce development and natural language processing (NLP) to build effective virtual agents.

Core Responsibilities:
- Architecting and building data integration solutions using Salesforce Data Cloud to unify customer data from diverse sources.
- Implementing data cleansing, matching, and enrichment processes to improve data quality.
- Designing and managing data pipelines for efficient data ingestion, transformation, and loading.
- Collaborating with cross-functional teams to understand business requirements and translate them into data solutions.
- Monitoring data quality, identifying discrepancies, and taking corrective actions.
- Establishing and enforcing data governance policies to maintain data consistency and compliance.

Minimum Skills Required:

Technical Skills:
- Expertise in Salesforce Data Cloud features like data matching, data cleansing, data enrichment, and data quality rules.
- Understanding of data modeling concepts and the ability to design data models within Salesforce Data Cloud.
- Proficiency in using Salesforce Data Cloud APIs and tools to integrate data from various sources.
- Knowledge of data warehousing concepts and data pipeline development.

Relevant Experience:
- Implementing Salesforce Data Cloud for customer 360 initiatives, creating a unified customer view across channels.
- Designing and developing data integration solutions to connect disparate data sources.
- Managing data quality issues, identifying and resolving data inconsistencies.
- Collaborating with business stakeholders to define data requirements and KPIs.
- Agentforce Design and Development: Building and customizing Agentforce conversational flows, defining dialogue trees, and creating intent-based interactions to handle customer inquiries effectively.
- NLP Integration: Training and refining natural language processing models to ensure accurate understanding of customer queries and provide relevant responses.
- Data Analysis and Optimization: Monitoring Agentforce performance and analyzing customer interaction data to identify areas for improvement and refine bot responses.
- Salesforce Integration: Seamlessly integrating Agentforce with other Salesforce components like CRM data, sales pipelines, and customer support systems to provide a unified customer experience.
- Testing and Deployment: Thoroughly testing Agentforce interactions to ensure quality and functionality before deployment across various channels (webchat, SMS, etc.).
Skills to highlight on your resume:
- Salesforce Platform: Expertise in Salesforce administration, development (Apex, Visualforce), and understanding of Salesforce architecture.
- Agentforce Specifics: Deep knowledge of Agentforce features, capabilities, and configuration options.
- NLP Techniques: Familiarity with natural language processing concepts like intent classification, entity extraction, and dialogue management.
- Conversational Design: Proven ability to design engaging and effective conversational flows for virtual agents.
- Data Analysis: Skills in analyzing customer interaction data to identify patterns and optimize Agentforce performance.
- Design, develop, and deploy solutions on the Salesforce Data Cloud platform.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.
- Build custom applications, integrations, and data pipelines using Salesforce Data Cloud tools and technologies.
- Develop and optimize data models to support business processes and reporting needs.
- Implement data governance and security best practices to ensure data integrity and compliance.
- Perform troubleshooting, debugging, and performance tuning.

We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
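As a toy illustration of the intent-classification step behind such conversational flows, a keyword-overlap router is sketched below. The intents and keywords are invented; production Agentforce bots rely on trained NLP models, not keyword rules, so this is only a mental model of the routing decision.

```python
# Hypothetical intents with example trigger keywords.
INTENT_KEYWORDS = {
    "order_status": {"order", "shipped", "tracking"},
    "billing": {"invoice", "charge", "refund"},
}

def classify_intent(utterance, default="fallback"):
    """Return the intent whose keyword set overlaps the utterance most,
    or the fallback intent when nothing matches."""
    words = set(utterance.lower().split())
    best, best_hits = default, 0
    for intent, keywords in INTENT_KEYWORDS.items():
        hits = len(words & keywords)
        if hits > best_hits:
            best, best_hits = intent, hits
    return best
```

A real dialogue tree would branch on the returned intent, with the fallback intent escalating to a human agent.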
Posted 1 week ago
13.0 - 14.0 years
50 - 55 Lacs
Bengaluru
Work from Office
You are a strategic thinker passionate about driving solutions in financial analysis. You have found the right team. As a Data Domain Architect Lead - Vice President within the Finance Data Mart team, you will be responsible for overseeing the design, implementation, and maintenance of data marts to support our organization's business intelligence and analytics initiatives. You will collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications. You will lead the development of robust data models to ensure data integrity and consistency, and oversee the implementation of ETL processes to populate data marts with accurate and timely data. You will optimize data mart performance and scalability, ensuring high availability and reliability, while mentoring and guiding a team of data mart developers.

Job Responsibilities
- Lead the design and development of data marts, ensuring alignment with business intelligence and reporting needs.
- Collaborate with business stakeholders to gather and understand data requirements, translating them into technical specifications.
- Develop and implement robust data models to support data marts, ensuring data integrity and consistency.
- Oversee the implementation of ETL (Extract, Transform, Load) processes to populate data marts with accurate and timely data.
- Optimize data mart performance and scalability, ensuring high availability and reliability.
- Monitor and troubleshoot data mart issues, providing timely resolutions and improvements.
- Document data mart structures, processes, and procedures, ensuring knowledge transfer and continuity.
- Mentor and guide a team of data mart developers if needed, fostering a collaborative and innovative work environment.
- Stay updated with industry trends and best practices in data warehousing, data modeling, and business intelligence.
Required qualifications, capabilities, and skills
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Extensive experience in data warehousing, data mart development, and ETL processes.
- Strong expertise in data lakes, data modeling, and database management systems (e.g., Databricks, Snowflake, Oracle, SQL Server).
- Leadership experience, with the ability to manage and mentor a team.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to work effectively with cross-functional teams.

Preferred qualifications, capabilities, and skills
- Experience with cloud-based data solutions (e.g., AWS, Azure, Google Cloud).
- Familiarity with advanced data modeling techniques and tools.
- Knowledge of data governance, data security, and compliance practices.
- Experience with business intelligence tools (e.g., Tableau, Power BI).

Candidates must be able to physically work in our Bengaluru office on the evening shift, 2 PM to 11 PM IST. The specific schedule will be determined and communicated by direct management.
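The data mart load described in this role can be sketched compactly: splitting flat source rows into a dimension table with surrogate keys and a fact table that references them. The column names below are illustrative only; a real mart would add history handling and run inside the warehouse.

```python
def build_mart(source_rows):
    """Split flat source rows into a customer dimension (with surrogate
    keys) and a fact table keyed on those surrogates."""
    dim, fact = {}, []
    for row in source_rows:
        nk = row["customer"]  # natural key from the source system
        if nk not in dim:
            # Assign the next surrogate key on first sight of the natural key.
            dim[nk] = {"customer_sk": len(dim) + 1, "customer": nk}
        fact.append({"customer_sk": dim[nk]["customer_sk"], "amount": row["amount"]})
    return list(dim.values()), fact

dim, fact = build_mart([
    {"customer": "acme", "amount": 100},
    {"customer": "acme", "amount": 250},
    {"customer": "globex", "amount": 75},
])
```

Keeping descriptive attributes in the dimension and only the surrogate key in the fact table is what keeps the mart consistent when customer attributes change.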
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Pune
Work from Office
Job Description: KPI Partners is seeking an experienced Senior Snowflake Administrator to join our dynamic team. In this role, you will be responsible for managing and optimizing our Snowflake environment to ensure performance, reliability, and scalability. Your expertise will contribute to designing and implementing best practices to facilitate efficient data warehousing solutions.

Key Responsibilities:
- Administer and manage the Snowflake platform, ensuring optimal performance and security.
- Monitor system performance, troubleshoot issues, and implement necessary solutions.
- Collaborate with data architects and engineers to design data models and optimal ETL processes.
- Conduct regular backups and recovery procedures to protect data integrity.
- Implement user access controls and security measures to safeguard data.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Participate in the planning and execution of data migration to Snowflake.
- Provide support for data governance and compliance initiatives.
- Stay updated with Snowflake features and best practices, and provide recommendations for continuous improvement.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in database administration, with a strong focus on Snowflake.
- Hands-on experience with SnowSQL, SQL, and data modeling.
- Familiarity with data ingestion tools and ETL processes.
- Strong problem-solving skills and the ability to work independently.
- Excellent communication skills and the ability to collaborate with technical and non-technical stakeholders.
- Relevant certifications in Snowflake or cloud data warehousing are a plus.

If you are a proactive, detail-oriented professional with a passion for data and experience in Snowflake administration, we would love to hear from you.
Join KPI Partners and be part of a team that is dedicated to delivering exceptional data solutions for our clients.
Posted 1 week ago
3.0 - 8.0 years
8 - 9 Lacs
Bengaluru
Work from Office
Amazon, Earth's most customer-centric company, offers low prices, vast selection, and convenience through its world-class e-commerce platform. The Competitive Pricing team ensures customer trust through optimal pricing across all Amazon marketplaces. Within this organization, our Data Engineering team, part of the Pricing Big Data group, builds and maintains the global pricing data platform. We enable price competitiveness by processing data from multiple sources, creating actionable pricing dashboards, providing deep-dive analytics capabilities, and driving operational efficiency. As a Data Engineer, you will collaborate with technical and business teams to develop real-time data processing solutions. You will lead the architecture, design, and development of the pricing data platform using AWS technologies and modern software development principles. Your responsibilities will include architecting and implementing automated Business Intelligence solutions, designing scalable big data and analytical capabilities, and creating actionable metrics and reports for engineers, analysts, and stakeholders. In this role, you will partner with business leaders to drive strategy and prioritize projects. You'll develop and review business cases and lead technical implementation from design to release. Additionally, you will provide technical leadership and mentoring to the data engineering team. This position offers an opportunity to make a significant impact on Amazon's pricing strategies and contribute to the company's continued growth and evolution in the e-commerce space.
Responsibilities:
- Design, implement, and maintain data infrastructure for enterprise-wide analytics
- Extract, transform, and load data from multiple sources using SQL and AWS big data technologies
- Build comprehensive domain knowledge of Amazon's business operations and metrics
- Write clear, concise documentation and communicate effectively with stakeholders across teams
- Deliver results independently while meeting deadlines
- Collaborate with engineering teams to solve complex data challenges
- Automate reporting processes and develop self-service analytics tools for customers
- Research and implement new AWS technologies to enhance system capabilities
Basic Qualifications:
- 3+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with SQL
- Experience with AWS technologies such as Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
- Bachelor's degree
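The extract-transform-load work described in this posting can be sketched in a few lines of dependency-free Python, with the csv module standing in for a real source system. The column names and cleansing rules here are illustrative assumptions, not Amazon's actual schema.

```python
import csv
import io

def transform(rows):
    """Normalize raw price rows: drop records missing an ASIN, cast types,
    and standardize the marketplace code."""
    out = []
    for r in rows:
        if not r.get("asin"):  # data-quality rule: ASIN is mandatory
            continue
        out.append({
            "asin": r["asin"],
            "price": round(float(r["price"]), 2),
            "marketplace": r["marketplace"].upper(),
        })
    return out

# Simulated extract: a CSV feed with one clean row and one bad row.
raw = io.StringIO("asin,price,marketplace\nB00X,19.999,us\n,5.00,de\n")
rows = list(csv.DictReader(raw))
clean = transform(rows)
print(clean)  # [{'asin': 'B00X', 'price': 20.0, 'marketplace': 'US'}]
```

At scale the same shape of logic would run inside AWS Glue or EMR, with S3 as the source and a warehouse table as the load target.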
Posted 1 week ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Amazon's Consumer Payments organization is seeking a highly quantitative, experienced Business Intelligence Engineer to drive the development of analytics and insights. You will succeed in this role if you are an organized self-starter who can learn new technologies quickly and excel in a fast-paced environment. In this position, you will be a key contributor and sparring partner, developing analytics and insights that global executive management teams and business leaders will use to define global strategies and deep dive businesses. Our team offers a unique opportunity to build a new set of analytical experiences from the ground up. You will be part of the team focused on acquiring new merchants from around the world for Amazon's payments offerings. The position is based in India but will interact with global leaders and teams in Europe, Japan, the US, and other regions. You should be highly analytical, resourceful, customer-focused, team-oriented, and able to work independently under time constraints to meet deadlines. You will be comfortable thinking big and diving deep. A proven track record of taking end-to-end ownership and successfully delivering results in a fast-paced, dynamic business environment is strongly preferred.
Responsibilities:
- Partner with engineering, product, business, and finance teams to create key performance indicators and new measurement methodologies
- Translate data into actionable insights for stakeholders both in and out of the team, and create analytical insights to identify key priorities
- Proactively make and justify recommendations based on advanced statistical techniques, deep familiarity with the customer or developer experience, cross-referencing of multiple data sources, and comparison against the wider industry
- Determine best-in-class performance reports and automate reporting for regular metrics; identify opportunities to automate and scale ad-hoc analyses; build and inform BI tool improvements
- Provide requirements for telemetry and data structure to improve the ability to extract data efficiently and deliver insights to the team faster
A day in the life:
- Analyze data and find insights that drive strategic business decisions or incremental signups and revenue.
- Define and develop business-critical metrics and reports across all international business levers, key performance indicators, and financials.
- Own alignment and standardization of analytical initiatives across the global business teams.
- Drive efforts across international business leaders, BI leaders, and executive management in Europe, Asia, and North America.
- Own key executive reports and metrics consumed by our VPs and Directors.
- Provide thought leadership in global business deep dives across a variety of key performance indicators.
Qualifications:
- 5+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with statistical analysis packages such as R, SAS, and MATLAB
- Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc., using databases in a business environment with large-scale, complex datasets
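The "pull data with SQL, then process it in Python" pattern named in the qualifications above can be illustrated with an in-memory SQLite table standing in for Redshift. The signups table and the conversion-rate KPI are hypothetical examples, chosen only to show the shape of the workflow.

```python
import sqlite3

# In-memory stand-in for a warehouse table of signup events.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE signups (region TEXT, converted INTEGER)")
con.executemany("INSERT INTO signups VALUES (?, ?)",
                [("EU", 1), ("EU", 0), ("JP", 1), ("JP", 1)])

# Step 1: aggregate in SQL, where the data lives.
rows = con.execute(
    "SELECT region, AVG(converted) FROM signups GROUP BY region ORDER BY region"
).fetchall()

# Step 2: post-process in Python, e.g. to format a KPI for a report.
kpi = {region: round(rate * 100, 1) for region, rate in rows}
print(kpi)  # {'EU': 50.0, 'JP': 100.0}
```

The same two-step split (heavy aggregation in the warehouse, light shaping in Python) is what keeps ad-hoc analyses cheap to automate and scale.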
Posted 1 week ago
3.0 - 5.0 years
3 - 7 Lacs
Pune
Work from Office
Responsibilities:
- Design, develop, test, and deploy scalable Salesforce applications using LWC, Apex, Visualforce, and SOQL/SOSL.
- Implement and manage Sales Cloud and Service Cloud features and functionalities.
- Develop and maintain custom Apex classes, triggers, and batch jobs.
- Integrate Salesforce with external systems using REST/SOAP APIs.
- Customize standard and custom Salesforce objects, page layouts, and workflows.
- Collaborate with stakeholders to gather requirements and translate them into technical solutions.
- Participate in code reviews and follow best practices in code quality, version control, and deployment.
- Troubleshoot and resolve issues in production and lower environments.
- Ensure data quality, integrity, and compliance with Salesforce security standards.
Required Skills and Experience:
- 3+ years of experience in Salesforce development.
- Proficiency in LWC (Lightning Web Components) and Aura Components.
- Strong experience with Apex classes, triggers, batch processing, and custom metadata types.
- Solid understanding of Sales Cloud and Service Cloud architecture and capabilities.
- Hands-on experience integrating with third-party systems using REST and SOAP web services.
- Strong understanding of the Salesforce data model, security model, sharing rules, and role hierarchy.
- Experience with Salesforce DX, Change Sets, and version control (Git) is a plus.
Posted 1 week ago
3.0 - 5.0 years
25 - 30 Lacs
Gurugram
Work from Office
Backend Developer (Gurgaon, Technology)
Job Description:
- Develop and maintain backend services using Go (Golang)
- Build and scale RESTful APIs using the gin-gonic/gin framework
- Design NoSQL schemas and manage database operations with Firestore
- Deploy and manage services on Google Cloud Platform, especially Cloud Run and Cloud Storage
- Implement secure authentication using JWT, OAuth 2.0, and API security best practices
- Ensure code quality through version control (Git), testing, and code reviews
- Use Docker for containerization and manage multi-stage builds
- Work within Linux environments and utilize basic shell scripting
Requirements:
- 3+ years of production experience with Go (Golang)
- Strong knowledge of the Gin framework
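To make the JWT requirement above concrete, here is a stdlib-only HS256 sign/verify sketch. The posting's stack is Go, so treat this Python version purely as a language-neutral illustration of the mechanics (header.payload.signature, base64url, HMAC-SHA256); production services should use a vetted JWT library, and the claims and secret here are hypothetical.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as the JWT compact format requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_hs256(claims: dict, secret: bytes) -> str:
    """Produce a compact JWT: base64url(header).base64url(payload).base64url(sig)."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = hmac.new(secret, signing_input, hashlib.sha256).digest()
    return f"{header}.{payload}.{b64url(sig)}"

def verify_hs256(token: str, secret: bytes) -> bool:
    """Recompute the HMAC over header.payload and compare in constant time."""
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(expected, sig)

token = sign_hs256({"sub": "user-42"}, b"s3cret")
print(verify_hs256(token, b"s3cret"))        # True
print(verify_hs256(token, b"wrong-secret"))  # False
```

A real verifier would also check the `alg` header and registered claims such as `exp`; this sketch covers only the signature step.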
Posted 1 week ago
2.0 - 18.0 years
12 - 16 Lacs
Hyderabad
Work from Office
Career Category: Engineering
Job Description
Role Name: IS Architecture
Job Posting Title: Data Architect
Workday Job Profile: Principal IS Architect
Department Name: Digital, Technology & Innovation
Role GCF: 06A
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller, and longer. We discover, develop, manufacture, and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.
ABOUT THE ROLE
Role Description: The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of foundational data. This role will manage a team of data modelers.
Roles & Responsibilities:
- Provide oversight to data modeling team members.
- Develop and maintain conceptual, logical, and physical data models to support business needs.
- Establish and enforce data standards, governance policies, and best practices.
- Design and manage metadata structures to enhance information retrieval and usability.
- Maintain comprehensive documentation of the architecture, including principles, standards, and models.
- Evaluate and recommend technologies and tools that best fit the solution requirements.
- Evaluate emerging technologies and assess their potential impact.
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency.
Basic Qualifications and Experience (GCF Level 6A):
- Doctorate degree and 2 years of experience in Computer Science, IT, or a related field; OR
- Master's degree with 8-10 years of experience in Computer Science, IT, or a related field; OR
- Bachelor's degree with 10-14 years of experience in Computer Science, IT, or a related field; OR
- Diploma with 14-18 years of experience in Computer Science, IT, or a related field
Functional Skills:
Must-Have Skills:
- Data Modeling: Expert in creating conceptual, logical, and physical data models to represent information structures. Able to interview and communicate with business subject-matter experts to develop data models that are useful for their analysis needs.
- Metadata Management: Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality.
- Information Governance: Familiarity with policies and procedures for managing information assets, including security, privacy, and compliance.
- Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, Spark SQL), including performance tuning of big data processing.
Good-to-Have Skills:
- Experience with graph technologies such as Stardog, AllegroGraph, or MarkLogic
Professional Certifications:
- Certifications in Databricks are desired
Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated ability to function in a team setting
- Strong presentation skills
Shift Information: This position requires you to work a later shift and may be assigned a second- or third-shift schedule. Candidates must be willing and able to work during evening or night shifts, as required by business needs.
EQUAL OPPORTUNITY STATEMENT We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
4.0 - 9.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering, and UX, along with our deep expertise in BFSI, healthcare, and life sciences, to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement. Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents. We enable #HumanFirstDigital
Job Title: MDM Consultant (Informatica MDM or Reltio)
Location: Preferably Bengaluru or any Apexon location
Job Type: Full-time
Experience: 5 to 9 years (minimum 4+ years in MDM implementation)
Work Mode: [Hybrid / Remote]
Reports To: MDM Lead / Data Architect / Project Manager
Job Summary: We are looking for a highly skilled and hands-on MDM Consultant with a strong background in Informatica MDM and/or Reltio MDM to design, implement, and support enterprise Master Data Management solutions. The ideal candidate should have 4+ years of MDM-specific experience; a strong foundation in data modeling, match/merge rules, governance, and integration; and must be flexible and eager to learn other MDM tools as required.
Key Responsibilities:
- Lead or support the development and implementation of MDM solutions using Informatica MDM (Hub/IDD) and/or Reltio MDM.
- Perform MDM hub configuration, data modeling, match/merge rule tuning, survivorship rule setup, and trust configuration.
- Configure and enhance IDD (Informatica Data Director) and/or the Reltio UI for data stewardship, workflows, and validations.
- Develop data mappings, transformation logic, and data validations aligned with business rules.
- Interpret business requirements, define technical architectures, and convert them into robust MDM solutions.
- Collaborate with business analysts, data stewards, source system owners, and data governance teams.
- Define and implement data security, role-based access, and audit mechanisms.
- Participate in use case design, test scenario definition, and support during testing and deployment phases.
- Work closely with data architects to define and refine data architecture for master data domains.
- Support performance tuning, incident resolution, and continuous improvement initiatives in MDM environments.
Required Skills & Qualifications:
- Minimum 4+ years of hands-on experience in MDM implementation and development.
- Strong expertise in Informatica MDM core components (Hub, IDD, SIF APIs).
- Proficient in SQL, with experience in relational databases (Oracle, SQL Server, PostgreSQL, etc.).
- Solid understanding of MDM concepts such as golden record creation, data mastering, survivorship, and hierarchies.
- Experience in Reltio MDM is a plus; willingness to learn Reltio or other MDM tools is essential.
- Strong knowledge of match/merge tuning, data stewardship processes, and MDM best practices.
- Ability to document and communicate technical architectures and standards effectively.
- Hands-on experience with data integration and ETL tools such as Informatica PowerCenter, IICS, or others.
- Familiarity with data governance principles and experience collaborating with governance teams.
Desirable Skills:
- Experience with cloud-based MDM solutions (e.g., Reltio, Informatica MDM SaaS, Azure/AWS/GCP).
- Exposure to DevOps/CI-CD practices and tools (Git, Jenkins, etc.).
- Knowledge of data quality tools (Informatica DQ/IDQ, Talend, etc.).
- Understanding of Agile/Scrum methodologies.
Soft Skills:
- Strong communication and stakeholder management abilities.
- Analytical thinker with attention to detail and a problem-solving mindset.
- Collaborative and proactive team player.
- Flexible and eager to upskill in new technologies and tools in the MDM ecosystem.
Our Commitment to Diversity & Inclusion:
Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience, and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer:
- Group Health Insurance covering a family of 4
- Term Insurance and Accident Insurance
- Paid Holidays & Earned Leaves
- Paid Parental Leave
- Learning & Career Development
- Employee Wellness
Job Location: Bengaluru, India
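The golden-record and survivorship concepts named in this posting can be sketched as a source-trust merge: for each attribute, keep the value from the most-trusted source that has one. The record fields, source names, and trust order below are illustrative assumptions, not how Informatica or Reltio actually configure survivorship rules.

```python
def golden_record(records, trust_order):
    """Merge duplicate source records field by field: for each attribute,
    survivorship picks the value from the most-trusted source that has one."""
    rank = {src: i for i, src in enumerate(trust_order)}
    ordered = sorted(records, key=lambda r: rank[r["source"]])
    golden = {}
    for rec in ordered:
        for field, value in rec.items():
            # Skip the bookkeeping field, empty values, and already-won fields.
            if field != "source" and value and field not in golden:
                golden[field] = value
    return golden

# Two hypothetical duplicates of the same customer from different systems.
dupes = [
    {"source": "web_form", "name": "J. Smith", "phone": "555-0100", "email": None},
    {"source": "crm",      "name": "Jane Smith", "phone": None, "email": "j@x.com"},
]
print(golden_record(dupes, trust_order=["crm", "web_form"]))
# {'name': 'Jane Smith', 'email': 'j@x.com', 'phone': '555-0100'}
```

Real MDM platforms layer trust scores, recency, and per-attribute rules on top of this idea; the sketch shows only the core "best value per field wins" mechanic.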
Posted 1 week ago
5.0 - 10.0 years
7 - 11 Lacs
Chennai
Work from Office
Hiring for Guidewire PC Developer - Chennai
We are seeking a skilled Guidewire PolicyCenter Developer with 5+ years in P&C insurance systems and strong expertise in configuration development using Gosu and Java. The ideal candidate will have experience across the full SDLC, from requirement gathering and design through testing, production support, and CI/CD.
- Design, configure, and develop solutions in Guidewire PolicyCenter to meet business requirements.
- Implement product model, underwriting, rating engine, workflows, PCF/UIs, forms, and integrations (SOAP/REST APIs).
- Customize and maintain the PolicyCenter data model, Gosu rules, batch processes, and UI pages.
- Integrate with other Guidewire applications (ClaimCenter, BillingCenter) and third-party systems.
- Perform unit and integration testing (GUnit, Mockito) and participate in code reviews.
- Troubleshoot performance issues, resolve bugs, and support production environments.
Posted 1 week ago
6.0 - 11.0 years
8 - 9 Lacs
Bengaluru
Work from Office
Role: Azure Data Engineer
Number of Openings: 4
Experience: Minimum 6+ years
Primary Responsibilities:
- Design and implement data storage solutions, including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
- Create data models for analytics purposes.
- Create and maintain ETL (Extract, Transform, Load) operations using Azure Data Factory or comparable technologies.
- Use Azure Data Factory and Databricks to assemble large, complex data sets.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
- Ensure data security and compliance.
- Optimize Azure SQL databases for efficient query performance.
- Collaborate with data engineers and other stakeholders to understand requirements and translate them into scalable, reliable data platform architectures.
- Participate in all Scrum ceremonies.
Mandatory Skills:
- Blend of technical expertise, analytical problem-solving, and collaboration with cross-functional teams
- Azure DevOps
- Apache Spark, Python
- SQL proficiency
- Data modeling
- ETL processes
- Azure Databricks knowledge
- Familiarity with data warehousing
- Big data technologies
- Data governance principles
Work Location: Offshore, PAN India
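The data validation and cleansing responsibility above can be sketched as a rule-based row validator. The field names and rules are illustrative assumptions; in an Azure pipeline, logic of this shape would typically live in a Databricks notebook or an Azure Data Factory data flow rather than a standalone script.

```python
from datetime import datetime

def validate(row):
    """Return a list of rule violations for one ingested row (empty = clean)."""
    errors = []
    if not row.get("customer_id"):
        errors.append("missing customer_id")
    try:
        # Dates must arrive in ISO format; anything else is rejected.
        datetime.strptime(row.get("order_date", ""), "%Y-%m-%d")
    except ValueError:
        errors.append("bad order_date")
    if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
        errors.append("bad amount")
    return errors

good = {"customer_id": "C1", "order_date": "2024-05-01", "amount": 99.5}
bad  = {"customer_id": "",   "order_date": "01/05/2024", "amount": -1}
print(validate(good))  # []
print(validate(bad))   # ['missing customer_id', 'bad order_date', 'bad amount']
```

Rows with violations would be routed to a quarantine table for stewardship review instead of being loaded, which is what keeps downstream analytics dependable.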
Posted 1 week ago
5.0 - 8.0 years
12 - 16 Lacs
Gurugram
Work from Office
Specialism: Data, Analytics & AI
Management Level: Specialist
Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Requirements:
- Total experience: 5-8 years, with 4+ years of relevant experience
Skills:
- Proficiency on the Databricks platform
- Strong hands-on experience with PySpark, SQL, and Python
- Any cloud: Azure, AWS, or GCP
Certifications (any of the following):
- Databricks Certified Associate Developer for Spark 3.0 (preferred)
- Databricks Certified Data Engineer Associate
- Databricks Certified Data Engineer Professional
Mandatory skill sets: Data Engineering
Preferred skill sets: Data Engineering
Years of experience required: 0-1 years
Education qualification: BE
Degrees/Field of Study required: Bachelor of Engineering
Required Skills: Data Engineering
Additional skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Strategy {+ 22 more}
Travel Requirements Available for Work Visa Sponsorship
Posted 1 week ago
0.0 - 1.0 years
2 - 3 Lacs
Mumbai
Work from Office
Key responsibilities of the candidate include:
- Data Modeling and Database Design: Create and manage data models, schemas, and database structures to support analytical and operational needs.
- ETL Processes: Develop and implement ETL (Extract, Transform, Load) processes to move data from different sources into data warehouses or data lakes.
- Data Quality and Governance: Implement processes and systems to ensure data accuracy, consistency, and reliability.
- Collaboration and Communication: Work closely with data scientists, analysts, and other stakeholders to understand their data needs and deliver solutions.
- Performance Optimization: Optimize data processing and storage for efficiency and scalability.
- Data Security and Compliance: Implement security measures to protect sensitive data and ensure compliance with relevant regulations.
- Building and Maintaining Data Pipelines: Design, build, and maintain systems for collecting, processing, and storing data from various sources.
- Support: Provide data fixes and infrastructure-provisioning support for post-go-live incidents and bug fixes.
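The pipeline-building responsibility above can be sketched as a composition of small transformation stages, so each step stays independently testable and reusable. The sensor data and the Fahrenheit-to-Celsius step are illustrative assumptions.

```python
def pipeline(*stages):
    """Compose transformation stages into a single callable; records flow
    through each stage in order."""
    def run(records):
        for stage in stages:
            records = stage(records)
        return records
    return run

# Two hypothetical stages: filter out null readings, then convert units.
def drop_nulls(rs):
    return [r for r in rs if r["value"] is not None]

def to_celsius(rs):
    return [{**r, "value": round((r["value"] - 32) * 5 / 9, 1)} for r in rs]

ingest = pipeline(drop_nulls, to_celsius)
print(ingest([{"sensor": "a", "value": 212}, {"sensor": "b", "value": None}]))
# [{'sensor': 'a', 'value': 100.0}]
```

Keeping stages as plain functions makes it straightforward to unit-test each rule in isolation and to reorder or reuse stages across pipelines.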
Posted 1 week ago
1.0 - 5.0 years
7 - 8 Lacs
Gurugram
Work from Office
Specialism: Risk
Management Level: Associate
Summary: Are you looking for a technically challenging role? Then we've got one for you. We are looking for a seasoned software engineer to design and execute our platform migration from a monolithic to a microservice-based architecture.
Your main responsibilities:
- Redesign the application from the present monolithic architecture to a microservices-based architecture in the most efficient and scalable way.
- Own the application migration from the current platform to a data-driven streaming platform.
- Be autonomous, motivated, and self-driven.
- Be a very good team player who can synergize effectively with all relevant stakeholders in the division.
- Be passionate about customer experience and on-time delivery.
- Be an excellent communicator who can have critical conversations with peers and other relevant stakeholders, and who can articulate and impart knowledge to stakeholders effectively.
- Show accountability and commitment to delivering quality work, and be ready to embrace challenges.
- Plan, prioritize, and own individual and group activities effectively.
Mandatory skill sets:
- Hands-on experience in Java 8
- Hands-on experience in designing and developing applications using Spring / Guice
- Hands-on experience in Spring Boot, web services (REST), and microservice-based architecture
- Good understanding of design patterns; able to design solutions and algorithms
- Experience migrating a monolithic application to microservices is a plus
- Experience with NoSQL DBs; Couchbase or MongoDB is a plus
- Experience with any message queue; Kafka knowledge is a plus
- Exposure to OpenShift and Docker + Kubernetes is a plus
- Good understanding of NFRs
- Good understanding of CI/CD
Preferred skill sets: Experience in the airline domain is a plus
Years of experience required: 4 to 9 years of experience in analysis, design, and development of software systems in Java
Education Qualification: Any
Degrees/Field of Study required: Bachelor Degree
Required Skills: Java
Additional skills: Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Communication, Compliance Auditing, Corporate Governance, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Emotional Regulation, Empathy, Financial Accounting, Financial Audit, Financial Reporting, Financial Statement Analysis, Generally Accepted Accounting Principles (GAAP) {+ 19 more}
Posted 1 week ago
3.0 - 7.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Specialism: SAP
Management Level: Senior Associate
Summary: In SAP supply chain and operations at PwC, you will specialise in providing consulting services for SAP supply chain and operations applications. You will analyse client needs, implement software solutions, and offer training and support for seamless integration and utilisation of SAP supply chain and operations applications. Working in this area, you will enable clients to optimise their supply chain processes, improve operational efficiency, and achieve their strategic objectives.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
A career within SAP Consulting services will provide you with the opportunity to help our clients maximize the value of their SAP investment with offerings that address sales, finance, supply chain, engineering, and human capital. We provide comprehensive consulting, system integration, and implementation services across multiple SAP applications, products, and technologies. Simply put, we focus on delivering business-led, technology-enabled change for our clients, including industry-specific enterprise resource planning and the latest in mobile, analytics, and cloud solutions.
Responsibilities:
- Collaborate with stakeholders to gather requirements and design SAP PM solutions that enhance plant maintenance processes.
- Configure and customize SAP PM modules, including work order management, preventive maintenance, equipment management, and maintenance planning.
- Conduct workshops and training sessions to ensure effective utilization of SAP PM functionalities.
- Provide ongoing support and troubleshooting for SAP PM applications.
- Perform system testing and validation to ensure quality and performance of SAP PM solutions.
- Integrate SAP PM modules with other SAP modules (e.g., MM, PP) and third-party systems as needed.
- Develop documentation, including business process flows, user guides, and training materials.
- Stay updated on SAP PM best practices and emerging technologies to provide innovative solutions.
Mandatory skill sets:
- Strong knowledge of SAP PM modules and business processes.
- Experience with SAP S/4HANA is highly preferred.
- Proficiency in SAP PM configuration and customization.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Certification in SAP PM is a plus.
Preferred skill sets: Experience with SAP S/4HANA and knowledge of its capabilities related to materials management.
Years of experience required: 2-4 years
Education Qualification: BE/BTech/MBA/MCA/CA
Degrees/Field of Study required: Master of Business Administration, Chartered Accountant Diploma, Bachelor of Engineering
Required Skills: SAP Plant Maintenance (PM)
Additional skills: Accepting Feedback, Active Listening, Analytical Thinking, Bill of Materials (BOM), Communication, Cost Efficiency, Cost Management, Creativity, Data-Driven Insights, Data Modeling, Data Modeling System Support, Demand Forecasting, Demand Planning, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Inventory Management, Lead Time Reduction, Learning Agility, Operational Excellence, Operations Processes, Optimism {+ 22 more}
Posted 1 week ago
1.0 - 3.0 years
3 - 5 Lacs
Jaipur
Work from Office
About Us: Mathionix Technologies Pvt. Ltd. is a cutting-edge IT solutions provider based in Jaipur, dedicated to delivering innovative technology services and products. We foster a culture of growth, learning, and excellence.
The Role: We are looking for a highly skilled and self-motivated Full Stack Developer to join our growing tech team. As a Full Stack Developer, you will be responsible for designing, developing, and maintaining scalable web applications using modern frontend and backend technologies. This is a remote, full-time position that offers a chance to work on exciting products in a collaborative and innovative environment.
Key Responsibilities:
- Design and implement end-to-end web application features using frontend (React, Angular, etc.) and backend (Node.js, Python, etc.) technologies.
- Build robust APIs and integrate third-party services and libraries.
- Work with databases such as MongoDB, MySQL, or PostgreSQL for data modeling and management.
- Collaborate with UI/UX designers to translate designs into responsive interfaces.
- Write clean, scalable, and well-documented code.
- Conduct unit testing and participate in code reviews to ensure code quality.
- Identify and resolve performance and scalability issues.
- Stay updated with emerging technologies and best practices.
Ideal Profile:
- 1-3 years of hands-on experience as a full stack developer.
- Proficiency in frontend frameworks (React, Angular, Vue.js) and backend frameworks (Node.js, Express, Django, etc.).
- Strong understanding of RESTful APIs, version control (Git), and agile workflows.
- Experience with cloud platforms (AWS, Firebase, etc.) is a plus.
- Ability to manage time efficiently in a remote setup and deliver quality work independently.
- Excellent problem-solving and communication skills.
- Bachelor's degree in Computer Science, Engineering, or a related field is preferred.
What's on Offer?
- Remote Work Flexibility: Work from anywhere with a stable internet connection.
Career Growth Opportunity to work on real-time industry projects with continuous learning. Team Collaboration Be part of an energetic, supportive, and innovation-driven team. Skill Development Exposure to full product lifecycle and modern tech stacks. Mentorship Regular interaction with senior developers and project leads.
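The CRUD-over-a-database responsibility above can be pictured with a minimal sketch, assuming an in-memory dictionary stands in for MongoDB/MySQL/PostgreSQL; the class and field names are illustrative, not from any posting.

```python
class TaskStore:
    """In-memory CRUD store mirroring the shape of a REST resource."""

    def __init__(self):
        self._tasks = {}
        self._next_id = 1

    def create(self, payload):
        task = {"id": self._next_id, **payload}
        self._tasks[self._next_id] = task
        self._next_id += 1
        return task                      # a real API would return 201 Created

    def read(self, task_id):
        return self._tasks.get(task_id)  # None maps to 404 Not Found

    def update(self, task_id, payload):
        if task_id not in self._tasks:
            return None
        self._tasks[task_id].update(payload)
        return self._tasks[task_id]

    def delete(self, task_id):
        return self._tasks.pop(task_id, None) is not None
```

In a real application, each method would back a REST route (POST/GET/PATCH/DELETE) in a framework such as Express or Django.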
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Pune
Work from Office
JOB DESCRIPTION:
Skill Name: InterSystems IRIS
No. of Positions: 2
Work Location: Pune, Mumbai, Hyderabad, Kolkata, Chennai, Bangalore, Gurugram
Detailed JD: 3 to 5 years of experience. A resource with hands-on experience in the InterSystems IRIS Data Platform, proficient in ObjectScript, SQL, and integration technologies (REST, SOAP). FHIR is a must-have. Experience with data modelling, performance tuning, and deploying IRIS on Linux/Windows is required. Skills in Python, Java, .NET, and Docker are a plus.
Rounds of Interview: R1 and R2 (client round if required)
Mode of Interview (Virtual/In-person): Virtual
Work Timing: 11 AM to 8 PM
Work Mode (Remote/On-site/Hybrid): Hybrid
ADDITIONAL INFORMATION: A PAN card soft copy and a recent passport-size photo must be attached to the resume.
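To give a flavor of the FHIR REST integration this role centers on, here is a hedged sketch in Python (rather than ObjectScript) of building a minimal FHIR R4 Patient resource of the kind an IRIS interoperability endpoint exchanges; the base URL and field values are hypothetical.

```python
import json

# Hypothetical IRIS FHIR R4 endpoint; real deployments expose their own path.
FHIR_BASE = "https://example-iris-host/fhir/r4"

def make_patient(family, given, birth_date):
    """Build a minimal FHIR R4 Patient resource as a plain dict."""
    return {
        "resourceType": "Patient",
        "name": [{"family": family, "given": [given]}],
        "birthDate": birth_date,
    }

def to_fhir_json(resource):
    """Serialize for a POST to {FHIR_BASE}/Patient with
    Content-Type: application/fhir+json."""
    return json.dumps(resource)
```

In IRIS itself the same exchange would typically flow through an interoperability production and ObjectScript classes rather than hand-built dicts.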
Posted 1 week ago
3.0 - 6.0 years
5 - 8 Lacs
Pune
Work from Office
Job Description: Join Pitney Bowes as a Database Engineer
Years of Experience: 3-6 years
Job Location: Noida/Pune
Key Skills and Responsibilities
Database Architecture: Deep understanding of SQL Server, MongoDB, and MySQL architectures, including components, configurations, and functionalities.
Database Cost Management: Knowledge of licensing, resource optimization, and cost-effective scaling strategies for on-premise and cloud-hosted database environments.
Database Administration: Proven experience in installing, configuring, maintaining, and securing production-grade database instances.
Data Modeling: Strong knowledge of relational and NoSQL data modeling techniques, including normalization, denormalization, and schema design for transactional and analytical workloads.
Data Warehousing: Understanding of data warehousing concepts and implementation strategies using tools like SQL Server Analysis Services (SSAS) and MySQL-based warehouses.
SQL: Advanced proficiency in T-SQL and MySQL for querying, data manipulation, and performance tuning.
Performance Tuning: Expertise in query optimization, indexing strategies, execution plans, and database engine configuration for maximum performance.
Security: Experience with database security best practices, including user authentication, role-based access control, data encryption, and auditing.
Backup & Recovery: Skilled in setting up automated backup strategies, point-in-time recovery, disaster recovery planning, and high availability solutions (e.g., Always On, Replication).
ETL/ELT Processes: Experience with data ingestion and transformation using tools like SQL Server Integration Services (SSIS), custom Python scripts, or other third-party tools.
Cloud Platforms: Hands-on experience with deploying and managing databases in cloud environments such as AWS RDS, Azure SQL Database, and MongoDB Atlas.
Scripting & Automation: Proficiency in PowerShell, Python, or Bash for task automation, monitoring, and maintenance.
Monitoring & Alerting: Familiarity with monitoring tools (e.g., SQL Monitor, Nagios, MongoDB Ops Manager) to track performance, storage, and uptime.
Reporting & BI: Experience working with reporting tools such as Power BI, SSRS, or Tableau to deliver actionable insights and dashboards.
Project Management: Working knowledge of project management methodologies (Agile, Waterfall) and tools like JIRA or Trello for sprint planning and task tracking.
Other Skills
Problem-Solving: Ability to troubleshoot and resolve technical issues efficiently.
Communication: Effective communication skills to interact with stakeholders and team members.
Analytical Thinking: Ability to analyze data and identify patterns and trends.
Attention to Detail: Meticulous approach to ensure data accuracy and integrity.
Adaptability: Willingness to learn new technologies and adapt to changing requirements.
Project Management: Ability to plan, execute, and monitor projects effectively.
Leadership: Ability to lead and motivate project teams.
Time Management: Effective time management skills to meet project deadlines.
The Team
Our passionate and ambitious team delivers innovations that help clients navigate the complex and always-evolving world of commerce: from helping them use data to market to the best customers, to enabling the sending of parcels and packages efficiently, to securing payments through statements and invoices. Our global Innovation team is dedicated to using best-in-class tools, processes, and modern architectures to create great experiences for our clients. In a rapidly changing world, we have a clear technical vision for our future that includes SaaS, APIs, Big Data, Advanced Analytics, Mobile, and the Internet of Things. We are also focused on creating great client experiences, utilizing a Design Thinking platform and approach.
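The indexing and execution-plan skills listed above can be demonstrated with a small self-contained sketch, using SQLite from the Python standard library in place of SQL Server or MySQL (the table and index names are illustrative, but the principle carries over).

```python
import sqlite3

def plan_for(conn, sql, params=()):
    """Return the query-plan detail strings the engine reports for a statement."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql, params).fetchall()
    return [row[-1] for row in rows]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")

# Without an index, an equality lookup on email forces a full table scan.
before = plan_for(conn, "SELECT id FROM users WHERE email = ?", ("a@b.example",))

# Adding an index lets the optimizer switch to an index search.
conn.execute("CREATE INDEX idx_users_email ON users (email)")
after = plan_for(conn, "SELECT id FROM users WHERE email = ?", ("a@b.example",))
```

The same workflow in SQL Server would use `SET SHOWPLAN_TEXT` or the graphical execution plan instead of `EXPLAIN QUERY PLAN`.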
Helping clients achieve their greatest commerce potential are Pitney Bowes' 14,000+ passionate employees around the world, our relentless pursuit of innovation with over 2,300 active patents, and our focus on clients, who are at the center of all that we do, from small businesses to 90% of the Fortune 500. In everything we do, we deliver accuracy and precision to drive meaningful impact.
Pitney Bowes is an Equal Employment Opportunity/Affirmative Action Employer that values diversity and inclusiveness in the workplace.
We will:
Provide the opportunity to grow and develop your career
Offer an inclusive environment that encourages diverse perspectives and ideas
Deliver challenging and unique opportunities to contribute to the success of a transforming organization
Offer comprehensive benefits globally (PB Live Well)
Pitney Bowes is an equal opportunity employer that values diversity and inclusiveness in the workplace. All interested individuals must apply online.
Posted 1 week ago
4.0 - 6.0 years
6 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Key Responsibilities:
Develop and maintain front-end applications using React, Angular, or Next.js.
Design and implement scalable APIs using Python.
Work with relational databases for data modeling and querying.
Build and integrate Gen AI-powered features (LLMs, embeddings, image/voice processing).
Contribute to system design using microservices principles.
Collaborate with cross-functional teams in an agile environment.
Required Skills:
Minimum 4 to 6 years of experience as a full stack developer.
React / Next.js / Angular
Python (FastAPI, Flask, or Django preferred)
Relational databases: SQL, PostgreSQL, or MySQL
Experience with Gen AI tools (OpenAI, LangChain, LlamaIndex, Pinecone, etc.)
Understanding of microservices and RESTful APIs
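The embedding-based retrieval that stacks like LangChain/LlamaIndex with Pinecone perform can be sketched in a few lines; the vectors below are hand-made toys standing in for real model embeddings, and the corpus entries are invented for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_match(query_vec, corpus):
    """Return the corpus key whose vector is most similar to the query."""
    return max(corpus, key=lambda key: cosine(query_vec, corpus[key]))

# Toy "document embeddings"; a real system would get these from a model.
corpus = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.3],
}
```

A vector database like Pinecone does the same nearest-neighbor search, but at scale and with approximate indexing.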
Posted 1 week ago
4.0 - 7.0 years
6 - 9 Lacs
Bengaluru
Work from Office
About the Role
We're looking for a Solution Engineer to help us build intelligent, task-oriented AI agents that solve real customer problems. You'll work closely with product, engineering, and business teams to translate requirements into agent workflows, integrating tools, APIs, and data into cohesive automation. This role is ideal for someone who can think in terms of agents, workflows, and real-world outcomes, and enjoys turning problem statements into structured, working systems.
What You'll Do
- Understand customer use cases and translate them into autonomous agent workflows.
- Build Python-based agents that integrate with tools, APIs, and internal services.
- Design and implement workflows that span multiple steps, tools, and decisions.
- Work with PostgreSQL to model customer data and support agent data needs.
- Use workflow engines like Temporal (or similar) to orchestrate long-running tasks with retries, checkpoints, and timeouts.
- Integrate third-party and internal APIs for agent tool usage and data exchange.
- Ensure the workflows and agents are observable, testable, and fault-tolerant.
- Collaborate with non-technical stakeholders to validate and iterate on solutions.
What We're Looking For
Must-Have Skills
- 4-7 years of experience in backend or solutions engineering.
- Problem Solving: Ability to break down complex problems, identify root causes, and implement scalable, effective solutions.
- Python: Strong coding practices with emphasis on clarity, modularity, and reusability.
- PostgreSQL: Experience in relational data modeling and writing clean, efficient queries.
- Communication: Ability to understand business logic and explain solutions clearly to stakeholders.
- API Integration: Experience integrating with internal and third-party APIs using REST/GraphQL, including handling authentication, error scenarios, and data transformations.
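The retry-and-checkpoint orchestration described above can be pictured with a minimal, framework-free sketch; an engine like Temporal provides durable versions of these primitives out of the box, and the step names here are illustrative.

```python
def run_with_retries(step, max_attempts=3):
    """Run a step, retrying on failure up to max_attempts times."""
    last_err = None
    for attempt in range(1, max_attempts + 1):
        try:
            return step(attempt)
        except Exception as err:
            last_err = err
    raise last_err

def run_workflow(steps, checkpoints):
    """Run named steps in order, recording each completion as a checkpoint
    so a restarted workflow can skip work already done in a previous run."""
    results = {}
    for name, step in steps:
        if name in checkpoints:  # completed earlier; resume past it
            continue
        results[name] = run_with_retries(step)
        checkpoints[name] = True
    return results
```

In Temporal the checkpoint dict would be durable workflow history and the retry loop a configurable retry policy, but the control flow is the same.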
Posted 1 week ago