3.0 years
6 - 9 Lacs
Chennai
On-site
Job Description: Job Title – Cloud Data Engineer

The Purpose of This Role
At Fidelity, we use data and analytics to personalize incredible customer experiences and develop solutions that help our customers live the lives they want. As part of our digital transformation, we have made significant investments to build cloud data lake platforms. We are looking for a hands-on data engineer who can help us design and develop our next-generation Cloud Data Lake and Analytics Platform for Workplace Solutions.

The Value You Deliver
Fidelity's Workplace Investing Reporting and Analytics chapter (India) is seeking a Principal Data Engineer to play a key role in building a Power BI and Snowflake-based reporting application. The team is responsible for building reports in Power BI with Snowflake as the data source, and will also develop many custom capabilities on AWS. The team works closely with the Enterprise Data Lake (EDL) team for data acquisition and with the Oracle Analytics Server (OAS) team to migrate OAS-based reports to Power BI. This person will be self-driven, work with technical partners, and assist developers and testers as needed. The role demands significant collaboration with members of various business and IT groups throughout the lifecycle of a typical project. Our engineering team is innovative, diverse, hardworking, and self-driven, and we work in a very dynamic agile environment.

The Expertise You Have
- Bachelor's degree in Computer Science or a similar technical subject area and 3+ years' experience
- Hands-on experience in AWS/Azure, EKS/AKS, DevSecOps (Jenkins, GitHub), and Python
- Experience in Snowflake/OLAP database systems
- Hands-on experience in cloud infra automation
The Skills You Bring
- Working in a team of developers and analysts to deliver business value by coordinating with Architects, Analysts, and Product Owners
- Strong collaboration skills
- Excellent communication skills
- Strong problem-resolution skills
- Critical thinking and the ability to work in an agile environment

The Value You Deliver
- Accountable for consistent delivery of functional software – sprint to sprint, release to release
- Excellence in software development practices and procedures
- Participates in application-level architecture – able to drive the solution
- Develops original and creative technical solutions to ongoing development efforts
- Responsible for QA readiness of your software work (end-to-end tests, unit tests, automation)
- Responsible for supporting implementation of initiatives
- Works on sophisticated assignments, often spanning multiple phases of a project
- Assists in developing departmental technical policies and procedures

The Expertise We're Looking For
- 3+ years of experience in Data Warehousing, Big Data, Analytics, and Machine Learning
- Graduate / Post Graduate

Location: Bangalore/Chennai
Shift timings: 11:00 am - 8:00 pm
Certifications:
Category: Information Technology
Posted 1 month ago
3.0 years
3 - 5 Lacs
Chennai
On-site
Integration Development: Design and implement integration solutions using MuleSoft Anypoint Platform for various enterprise applications, including ERP, CRM, and third-party systems.
API Management: Develop and manage APIs using MuleSoft's API Gateway, ensuring best practices for API design, security, and monitoring.
MuleSoft Anypoint Studio: Develop, deploy, and monitor MuleSoft applications using Anypoint Studio and Anypoint Management Console.
Data Transformation: Use MuleSoft's DataWeave to transform data between various formats (XML, JSON, CSV, etc.) as part of integration solutions.
Troubleshooting and Debugging: Provide support in troubleshooting and resolving integration issues, and ensure the solutions are robust and scalable.
Collaboration: Work closely with other developers, business analysts, and stakeholders to gather requirements, design, and implement integration solutions.
Documentation: Create and maintain technical documentation for the integration solutions, including API specifications, integration architecture, and deployment processes.
Best Practices: Ensure that the integrations follow industry best practices and MuleSoft's guidelines for designing and implementing scalable and secure solutions.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in MuleSoft development and integration projects.
- Proficiency in MuleSoft Anypoint Platform, including Anypoint Studio, Anypoint Exchange, and Anypoint Management Console.
- Strong knowledge of API design and management, including REST, SOAP, and Web Services.
- Proficiency in DataWeave for data transformation.
- Hands-on experience with integration patterns and technologies such as JMS, HTTP/HTTPS, File, Database, and Cloud integrations.
- Experience with CI/CD pipelines and deployment tools such as Jenkins, Git, and Maven.
- Good understanding of cloud platforms (AWS, Azure, or GCP) and how MuleSoft integrates with cloud services.
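The data-transformation duty above is normally written in DataWeave, MuleSoft's own transformation language. As a loose illustration of the same mapping idea in plain Python (the field names `orderId`, `customer`, and `amount` are invented sample data, not from any real payload):

```python
import csv
import io
import json

def json_to_csv(payload: str) -> str:
    """Flatten a JSON array of order records into CSV text.

    Mirrors the shape of a typical mapping: rename fields, pull a
    nested value, and emit one flat row per record. All field names
    here are hypothetical examples.
    """
    records = json.loads(payload)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["order_id", "customer_name", "total"])
    writer.writeheader()
    for rec in records:
        writer.writerow({
            "order_id": rec["orderId"],
            "customer_name": rec["customer"]["name"],
            "total": rec["amount"],
        })
    return out.getvalue()

sample = '[{"orderId": 1, "customer": {"name": "Asha"}, "amount": 250.0}]'
print(json_to_csv(sample))
```

A real DataWeave script expresses the same rename-and-flatten logic declaratively; the point of the sketch is only the mapping pattern, not the syntax.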
- Excellent troubleshooting and problem-solving skills.
- Strong communication skills and the ability to work effectively in a team environment.
- Strong working knowledge of modern programming languages, ETL/data integration tools (preferably SnapLogic), and an understanding of cloud concepts: SSL/TLS, SQL, REST, JDBC, JavaScript, JSON.
- Strong hands-on experience in SnapLogic design/development, with good working experience using various Snaps for JDBC, SAP, Files, REST, SOAP, etc.
- Good to have: the ability to build complex mappings with JSON path expressions, flat files, and Python scripting.
- Good to have: experience in Groundplex and Cloudplex integrations.
- Should be able to deliver the project by leading a team of 6-8 members.
- Should have experience in integration projects with heterogeneous landscapes.
- Experience in one or more RDBMS (Oracle, DB2, SQL Server, PostgreSQL, and Redshift).
- Real-world experience working with OLAP & OLTP database models (dimensional models).

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 1 month ago
0 years
0 Lacs
Chennai
On-site
Required skills:
- Strong working knowledge of modern programming languages, ETL/data integration tools (preferably SnapLogic), and an understanding of cloud concepts: SSL/TLS, SQL, REST, JDBC, JavaScript, JSON.
- Strong hands-on experience in SnapLogic design/development, with good working experience using various Snaps for JDBC, SAP, Files, REST, SOAP, etc.
- Good to have: the ability to build complex mappings with JSON path expressions, flat files, and Python scripting.
- Good to have: experience in Groundplex and Cloudplex integrations.
- Should be able to deliver the project by leading a team of 6-8 members.
- Should have experience in integration projects with heterogeneous landscapes.
- Experience in one or more RDBMS (Oracle, DB2, SQL Server, PostgreSQL, and Redshift).
- Real-world experience working with OLAP & OLTP database models (dimensional models).

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
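The posting pairs "JSON path expressions" with "Python scripting". As a toy stand-in for that kind of field mapping (SnapLogic pipelines use the platform's own expression language; the dotted-path syntax and document shape below are invented for the example):

```python
def get_path(doc, path, default=None):
    """Resolve a simplified dotted path like '$.order.items[0].sku'.

    A minimal sketch of JSON-path-style lookup, not the SnapLogic
    expression language itself.
    """
    # Strip the leading '$.' root marker, then walk keys and [n] indexes.
    parts = []
    for token in path.lstrip("$.").split("."):
        while "[" in token:
            token, _, rest = token.partition("[")
            if token:
                parts.append(token)
            idx, _, token = rest.partition("]")
            parts.append(int(idx))
        if token:
            parts.append(token)
    node = doc
    for p in parts:
        try:
            node = node[p]
        except (KeyError, IndexError, TypeError):
            return default
    return node

doc = {"order": {"items": [{"sku": "A-1", "qty": 2}]}}
row = {"sku": get_path(doc, "$.order.items[0].sku"),
       "qty": get_path(doc, "$.order.items[0].qty")}
print(row)
```

In a real pipeline this per-field mapping would live in a mapper Snap; the sketch only shows the traversal idea behind such expressions.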
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 1 month ago
3.0 years
6 - 9 Lacs
Bengaluru
On-site
Job Description: Job Title – Cloud Data Engineer

The Purpose of This Role
At Fidelity, we use data and analytics to personalize incredible customer experiences and develop solutions that help our customers live the lives they want. As part of our digital transformation, we have made significant investments to build cloud data lake platforms. We are looking for a hands-on data engineer who can help us design and develop our next-generation Cloud Data Lake and Analytics Platform for Workplace Solutions.

The Value You Deliver
Fidelity's Workplace Investing Reporting and Analytics chapter (India) is seeking a Principal Data Engineer to play a key role in building a Power BI and Snowflake-based reporting application. The team is responsible for building reports in Power BI with Snowflake as the data source, and will also develop many custom capabilities on AWS. The team works closely with the Enterprise Data Lake (EDL) team for data acquisition and with the Oracle Analytics Server (OAS) team to migrate OAS-based reports to Power BI. This person will be self-driven, work with technical partners, and assist developers and testers as needed. The role demands significant collaboration with members of various business and IT groups throughout the lifecycle of a typical project. Our engineering team is innovative, diverse, hardworking, and self-driven, and we work in a very dynamic agile environment.

The Expertise You Have
- Bachelor's degree in Computer Science or a similar technical subject area and 3+ years' experience
- Hands-on experience in AWS/Azure, EKS/AKS, DevSecOps (Jenkins, GitHub), and Python
- Experience in Snowflake/OLAP database systems
- Hands-on experience in cloud infra automation
The Skills You Bring
- Working in a team of developers and analysts to deliver business value by coordinating with Architects, Analysts, and Product Owners
- Strong collaboration skills
- Excellent communication skills
- Strong problem-resolution skills
- Critical thinking and the ability to work in an agile environment

The Value You Deliver
- Accountable for consistent delivery of functional software – sprint to sprint, release to release
- Excellence in software development practices and procedures
- Participates in application-level architecture – able to drive the solution
- Develops original and creative technical solutions to ongoing development efforts
- Responsible for QA readiness of your software work (end-to-end tests, unit tests, automation)
- Responsible for supporting implementation of initiatives
- Works on sophisticated assignments, often spanning multiple phases of a project
- Assists in developing departmental technical policies and procedures

The Expertise We're Looking For
- 3+ years of experience in Data Warehousing, Big Data, Analytics, and Machine Learning
- Graduate / Post Graduate

Location: Bangalore/Chennai
Shift timings: 11:00 am - 8:00 pm
Certifications:
Category: Information Technology
Posted 1 month ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Integration Development: Design and implement integration solutions using MuleSoft Anypoint Platform for various enterprise applications, including ERP, CRM, and third-party systems.
API Management: Develop and manage APIs using MuleSoft's API Gateway, ensuring best practices for API design, security, and monitoring.
MuleSoft Anypoint Studio: Develop, deploy, and monitor MuleSoft applications using Anypoint Studio and Anypoint Management Console.
Data Transformation: Use MuleSoft's DataWeave to transform data between various formats (XML, JSON, CSV, etc.) as part of integration solutions.
Troubleshooting and Debugging: Provide support in troubleshooting and resolving integration issues, and ensure the solutions are robust and scalable.
Collaboration: Work closely with other developers, business analysts, and stakeholders to gather requirements, design, and implement integration solutions.
Documentation: Create and maintain technical documentation for the integration solutions, including API specifications, integration architecture, and deployment processes.
Best Practices: Ensure that the integrations follow industry best practices and MuleSoft's guidelines for designing and implementing scalable and secure solutions.

Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in MuleSoft development and integration projects.
- Proficiency in MuleSoft Anypoint Platform, including Anypoint Studio, Anypoint Exchange, and Anypoint Management Console.
- Strong knowledge of API design and management, including REST, SOAP, and Web Services.
- Proficiency in DataWeave for data transformation.
- Hands-on experience with integration patterns and technologies such as JMS, HTTP/HTTPS, File, Database, and Cloud integrations.
- Experience with CI/CD pipelines and deployment tools such as Jenkins, Git, and Maven.
- Good understanding of cloud platforms (AWS, Azure, or GCP) and how MuleSoft integrates with cloud services.
- Excellent troubleshooting and problem-solving skills.
- Strong communication skills and the ability to work effectively in a team environment.
- Strong working knowledge of modern programming languages, ETL/data integration tools (preferably SnapLogic), and an understanding of cloud concepts: SSL/TLS, SQL, REST, JDBC, JavaScript, JSON.
- Strong hands-on experience in SnapLogic design/development, with good working experience using various Snaps for JDBC, SAP, Files, REST, SOAP, etc.
- Good to have: the ability to build complex mappings with JSON path expressions, flat files, and Python scripting.
- Good to have: experience in Groundplex and Cloudplex integrations.
- Should be able to deliver the project by leading a team of 6-8 members.
- Should have experience in integration projects with heterogeneous landscapes.
- Experience in one or more RDBMS (Oracle, DB2, SQL Server, PostgreSQL, and Redshift).
- Real-world experience working with OLAP & OLTP database models (dimensional models).
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Required Skills
- Strong working knowledge of modern programming languages, ETL/data integration tools (preferably SnapLogic), and an understanding of cloud concepts: SSL/TLS, SQL, REST, JDBC, JavaScript, JSON.
- Strong hands-on experience in SnapLogic design/development, with good working experience using various Snaps for JDBC, SAP, Files, REST, SOAP, etc.
- Good to have: the ability to build complex mappings with JSON path expressions, flat files, and Python scripting.
- Good to have: experience in Groundplex and Cloudplex integrations.
- Should be able to deliver the project by leading a team of 6-8 members.
- Should have experience in integration projects with heterogeneous landscapes.
- Experience in one or more RDBMS (Oracle, DB2, SQL Server, PostgreSQL, and Redshift).
- Real-world experience working with OLAP & OLTP database models (dimensional models).
Posted 1 month ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job description: Our engineering team is growing, and we are looking to bring on board a Python Developer who can help us transition to the next phase of the company. You will be pivotal in refining our system architecture, ensuring the various tech stacks play well with each other, and smoothing the DevOps process. You must have a well-versed understanding of software paradigms and the curiosity to carve out designs for varied ML, MLOps, and LLMOps problem statements. You will be determined to lead your team in the right direction through to the very end of implementation for each project. By joining our team, you will get exposure to working across a swath of modern technologies while building an enterprise-grade ML platform in a most promising area.

Responsibilities
- Be the bridge between engineering and product teams: understand the long-term product roadmap and architect a system design that will scale with our plans.
- Take ownership of converting product insights into detailed engineering requirements.
- Break down work among the team and orchestrate the development of components for each sprint.
- Be well versed in solution design and documentation (HLD/LLD).
- Develop "zero-defect software" with extreme efficiency by utilizing modern cutting-edge tools (ChatGPT, Copilot, etc.).
- Adopt, and impart, the mindset of building units of software that are secured, instrumented, and resilient.
- Author high-quality, high-performance, unit-tested code running in a distributed environment using containers.
- Continually evaluate and improve DevOps processes for a cloud-native codebase.
- Bring strong design skills in defining API data contracts, OOAD, microservices, data models, and concurrency concepts.
- Be an ardent leader with an obsession for quality, refinement, innovation, and empowering leadership.
Qualifications

Work Experience
- 5-7 years of experience, with hands-on development of full-fledged systems/microservices using Python.
- 3+ years of experience with senior engineering responsibilities.
- 3+ years of people mentorship/leadership experience, managing engineers, preferably with good exposure to leading multiple development teams.
- 3+ years of experience in object-oriented design and agile development methodologies.
- Basic experience in developing/deploying cloud-native software using GCP/AWS/Azure.
- Proven track record of building large-scale, product-grade (high-throughput, low-latency, scalable) systems.
- Well-versed understanding and design skills for SQL/NoSQL/OLAP DBs.
- Up to date with modern cutting-edge technologies that boost the efficiency and delivery of the team. (Bonus: an understanding of generative AI frameworks/libraries such as RAG, LangChain, LlamaIndex, etc.)

Skills
- Strong documentation skills. As a team, we rely heavily on elaborate documentation for everything we are working on.
- Ability to make authoritative decisions and hold accountability.
- Ability to motivate, lead, and empower others.
- Strong independent contributor as well as a team player.
- Working knowledge of ML and familiarity with MLOps concepts.

You will excel in this role if
- You have a product mindset: you understand, care about, and can relate to our customers.
- You take ownership, collaborate, and follow through to the very end.
- You love solving difficult problems, stand your ground, and get what you want from engineers.
- You resonate with our core values of innovation, curiosity, accountability, trust, fun, and social good.
Posted 1 month ago
0 years
0 Lacs
India
On-site
At Board, we power financial and operational planning solutions for the world's best brands. Thousands of enterprises use our technology to optimize resources, drive growth, and ensure profitability. With advanced analytics and forecasting, plus AI-driven insights, customers transform complex, real-time data into actionable intelligence. What's been key to our success? Our people: we value everyone's unique perspective and the energy they bring to the organization. We collaborate openly across teams and borders. We embrace a growth mindset to get results. And we celebrate shared success as goals and milestones are achieved. Ready to join a team where innovation meets collaboration? If you're driven by bold ideas and a customer-centric mentality, your next adventure starts here!

We are currently looking for a Senior Premium Support Specialist to join our team on an early-morning Indian shift (5:30 AM to 2:30 PM IST). In this role, you will be accountable for providing assistance on a range of planning solutions developed for some of Board's key accounts. The Premium Support team plays a pivotal role in Board's Customer Success strategy by providing industry-leading post-implementation support. Through regular service review meetings, our Support Specialists are expected to maintain a strong grasp of our customers' ever-changing business and functional requirements whilst helping them understand how Board can be used to achieve their goals.

Key Responsibilities and Objectives:
- Provide qualified functional and technical assistance for existing customer Board planning and reporting solutions.
- Participate in extensive knowledge-transfer processes between delivery and maintenance teams.
- Articulate, in deep technical detail, how Board functionality can be used to meet customer requirements and find solutions to business problems.
- Identify areas for improvement in existing applications.
- Work closely with the Board Product team by relaying customer and market feedback.
- Assist Senior Specialists in meetings to provide insights into new features and functionality introduced in the Board Platform.
- Provide reactive support for existing customers when questions or issues arise in their existing applications.

Requirements:
- Educational background in Business, Finance, Accounting, Computer Science, Management Information Systems (MIS), Mathematics, or any relevant technical field.
- Experience with systems like Anaplan, TM1, Oracle, o9, JDA/Blue Yonder, or SAP is preferable.
- Previous support or consulting experience within supply chain, FP&A, or retail planning.
- Good understanding of financial processes (financial consolidation and lease reporting, for example) is beneficial.
- Exposure to multi-dimensional or OLAP technology preferred.
- Knowledge of SQL advantageous.
- Great de-escalation skills and the capacity to work in very tight time frames.
- Strong troubleshooting, root-cause analysis, and reverse-engineering capacity.
- Ability to grasp elaborate business requirements and translate them into solutions within the Board platform.
- Excellent written and verbal communication skills.

Board International is an equal opportunity employer and is committed to a diverse and inclusive workforce.
Posted 1 month ago
7.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Job Description
We are currently looking for a Staff Engineer based in Pune with a proven background in building high-scale applications and data processing pipelines to work on the Real-time Reporting & Analytics pipeline at Zendesk. As a Staff Engineer, you would be a technical leader within the organisation, working towards delivering the Realtime Reporting and Analytics vision at Zendesk. The ideal candidate has experience building large-scale cloud applications, preferably in a SaaS environment, and is skilled in real-time data pipelines, stream processing, and OLAP databases. You should have successfully led technical initiatives and delivered large-scale production workloads.

Note: This is a hybrid role, combining remote and on-site work, requiring 3 days in our Pune office.

What You'll Be Doing
- Be responsible for the design, development, and maintenance of the Real-time Reporting platform at Zendesk.
- Work collaboratively with a small, focused, and self-organising team to deliver high-impact outcomes for Zendesk customers.
- Mentor and guide engineers, fostering a culture of continuous learning and technical excellence.
- Translate business needs into technical requirements by engaging effectively with business owners and stakeholders.
- Lead technical initiatives, ensuring delivery of scalable and robust solutions.

What you bring to the role
- 7+ years of experience working on high-scale applications and data processing pipelines.
- Proven experience in both software engineering and data engineering, with a focus on delivering large-scale, distributed, high-quality applications.
- Hands-on experience with real-time data pipelines, stream processing, and OLAP databases (e.g. ClickHouse, Snowflake, Kafka, ElasticSearch).
- Demonstrated ability to lead engineering projects and mentor junior engineers.
- Ability to design scalable and robust data architectures for real-time analytics.
- Strong communication skills, both written and verbal, and the ability to collaborate with teams across multiple time zones globally.
- Strong proficiency in Java is preferred, along with hands-on experience with technologies like AWS, ClickHouse, Kafka, Docker, and Kubernetes.

Please note that Zendesk can only hire candidates who are physically located and plan to work from Karnataka or Maharashtra. Please refer to the location posted on the requisition for where this role is based.

Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration - while also giving you flexibility to work remotely for part of the week. This role must attend our local office for part of the week. The specific in-office schedule is to be determined by the hiring manager.

The Intelligent Heart Of Customer Experience
Zendesk software was built to bring a sense of calm to the chaotic world of customer service. Today we power billions of conversations with brands you know and love. Zendesk believes in offering our people a fulfilling and inclusive experience. Our hybrid way of working enables us to purposefully come together in person, at one of our many Zendesk offices around the world, to connect, collaborate and learn, whilst also giving our people the flexibility to work remotely for part of the week. Zendesk is an equal opportunity employer, and we’re proud of our ongoing efforts to foster global diversity, equity, & inclusion in the workplace. Individuals seeking employment and employees at Zendesk are considered without regard to race, color, religion, national origin, age, sex, gender, gender identity, gender expression, sexual orientation, marital status, medical condition, ancestry, disability, military or veteran status, or any other characteristic protected by applicable law. We are an AA/EEO/Veterans/Disabled employer.
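The stream-processing work described in this role typically starts with windowed aggregation before results land in an OLAP store. A minimal, in-memory Python sketch of a tumbling-window count (production pipelines would use Kafka plus a stream processor; the event names are invented):

```python
from collections import Counter, defaultdict

def tumbling_counts(events, window_sec=60):
    """Count events per key in fixed-size (tumbling) time windows.

    Each event is (epoch_seconds, key). Events are bucketed by
    truncating the timestamp to the start of its window.
    """
    windows = defaultdict(Counter)
    for ts, key in events:
        window_start = ts - (ts % window_sec)  # align to window boundary
        windows[window_start][key] += 1
    return dict(windows)

# Hypothetical sample events: two in the first minute, one in the second.
events = [(0, "ticket_created"), (30, "ticket_created"), (65, "ticket_solved")]
print(tumbling_counts(events))
```

A real pipeline also has to handle late and out-of-order events (watermarks), which this sketch deliberately ignores.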
If you are based in the United States and would like more information about your EEO rights under the law, please click here. Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law. If you are an individual with a disability and require a reasonable accommodation to submit this application, complete any pre-employment testing, or otherwise participate in the employee selection process, please send an e-mail to peopleandplaces@zendesk.com with your specific accommodation request.
Posted 1 month ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Experience - 6+ Years
Location: Gurgaon (Hybrid)
Budget: 15-18 LPA

Roles and Responsibilities
- Formulate automated reports and dashboards using Power BI and other reporting tools.
- Understand business requirements to set functional specifications for reporting applications.
- Be familiar with the tools and systems of the MS SQL Server BI Stack, including SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Exhibit a foundational understanding of database concepts such as relational database architecture, multidimensional database design, and more.
- Design data models that transform raw data into insightful knowledge by understanding business requirements in the context of BI.
- Develop technical specifications from business needs, and set a deadline for work completion.
- Create charts and data documentation that includes descriptions of the techniques, parameters, models, and relationships.
- Use Power BI Desktop to create dashboards, KPI scorecards, and visual reports.
- Establish row-level security on data and understand Power BI's application security layer models.
- Examine, comprehend, and study business needs as they relate to business intelligence.
- Design and map data models to transform raw data into insightful information.
- Create dynamic and eye-catching dashboards and reports using Power BI.
- Make necessary tactical and technological adjustments to enhance current business intelligence systems.
- Integrate data, transform data, and connect to data sources for business intelligence.

Requirements and Skills
- Extremely good communication skills are necessary to effectively explain requirements between internal teams and client teams.
- Exceptional analytical thinking skills for converting data into illuminating dashboards and reports.
- BS in Computer Science or Information Systems, along with work experience in a related field.
- Knowledge of data warehousing, data gateways, and data preparation projects.
- Working knowledge of the Power BI, SSAS, SSRS, and SSIS components of the Microsoft Business Intelligence Stack.
- Articulating, representing, and analyzing solutions with the team while documenting, creating, and modeling them.
- Familiarity with the tools and technologies used by the Microsoft SQL Server BI Stack, including SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Knowledge of executing DAX queries in Power BI Desktop.
- Comprehensive understanding of data modeling, administration, and visualization.
- Capacity to perform in an atmosphere where agility and continual development are prioritized.
- Detailed knowledge and understanding of database management systems, OLAP, and the ETL (Extract, Transform, Load) framework.
- Awareness of BI technologies (e.g., Microsoft Power BI, Oracle BI).
- Expertise in SQL queries, SSRS, and SQL Server Integration Services (SSIS).

NOTE: Staffing & Recruitment companies are advised not to contact us.
Posted 1 month ago
5.0 years
0 Lacs
India
Remote
Job Title: Senior BI Developer (Microsoft BI Stack)
Location: Remote
Experience: 5+ years
Employment Type: Full-Time

Job Summary
We are looking for an experienced Senior BI Developer with strong expertise in the Microsoft BI Stack (SSIS, SSRS, SSAS) to join our dynamic team. The ideal candidate will design and develop scalable BI solutions and contribute to strategic decision-making through efficient data modeling, ETL processes, and insightful reporting.

Key Responsibilities
- Design and develop ETL packages using SSIS for data extraction, transformation, and loading from diverse sources.
- Create and maintain dashboards and reports using SSRS and Power BI (if applicable).
- Implement and manage OLAP cubes and data models using SSAS (Multidimensional/Tabular).
- Develop and optimize complex T-SQL queries, stored procedures, and functions.
- Work closely with business analysts, data engineers, and stakeholders to gather requirements and translate them into technical solutions.
- Optimize BI solutions for performance and scalability.
- Lead BI architecture improvements and ensure efficient data flow.
- Ensure data quality, integrity, and consistency across systems.
- Mentor and support junior BI developers as needed.

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of hands-on experience with the Microsoft BI Stack: SSIS, SSRS, SSAS.
- Strong knowledge of SQL Server (2016 or later) and advanced T-SQL.
- Deep understanding of data warehousing concepts, including star/snowflake schemas and fact/dimension models.
- Experience with Power BI is a plus.
- Exposure to Azure Data Services (ADF, Azure SQL, Synapse) is an added advantage.
- Strong analytical, troubleshooting, and problem-solving skills.
- Excellent verbal and written communication skills.

Why Join Us?
- Opportunity to work on enterprise-scale BI projects.
- Supportive work environment with career growth potential.
- Exposure to modern BI tools and cloud technologies.
Skills: azure,power bi,ssis,t-sql,ssrs,data,ssas,sql,sql server,azure data services
Posted 1 month ago
4.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: ETL Test Engineer
Experience range: 4-10 years
Location: Hyderabad only

Job description:
1. Minimum 4 to 6 years of experience in ETL testing.
2. SQL - Expert-level knowledge of core SQL concepts and query writing.
3. ETL automation - Experience in Datagap; experience with tools like Informatica, Talend, and Ab Initio is good to have.
4. Experience in query optimization, stored procedures/views, and functions.
5. Strong familiarity with data warehouse projects and data modeling.
6. Understanding of BI concepts - OLAP vs OLTP - and deploying applications on cloud servers.
7. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.).
8. Azure DevOps/JIRA - Hands-on experience with any test management tool, preferably ADO or JIRA.
9. Agile concepts - Good understanding of agile methodology (Scrum, Lean, etc.).
10. Communication - Good communication skills to understand and collaborate with all stakeholders within the project.
Posted 1 month ago
0 years
3 - 6 Lacs
Gurgaon
On-site
Required skills: Strong working knowledge of modern programming languages, ETL/data integration tools (preferably SnapLogic), and an understanding of cloud concepts. SSL/TLS, SQL, REST, JDBC, JavaScript, JSON. Strong hands-on experience in SnapLogic design/development. Good working experience using various Snaps for JDBC, SAP, files, REST, SOAP, etc. Good to have the ability to build complex mappings with JSON path expressions, flat files, and Python scripting. Good to have experience in Groundplex and Cloudplex integrations. Should be able to deliver the project by leading a team of 6-8 members. Should have experience in integration projects with heterogeneous landscapes. Experience in one or more RDBMS (Oracle, DB2, SQL Server, PostgreSQL, and Redshift). Real-time experience working with OLAP & OLTP database models (dimensional models). About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 1 month ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Company Overview Docusign brings agreements to life. Over 1.5 million customers and more than a billion people in over 180 countries use Docusign solutions to accelerate the process of doing business and simplify people’s lives. With intelligent agreement management, Docusign unleashes business-critical data that is trapped inside of documents. Until now, these were disconnected from business systems of record, costing businesses time, money, and opportunity. Using Docusign’s Intelligent Agreement Management platform, companies can create, commit, and manage agreements with solutions created by the #1 company in e-signature and contract lifecycle management (CLM). What you'll do Docusign is seeking a talented and results-oriented Data Engineer to focus on delivering trusted data to the business. As a member of the Global Data Analytics (GDA) Team, the Data Engineer leverages a variety of technologies to design, develop and deliver new features in addition to loading, transforming and preparing data sets of all shapes and sizes for teams around the world. During a typical day, the Engineer will spend time developing new features to analyze data, develop solutions and load tested data sets into the Snowflake Enterprise Data Warehouse. The ideal candidate will demonstrate a positive “can do” attitude, a passion for learning and growing, and the drive to work hard and get the job done in a timely fashion. This individual contributor position provides plenty of room to grow -- a mix of challenging assignments, a chance to work with a world-class team, and the opportunity to use innovative technologies such as AWS, Snowflake, dbt, Airflow and Matillion. This position is an individual contributor role reporting to the Manager, Data Engineering.
Responsibility
- Design, develop and maintain scalable and efficient data pipelines
- Analyze and develop data quality and validation procedures
- Work with stakeholders to understand the data requirements and provide solutions
- Troubleshoot and resolve data issues in a timely manner
- Learn and leverage available AI tools for increased developer productivity
- Collaborate with cross-functional teams to ingest data from various sources
- Evaluate and improve data architecture and processes continuously
- Own, monitor, and improve solutions to ensure SLAs are met
- Develop and maintain documentation for data infrastructure and processes
- Execute projects using Agile Scrum methodologies and be a team player

Job Designation Hybrid: Employee divides their time between in-office and remote work. Access to an office location is required. (Frequency: Minimum 2 days per week; may vary by team but will be weekly in-office expectation) Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote and are specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law.
What you bring
Basic
- Bachelor’s Degree in Computer Science, Data Analytics, Information Systems, etc.
- Experience developing data pipelines in one of the following languages: Python or Java
- 8+ years of dimensional and relational data modeling experience
- Excellent SQL and database management skills
Preferred
- 8+ years in data warehouse engineering (OLAP): Snowflake, BigQuery, Teradata
- 8+ years with transactional databases (OLTP): Oracle, SQL Server, MySQL
- 8+ years with big data: Hadoop, Data Lake, Spark in a cloud environment (AWS)
- 8+ years with commercial ETL tools: dbt, Matillion, etc.
- 8+ years delivering ETL solutions from source systems, databases, APIs, flat files, JSON
- Experience developing Entity Relationship Diagrams with Erwin, SQLDBM, or equivalent
- Experience working with job scheduling and monitoring systems (Airflow, Datadog, AWS SNS)
- Familiarity with Gen AI tools like GitHub Copilot and dbt Copilot; good understanding of Gen AI application frameworks; knowledge of any agentic platforms
- Experience building BI dashboards with tools like Tableau
- Experience in the financial domain, master data management (MDM), sales and marketing, accounts payable, accounts receivable, invoicing
- Experience managing work assignments using tools like Jira and Confluence
- Experience with Scrum/Agile methodologies
- Ability to work independently and as part of a team
- Excellent analytical, problem-solving, and communication skills

Life at Docusign Working here Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what’s right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life.
Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you’ll be loved by us, our customers, and the world in which we live. Accommodation Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need such an accommodation, or a religious accommodation, during the application process, please contact us at accommodations@docusign.com. If you experience any issues, concerns, or technical difficulties during the application process please get in touch with our Talent organization at taops@docusign.com for assistance. Applicant and Candidate Privacy Notice
Posted 1 month ago
0 years
2 - 5 Lacs
Mumbai
On-site
Global Finance Analyst Power BI – Analysis & Insight Lloyd’s Register Location: Mumbai, India What we’re looking for: Convert financial data into informative visual reports and dashboards that help inform decision making. What we offer you: The opportunity to work for an organization that has a strong sense of purpose, is value-driven, and helps colleagues develop professionally and personally through our range of people development programmes. A full-time, permanent role. The role: Build automated reports and dashboards with the help of Power BI and other reporting tools. Extract data from various sources to transform raw data into meaningful insights to support senior leadership teams, executive leadership teams and the FP&A leads. Develop models/reports, delivering the desired data visualisation and business analytics results to support decision making. Support FP&A ad hoc analysis. What you bring: Qualified accountant (ACA or CIMA), currently operating at a senior finance level in a global organisation. Able to perform at the highest levels whilst also demonstrating the ability to be hands-on when required. The appointee will measure their success by results and will have the resilience and maturity to manage internal relationships in an organisation going through rapid change. Experience of international multi-site and multi-currency organisations. Experience in handling data preparation – collecting (from various sources), organising and cleaning data to extract valuable insights. Data modelling experience and understanding of different technologies such as OLAP, statistical analysis, computer science algorithms, databases, etc. Knowledge and experience working with Business Intelligence tools and systems like SAP, Power BI, Tableau, etc., preferably complemented by associated skills such as SQL, Power Query, DAX, Python, R, etc.
Experience of international multi-site commercial/operational activity. Ability to drill down and visualize data in the best possible way using charts, reports, or dashboards generated using Power BI. Ability to understand and assess complex and sometimes unfamiliar situations, visualise solutions, see them through to resolution, and work effectively within a matrix organisation. Ability to work successfully within a Finance Shared Service Centre model. Good attention to detail, with a keen eye for errors and flaws in the data, to help LR work with the cleanest, most accurate data. Strong communication skills. You are someone who: Is keen to take accountability and ownership for delivering customer needs. Can self-manage and prioritize tasks towards achieving goals. Is effective at solving problems, troubleshooting and making timely decisions. Is flexible and eager to take initiative. Communicates in a structured way and has the ability to present technical ideas in user-friendly language. Displays a team spirit, particularly in a multicultural environment. Responds positively to learning opportunities and is comfortable stepping out of their own comfort zone. #LI-KC1 #LI-Hybrid
Posted 1 month ago
4.0 - 9.0 years
20 - 30 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities: Understanding operational needs by collaborating with specialized teams. Supporting key business operations; this involves supporting architecture design and improvements, ensuring data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions. Lead a team of developers, running sprint planning and execution to ensure timely deliveries. Technical skills, qualification and experience required: Proficient in data modelling, with 5-10 years of experience in data modelling. Experience in data modelling tools (Erwin) and building ER diagrams. Hands-on experience with the Erwin/Visio tools. Hands-on expertise in entity-relationship, dimensional and NoSQL modelling. Familiarity with manipulating datasets using Python. Exposure to Azure cloud services (Azure Data Factory, Azure DevOps and Databricks). Exposure to UML tools like Erwin/Visio. Familiarity with tools such as Azure DevOps, Jira and GitHub. Analytical approaches using IE or other common notations. Strong hands-on experience in SQL scripting. Bachelor's/Master's degree in Computer Science or a related field. Experience leading agile scrum, sprint planning and review sessions. Good communication and interpersonal skills. Good communication skills to coordinate between business stakeholders & engineers. Strong results-orientation and time management. A true team player who is comfortable working in a global team. Ability to establish relationships with stakeholders quickly in order to collaborate on use cases. Autonomy, curiosity and innovation capability. Comfortable working in a multidisciplinary team within a fast-paced environment. * Immediate joiners will be preferred; outstation candidates will not be considered.
Posted 1 month ago
7.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About The Role We are seeking an experienced Senior Backend Developer with strong expertise in Java, Spring framework, and high availability service design. This role will be pivotal in designing, developing, and optimizing robust backend systems that power our index and product generation platforms while providing technical leadership within the team. You'll be joining a dynamic team focused on solving complex challenges in delivering near real-time financial data with high throughput and resiliency requirements. About The Team This is an excellent opportunity to join the Index IT team, as part of a delivery-focused IT group responsible for designing, developing and supporting internal, client and public-facing distribution solutions. If selected, you will work as part of a delivery focused and talented software development team responsible for designing, developing and supporting the index and product generation platforms. You will use cutting edge software development techniques and technologies, following the best practices of the industry. Our team solves challenging problems around delivering near real-time financial data, working with large flexible schemas and building database systems that provide exceptional throughput and resiliency. We leverage the latest technologies including Kubernetes, continuous integration/deployment pipelines, and build highly observable applications. MSCI provides a very attractive compensation package, an exciting work environment and opportunities for continuous self-development and career advancement for the right candidates. 
Key Responsibilities Design, develop, and maintain scalable, high-performance backend applications using Java and Spring framework Lead the architecture and implementation of complex API services that interact with high availability database systems Develop solutions for processing and delivering near real-time financial data streams Design flexible schemas that can accommodate evolving financial data requirements Collaborate closely with product managers, business analysts, and other developers to translate business requirements into technical solutions Design and optimize OLAP database interactions for analytical performance and high availability Implement observable applications with comprehensive monitoring and logging Design and develop RESTful APIs following industry best practices Lead code reviews and mentor junior developers on team best practices Participate in the full software development lifecycle from requirements analysis through deployment Troubleshoot and resolve complex production issues in high-throughput systems Evaluate and recommend new technologies and approaches to improve system performance and developer productivity Contribute to technical documentation and system design specifications Preferred Qualifications Master's degree in Computer Science, Software Engineering, or related field Experience with Kubernetes and containerized application deployment Experience with observability frameworks such as OpenTelemetry (OTEL) Proficiency with continuous integration and deployment methodologies (CI/CD) Knowledge of cloud platforms (AWS, Azure, or GCP) Experience with microservices architecture Experience with containerization technologies (Docker) Understanding of DevOps practices Experience with message brokers (Kafka, RabbitMQ) Background in agile development methodologies Experience with test-driven development and automated testing frameworks Familiarity with financial data models and structures Background in financial services or experience with financial data
Required Qualifications Bachelor's degree in Computer Science, Information Technology, or related field 7+ years of professional experience in backend software development 5+ years of experience with Java programming and core Java concepts 3+ years of experience with Spring framework (Spring Boot, Spring MVC, Spring Data) Familiarity with OLAP concepts and high availability database design principles Experience building systems that handle large data volumes with high throughput requirements Proficiency in SQL and database optimization techniques Experience with RESTful API design and implementation Solid understanding of design patterns and object-oriented programming Experience with version control systems (Git) Strong problem-solving skills and attention to detail Excellent communication skills to collaborate effectively across teams and explain technical concepts to non-technical stakeholders What We Offer You Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups.
All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose - to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. 
Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 1 month ago
5.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Introduction A career in IBM Consulting embraces long-term relationships and close collaboration with clients across the globe. In this role, you will work for IBM BPO, part of Consulting, which accelerates digital transformation using agile methodologies, process mining, and AI-powered workflows. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including IBM Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be supported by mentors and coaches who will encourage you to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and learning opportunities in an environment that embraces your unique skills and experience. Your Role And Responsibilities Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management and associated technologies. Communicate risks and ensure understanding of these risks. Graduate with a minimum of 5+ years of related experience required. Experience in modelling and business system designs. Good hands-on experience with DataStage and cloud-based ETL services. Strong expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques.
Preferred Education: Master's Degree. Required Technical And Professional Expertise: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Preferred Technical And Professional Experience: Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Hyderabad
Work from Office
About the Role We are hiring a Staff Data Engineer to join our India Operations and play a crucial role in our mission to establish a world-class data engineering team within the Center for Data and Insights (CDI). Reporting directly to the Director of Data Engineering, you will be a key contributor, advancing our data engineering capabilities in the AWS and GCP ecosystems. Your responsibilities include collaborating with key stakeholders, guiding and mentoring fellow data engineers, and working hands-on in various domains such as data architecture, data lake infrastructure, and data and ML job orchestration. Your contributions will ensure the consistency and reliability of data and insights, aligning with our objective of enabling well-informed decision-making. The ideal candidate will demonstrate an empathetic and service-oriented approach, fostering a thriving data and insights culture while enhancing and safeguarding our data infrastructure. This role presents a unique opportunity to build and strengthen our data engineering platforms at a global level. If you are an experienced professional with a passion for impactful data engineering initiatives and a commitment to driving transformative changes, we encourage you to explore this role. Joining us as a Staff Data Engineer allows you to significantly contribute to the trajectory of our CDI, making a lasting impact on our data-centric aspirations as we aim for new heights. Core Areas of Responsibility Implement robust data infrastructure, platforms, and solutions. Collaborate effectively with cross-functional teams and CDI leaders, ensuring the timely delivery of data loads and jobs tailored to their unique needs. Guide and mentor a team of skilled data engineers, prioritizing a service-oriented approach and quick response times. Advocate for the enhancement of, and adherence to, high data quality standards, KPI certification methods, and engineering best practices.
Approach reporting platforms and analytical processes with innovative thinking, considering the evolving demands of the business. Implement the strategy for migrating from AWS to GCP with near real-time events and machine learning pipelines, using our customer data platform (Segment) and purpose-built pipelines and DBs to activate systems of intelligence. Continuously improve reporting workflows and efficiency, harnessing the power of automation whenever feasible. Enhance the performance, reliability, and scalability of the storage and compute layers of the data lake. About You We get excited about candidates, like you, because... 8+ years of hands-on experience in data engineering and/or software development. Highly skilled in programming languages like Python, Spark & SQL. Comfortable using BI tools like Tableau, Looker, Preset, and so on. Proficient in utilizing event data collection tools such as Snowplow, Segment, Google Tag Manager, Tealium, mParticle, and more. Comprehensive expertise across the entire lifecycle of implementing compute and orchestration tools like Databricks, Airflow, Talend, and others. Skilled in working with streaming OLAP engines like Druid, ClickHouse, and similar technologies. Experience leveraging AWS services including EMR Spark, Redshift, Kinesis, Lambda, Glue, S3, and Athena, among others. Nice to have exposure to GCP services like BigQuery, Google Storage, Looker, Google Analytics, and so on. Good understanding of building real-time data systems as well as AI/ML personalization products. Experience with Customer Data Platforms (CDPs) and Data Management Platforms (DMPs), contributing to holistic data strategies. Familiarity with high-security environments like HIPAA, PCI, or similar contexts, highlighting a commitment to data privacy and security. Accomplished in managing large-scale data sets, handling terabytes of data and billions of records effectively.
You hold a Bachelor's degree in Computer Science, Information Systems, or a related field, providing strong foundational knowledge.
Posted 1 month ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Company Overview Docusign brings agreements to life. Over 1.5 million customers and more than a billion people in over 180 countries use Docusign solutions to accelerate the process of doing business and simplify people’s lives. With intelligent agreement management, Docusign unleashes business-critical data that is trapped inside of documents. Until now, these were disconnected from business systems of record, costing businesses time, money, and opportunity. Using Docusign’s Intelligent Agreement Management platform, companies can create, commit, and manage agreements with solutions created by the #1 company in e-signature and contract lifecycle management (CLM). What you'll do Docusign is seeking a talented and results oriented Data Engineer to focus on delivering trusted data to the business. As a member of the Global Data Analytics (GDA) Team, the Data Engineer leverages a variety of technologies to design, develop and deliver new features in addition to loading, transforming and preparing data sets of all shapes and sizes for teams around the world. During a typical day, the Engineer will spend time developing new features to analyze data, develop solutions and load tested data sets into the Snowflake Enterprise Data Warehouse. The ideal candidate will demonstrate a positive “can do” attitude, a passion for learning and growing, and the drive to work hard and get the job done in a timely fashion. This individual contributor position provides plenty of room to grow -- a mix of challenging assignments, a chance to work with a world-class team, and the opportunity to use innovative technologies such as AWS, Snowflake, dbt, Airflow and Matillion. This position is an individual contributor role reporting to the Manager, Data Engineering. Responsibility Design, develop and maintain scalable and efficient data pipelines Analyze and Develop data quality and validation procedures. 
Work with stakeholders to understand the data requirements and provide solutions Troubleshoot and resolve data issues in a timely manner Learn and leverage available AI tools for increased developer productivity Collaborate with cross-functional teams to ingest data from various sources Evaluate and improve data architecture and processes continuously Own, monitor, and improve solutions to ensure SLAs are met Develop and maintain documentation for data infrastructure and processes Execute projects using Agile Scrum methodologies and be a team player Job Designation Hybrid: Employee divides their time between in-office and remote work. Access to an office location is required. (Frequency: Minimum 2 days per week; may vary by team but will be weekly in-office expectation) Positions at Docusign are assigned a job designation of either In Office, Hybrid or Remote and are specific to the role/job. Preferred job designations are not guaranteed when changing positions within Docusign. Docusign reserves the right to change a position's job designation depending on business needs and as permitted by local law.
What you bring
Basic
- Bachelor’s Degree in Computer Science, Data Analytics, Information Systems, etc.
- Experience developing data pipelines in one of the following languages: Python or Java
- 5+ years of dimensional and relational data modeling experience
- Excellent SQL and database management skills
Preferred
- 5+ years in data warehouse engineering (OLAP): Snowflake, BigQuery, Teradata, Redshift
- 5+ years with transactional databases (OLTP): Oracle, SQL Server, MySQL
- 5+ years with big data: Hadoop, Data Lake, Spark in a cloud environment (AWS)
- 5+ years with commercial ETL tools: dbt, Matillion, etc.
- 5+ years delivering ETL solutions from source systems, databases, APIs, flat files, JSON
- Experience developing Entity Relationship Diagrams with Erwin, SQLDBM, or equivalent
- Experience working with job scheduling and monitoring systems (Airflow, Datadog, AWS SNS)
- Familiarity with Gen AI tools like GitHub Copilot and dbt Copilot; good understanding of Gen AI application frameworks; knowledge of any agentic platforms
- Experience building BI dashboards with tools like Tableau
- Experience in the financial domain, sales and marketing, accounts payable, accounts receivable, invoicing
- Experience managing work assignments using tools like Jira and Confluence
- Experience with Scrum/Agile methodologies
- Ability to work independently and as part of a team
- Excellent analytical, problem-solving, and communication skills

Life at Docusign Working here Docusign is committed to building trust and making the world more agreeable for our employees, customers and the communities in which we live and work. You can count on us to listen, be honest, and try our best to do what’s right, every day. At Docusign, everything is equal. We each have a responsibility to ensure every team member has an equal opportunity to succeed, to be heard, to exchange ideas openly, to build lasting relationships, and to do the work of their life.
Best of all, you will be able to feel deep pride in the work you do, because your contribution helps us make the world better than we found it. And for that, you’ll be loved by us, our customers, and the world in which we live. Accommodation Docusign is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need such an accommodation, or a religious accommodation, during the application process, please contact us at accommodations@docusign.com. If you experience any issues, concerns, or technical difficulties during the application process please get in touch with our Talent organization at taops@docusign.com for assistance. Applicant and Candidate Privacy Notice
Posted 1 month ago
2.0 - 3.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Total Experience Expected: 2-3 years
Looking for a OneStream consultant with hands-on implementation experience.
Skills:
Should be proficient in building workflows, dashboards, and cube views.
Experience in financial consolidation, planning models, and system integrations is a strong plus.
Excellent analytical, problem-solving, and communication skills.
Ability to work independently and collaboratively in a fast-paced environment.
Communication Skills, Workflow, Analytical & Problem Solving, Cube, Dashboards, Financial Consolidation
Posted 1 month ago
0 years
6 - 8 Lacs
Hyderābād
On-site
Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions – we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Assistant Vice President - Senior Data Modeler – Retail Banking. We are seeking a seasoned Senior Data Modeler with deep expertise in Retail Banking to provide thought leadership in data modeling and architecture. The ideal candidate will have extensive experience in designing scalable, future-proof data models that support key banking functions such as customer accounts, deposits, loans, credit cards, transactions, and payments. This role requires strong domain expertise, ensuring data models align with operational efficiency, analytics, and automation while adhering to industry regulations. 
Key Responsibilities
Deep Domain Expertise:
Develop conceptual, logical, and physical data models for core Retail Banking processes, including customer onboarding, account management, deposits, lending, payments, risk management, and fraud detection.
Align data models with evolving business needs to support real-time and batch processing for transactional (OLTP) and analytical (OLAP) systems.
Bridge business processes with data architecture, ensuring models enhance operational efficiency and enable advanced analytics.
Collaboration & Leadership:
Collaborate with key Retail Banking stakeholder groups, including decision-makers, analysts, operational users, and compliance teams.
Engage with these personas to gather and interpret data requirements, ensuring models are tailored to drive actionable insights and support effective decision-making.
Provide technical guidance and mentorship to junior data modelers and analysts.
Facilitate review sessions and present model designs to stakeholders.
Compliance & Regulatory Considerations:
Ensure data models adhere to industry regulations, including Basel III, GDPR, CCPA, PCI-DSS, KYC, AML (Anti-Money Laundering), and SOX (Sarbanes-Oxley).
Incorporate data governance principles to ensure auditability, security, and regulatory reporting compliance.
Technical Excellence:
Utilize industry-standard data modeling tools (e.g., Erwin, PowerDesigner, or equivalent) to develop and maintain models.
Stay current with emerging trends and technologies in data management, data warehousing, and analytics.
Collaborate with ETL developers and database administrators to ensure seamless data integration and performance optimization.
Process Improvement:
Identify opportunities for process improvements and automation within business data workflows.
Support continuous improvement initiatives to streamline business operations and reporting.

Qualifications we seek in you!
Minimum Qualifications
Bachelor's degree in business information systems (IS), computer science, or a related field, or equivalent related IT experience
Extensive experience in data modeling and data architecture, specifically in the Wealth Management domain
Prior experience working closely with diverse stakeholder groups to capture detailed data requirements
Proven expertise in designing and managing data models for wealth management processes (portfolio management, risk analysis, client segmentation, asset allocation, etc.)
Preferred Qualifications/Skills
Strong hands-on experience with relational, dimensional, and/or analytic data platforms (RDBMS, dimensional, NoSQL data platform technologies, and ETL and data ingestion protocols)
Expertise in data modeling principles/methods, including conceptual, logical, and physical data models
Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required
Strong knowledge of data modeling and related tools (Erwin, ER Studio, PowerDesigner, or others) required
Strong understanding of Wealth Management business processes, including portfolio management, risk analysis, client segmentation, asset allocation, etc.
Ability to clearly translate complex data models into actionable business insights
Excellent analytical, problem-solving, and communication skills
Strong interpersonal skills and the ability to work collaboratively with both technical and business teams
Knowledge of compliance and regulatory requirements (e.g., SEC, FINRA, GDPR/CCPA)

Why join Genpact?
Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
Make an impact – Drive change for global enterprises and solve business challenges that matter
Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress
Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Job: Assistant Vice President
Primary Location: India-Hyderabad
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 19, 2025, 6:13:22 AM
Unposting Date: Dec 16, 2025, 10:13:22 AM
Master Skills List: Digital
Job Category: Full Time
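As an illustration of the conceptual-to-logical data modeling this role centers on, the sketch below expresses a tiny retail banking logical model as plain Python dataclasses; the entities, fields, and sample values are invented assumptions for demonstration, not a prescribed banking schema.

```python
# Illustrative logical model for a few core Retail Banking entities.
# Entity and field names are assumptions for demonstration only.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Customer:
    customer_id: int
    name: str
    onboarded_on: date

@dataclass
class Account:
    account_id: int
    customer_id: int           # foreign key -> Customer
    product_type: str          # e.g. "deposit", "loan", "credit_card"
    balance: float = 0.0

@dataclass
class Transaction:
    txn_id: int
    account_id: int            # foreign key -> Account
    amount: float              # positive = credit, negative = debit
    posted_on: date = field(default_factory=date.today)

# A physical model would map each entity to a table, index the foreign
# keys, and partition Transaction by posting date for OLTP/OLAP use.
cust = Customer(1, "A. Kumar", date(2024, 1, 15))
acct = Account(10, cust.customer_id, "deposit", balance=500.0)
txn = Transaction(100, acct.account_id, -50.0)
acct.balance += txn.amount
print(acct.balance)  # → 450.0
```

The same three-entity shape is what a tool like Erwin or PowerDesigner would capture as an ER diagram before generating the physical DDL.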
Posted 1 month ago
8.0 - 10.0 years
7 - 8 Lacs
Hyderābād
On-site
Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What you will do
Let's do this. Let's change the world.

Role Description:
We are seeking an experienced Senior Manager, Data Engineering to lead and scale a strong team of data engineers. This role blends technical depth with strategic oversight and people leadership. The ideal candidate will oversee the execution of data engineering initiatives, collaborate with business analysts and multi-functional teams, manage resource capacity, and ensure delivery aligned to business priorities. In addition to technical competence, the candidate will be adept at managing agile operations and driving continuous improvement.

Roles & Responsibilities:
Possesses strong rapid prototyping skills and can quickly translate concepts into working code.
Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies.
Design, develop, and implement robust data architectures and platforms to support business objectives.
Oversee the development and optimization of data pipelines and data integration solutions.
Establish and maintain data governance policies and standards to ensure data quality, security, and compliance.
Architect and manage cloud-based data solutions, leveraging AWS or other preferred platforms.
Lead and motivate a strong data engineering team to deliver exceptional results.
Identify, analyze, and resolve complex data-related challenges.
Collaborate closely with business collaborators to understand data requirements and translate them into technical solutions.
Stay abreast of emerging data technologies and explore opportunities for innovation.
Lead and manage a team of data engineers, ensuring appropriate workload distribution, goal alignment, and performance management.
Work closely with business analysts and product collaborators to prioritize and align engineering output with business objectives.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree and 8 to 10 years of experience (computer science and engineering preferred; other engineering fields considered) OR Bachelor's degree and 10 to 14 years of experience (computer science and engineering preferred; other engineering fields considered) OR Diploma and 14 to 18 years of experience (computer science and engineering preferred; other engineering fields considered)
Demonstrated proficiency in using cloud platforms (AWS, Azure, GCP) for data engineering solutions.
Strong understanding of cloud architecture principles and cost optimization strategies.
Proficient in Python, PySpark, and SQL.
Hands-on experience with big data ETL performance tuning.
Proven ability to lead and develop strong data engineering teams.
Strong problem-solving, analytical, and critical thinking skills to address complex data challenges.
Strong communication skills for collaborating with business and technical teams alike.
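As a minimal illustration of one common big data ETL performance-tuning pattern relevant to the qualifications above, the sketch below replaces per-row scans of a reference dataset with a broadcast-style hash-join lookup; in PySpark the analogous tuning would broadcast the small DataFrame in a join. The dataset and field names are invented.

```python
# Broadcast-join-style lookup: build the small reference dataset once as a
# dict, then enrich each fact row in O(1) instead of rescanning the
# reference data per row. Names and data are invented for illustration.

def enrich(facts, small_dim):
    """facts: iterable of (id, key, amount); small_dim: iterable of (key, label)."""
    lookup = dict(small_dim)                     # built once, held in memory
    return [(fid, lookup.get(key, "UNKNOWN"), amount)
            for fid, key, amount in facts]

dim = [("IN", "India"), ("US", "United States")]
facts = [(1, "IN", 10.0), (2, "US", 20.0), (3, "XX", 5.0)]
print(enrich(facts, dim))
# → [(1, 'India', 10.0), (2, 'United States', 20.0), (3, 'UNKNOWN', 5.0)]
```

The win is the same one a broadcast join buys on a cluster: the small side is distributed once instead of shuffled, so each worker does constant-time lookups.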
Preferred Qualifications:
Experienced with data modeling and performance tuning for both OLAP and OLTP databases
Experienced with Apache Spark and Apache Airflow
Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
Experienced with AWS, GCP, or Azure cloud services

Professional Certifications:
AWS Certified Data Engineer preferred
Databricks certification preferred

Soft Skills:
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Strong presentation and public speaking skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 month ago
0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Global Finance Analyst Power BI – Analysis & Insight
Lloyd's Register
Location: Mumbai, India

What We're Looking For
Convert financial data into informative visual reports and dashboards that help inform decision making.

What We Offer You
The opportunity to work for an organization that has a strong sense of purpose, is value driven, and helps colleagues to develop professionally and personally through our range of people development programmes.
A full-time permanent role.

The role
Build automated reports and dashboards with the help of Power BI and other reporting tools.
Extract data from various sources to transform raw data into meaningful insights to support Senior Leadership Teams, Executive Leadership Teams, and the FP&A leads.
Develop models/reports, delivering the desired data visualisation and business analytics results to support decision making.
Support FP&A ad hoc analysis.

What You Bring
Qualified accountant (ACA or CIMA), currently operating at a senior finance level in a global organisation.
Able to perform at the highest levels whilst also demonstrating the ability to be hands-on when required. The appointee will measure their success by results and will have the resilience and maturity to manage internal relationships in an organisation going through rapid change.
Experience of international multi-site and multi-currency organisations.
Experience in data preparation: collecting (from various sources), organising, and cleaning data to extract valuable insights.
Data modelling experience and understanding of different technologies such as OLAP, statistical analysis, computer science algorithms, databases, etc.
Knowledge and experience working with Business Intelligence tools and systems like SAP, Power BI, Tableau, etc., preferably complemented by associated skills such as SQL, Power Query, DAX, Python, R, etc.
Experience of international multi-site commercial/operational activity.
Ability to drill down and visualize data in the best possible way using charts, reports, or dashboards generated using Power BI.
Ability to understand and assess complex and sometimes unfamiliar situations, visualise solutions, see them through to resolution, and work effectively within a matrix organisation.
Ability to work successfully within a Finance Shared Service Centre model.
Good attention to detail, with a keen eye for errors and flaws in the data, to help LR work with the cleanest, most accurate data.
Strong communication skills.

You are someone who:
Is keen to take accountability and ownership for delivering customer needs.
Can self-manage and prioritize tasks towards achieving goals.
Is effective at solving problems, troubleshooting, and making timely decisions.
Is flexible and eager to take initiative.
Communicates in a structured way and can present technical ideas in user-friendly language.
Displays team spirit, particularly in a multicultural environment.
Responds positively to learning opportunities and is comfortable stepping out of their comfort zone.

About Us
We are a leading international technical professional service provider and a leader in classification, compliance, and consultancy services to the marine and offshore industry, and a trusted advisor to our customers, helping to design, construct, and operate their assets to the highest levels of safety and performance. We are shaping the industry's future through the development of novel and innovative technology for the next generation of assets, while continuing to deliver solutions for our customers every day.

Be a part of
Lloyd's Register is wholly owned by the Lloyd's Register Foundation, a politically and financially independent global charity that aims to engineer a safer world through promoting safety and education.
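As a minimal illustration of the data-preparation work this posting describes (collecting, organising, and cleaning data before it reaches a BI tool like Power BI), the sketch below cleans a few raw finance records; the record layout, field names, and cleaning rules are invented assumptions.

```python
# Minimal data-cleaning pass over raw finance records: drop rows missing
# an amount, trim and normalise text fields, and coerce amounts to floats.
# Field names and the default currency are invented for illustration.

def clean(rows):
    out = []
    for r in rows:
        amount = r.get("amount")
        if amount in (None, ""):        # drop incomplete rows
            continue
        out.append({
            "entity": r.get("entity", "").strip(),
            "currency": r.get("currency", "GBP").upper(),  # assumed default
            "amount": float(amount),
        })
    return out

raw = [
    {"entity": " LR India ", "currency": "inr", "amount": "1250.50"},
    {"entity": "LR UK", "currency": "GBP", "amount": None},   # dropped
    {"entity": "LR US", "amount": "300"},                     # default currency
]
print(clean(raw))
# → [{'entity': 'LR India', 'currency': 'INR', 'amount': 1250.5},
#    {'entity': 'LR US', 'currency': 'GBP', 'amount': 300.0}]
```

In practice the same normalisation steps would live in Power Query or a SQL staging view, but the logic is the same regardless of tool.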
For a thriving ocean economy, Lloyd's Register colleagues and Lloyd's Register Foundation work together to fund research, foster industry collaboration, and develop action-oriented solutions to make the world a safer place.

Want to apply?
Here at Lloyd's Register, we care, we share and we do the right thing in every situation. It's ingrained in our culture and everything we do. We are committed, and continually strive, to lead with our values that empower and enable an inclusive environment conducive to your growth, development and engagement. It doesn't matter who you are, what you have experienced, how you identify, how old you are, where you are from, what your beliefs are or how your brain or body works – the diversity of our colleagues is fundamental to our futures and the changes we can make together. Our inclusive culture allows us to connect together authentically and to be courageous and bold. We don't just talk about our differences, we celebrate them!
We are committed to making all stages of our recruitment process accessible to all candidates. Please let us know if you need any assistance or reasonable adjustments throughout your application and we will do everything we possibly can to support you.
If you don't tick every box in these ads, please don't rule yourself out. We focus on hiring people who share our goal of working together for a safer, sustainable, thriving ocean economy. We care, we share, we do the right thing.
If you have further questions about this role, please contact us at careers@lr.org and we will respond to you as soon as possible.

Diversity and Inclusion at Lloyd's Register:
Together we are one Lloyd's Register, committed to developing an inclusive and safe workplace that embraces and celebrates diversity. We strive to ensure that all applicants to LR experience equality of opportunity and fair treatment, because we believe it is the right thing to do. We hope you do too.
As a Disability Confident Committed Employer, we have committed to:
ensuring our recruitment process is inclusive and accessible
communicating and promoting vacancies
offering an interview to disabled people who meet the minimum criteria for the job
anticipating and providing reasonable adjustments as required
supporting any existing employee who acquires a disability or long-term health condition, enabling them to stay in work
carrying out at least one activity that will make a difference for disabled people
Find out more about Disability Confident at: www.gov.uk/disability-confident
Copyright © Lloyd's Register 2024. All rights reserved. Terms of use. Privacy policy.
The Lloyd's Register Group comprises charities and non-charitable companies, with the latter supporting the charities in their main goal of enhancing the safety of life and property, at sea, on land and in the air - for the benefit of the public and the environment. (Group entities).
Posted 1 month ago
13.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Principal Software Engineer
The Platforms organization at Fanatics is at the heart of our company's data-driven decision making, building foundational capabilities that empower our application/data engineers and data scientists to unlock the power of data. We work relentlessly on enhancing the fan experience, with exciting projects across a diverse landscape, including:
Commerce Applications: Architecting and designing D2C and B2B commerce applications with workloads distributed across a multi-cloud topology to engage fans around the world.
Storage Infrastructure: Building and managing scalable and reliable data storage solutions.
Streaming Data Processing: Handling real-time data pipelines with high throughput and low latency.
Data & Workflow Orchestration: Coordinating complex data processing workflows with efficiency and reliability.
Messaging Infrastructure: Ensuring secure and efficient communication between applications.
Big Data Processing: Analyzing massive datasets with speed and accuracy.
Data Warehouse: Providing a centralized and accessible repository for historical data.
Real-Time OLAP Databases: Enabling fast and interactive data analysis for insights on the fly.
AI & ML Platforms: Building and maintaining a robust platform that supports the development and deployment of impactful ML models to power applications in areas such as recommender systems and inventory intelligence.

The Opportunity
We are seeking a passionate and experienced Principal Engineer to play a key role in shaping the future of our application, cloud, and data platforms at Fanatics. As a technical leader in the organization, you will be responsible for driving technical innovation, leading large initiatives, and mentoring junior engineers. You will have the opportunity to contribute to building scalable solutions that will empower our entire company to make data-driven decisions and operate more effectively.
Responsibilities
Design and drive the technical roadmap for the evolution of our platforms, ensuring they are scalable, reliable, and meet the evolving needs of the business.
Lead large initiatives within the broader Fanatics tech org, collaborating effectively with cross-functional teams including engineering, data science, and product management.
Provide mentorship and guidance to junior engineers, fostering their growth and development within the team.
Build data platforms that promote standardization, including data pipeline development, platform tooling, data lake formatting, and data democratization.
Build and maintain the AI/ML infrastructure to support Fanatics' AI/ML needs, with a focus on standardized MLOps practices, accelerating the adoption and deployment of impactful AI/ML applications across the company.
Champion data and AI governance best practices, establishing and enforcing data processing principles, design patterns, and practices.
Build strong cross-functional partnerships with teams across the organization, influencing them to adopt best practices and collaborate effectively.

Qualifications
13+ years of experience leading the development of modern cloud-based applications and their integration into a common data platform (or data mesh) to enable business intelligence and optimization.
Deep technical understanding of distributed systems architecture and the integration of operational systems with analytical systems to enable anomaly detection, business process mining, and optimization at scale.
Strong expertise in the big data ecosystem, including tools like Apache Kafka, Spark, Iceberg, Airflow, AWS S3, data modeling, data warehouses, OLAP databases, etc.
Proven experience in data processing, orchestration, data engineering, data quality management, and data governance.
Excellent communication skills, with the ability to collaborate effectively across teams and provide clear and concise technical guidance.
Experience with AI/ML platforms and a working knowledge of how data scientists leverage data for AI is a strong plus About Us Fanatics is building a leading global digital sports platform. We ignite the passions of global sports fans and maximize the presence and reach for our hundreds of sports partners globally by offering products and services across Fanatics Commerce, Fanatics Collectibles, and Fanatics Betting & Gaming, allowing sports fans to Buy, Collect, and Bet. Through the Fanatics platform, sports fans can buy licensed fan gear, jerseys, lifestyle and streetwear products, headwear, and hardgoods; collect physical and digital trading cards, sports memorabilia, and other digital assets; and bet as the company builds its Sportsbook and iGaming platform. Fanatics has an established database of over 100 million global sports fans; a global partner network with approximately 900 sports properties, including major national and international professional sports leagues, players associations, teams, colleges, college conferences and retail partners, 2,500 athletes and celebrities, and 200 exclusive athletes; and over 2,000 retail locations, including its Lids retail stores. Our more than 22,000 employees are committed to relentlessly enhancing the fan experience and delighting sports fans globally.
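As a minimal illustration of the data and workflow orchestration this posting mentions (the kind of dependency management tools like Apache Airflow provide), the sketch below orders invented pipeline tasks so each runs only after its upstream dependencies, using the standard library's graphlib.

```python
# Tiny DAG ordering: run each task only after all of its upstream
# dependencies have completed. Airflow expresses the same idea
# declaratively; the task graph below is invented for illustration.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract_orders": set(),
    "extract_fans": set(),
    "transform": {"extract_orders", "extract_fans"},
    "load_warehouse": {"transform"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # both extracts precede transform; transform precedes the load
```

A real orchestrator adds scheduling, retries, and monitoring on top, but the core contract is exactly this topological ordering of the task graph.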
Posted 1 month ago