4.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
On-site
Company Description
WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.
Job Description
4-6 years of experience as a Python Developer with a strong understanding of Python programming concepts and best practices
Bachelor's Degree/B.Tech/B.E in Computer Science or a related discipline
Design, develop, and maintain robust and scalable Python-based applications, tools, and frameworks that integrate machine learning models and algorithms
Demonstrated expertise in developing machine learning solutions, including feature selection, model training, and evaluation
Proficiency in data manipulation libraries (e.g., Pandas, NumPy) and machine learning frameworks (e.g., Scikit-learn, TensorFlow, PyTorch, Keras)
Experience with web frameworks such as Django or Flask
Contribute to the architecture and design of data-driven solutions, ensuring they meet both functional and non-functional requirements
Experience with databases such as MS SQL Server, PostgreSQL, or MySQL; solid knowledge of OLTP and OLAP concepts
Experience with CI/CD tooling (at least Git and Jenkins)
Experience with the Agile/Scrum/Kanban way of working
Self-motivated and hard-working
Knowledge of testing frameworks such as Mocha and Jest
Knowledge of RESTful APIs
Understanding of AWS and Azure cloud services
Experience with chatbot and NLU/NLP-based applications is required
Qualifications
Bachelor's Degree/B.Tech/B.E in Computer Science or a related discipline
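To give a sense of the model training and evaluation workflow this role calls for, here is a minimal, illustrative sketch using Pandas and Scikit-learn; the input file, column names, and choice of classifier are assumptions for illustration only, not details from the posting.

```python
# Minimal train/evaluate sketch; file name, columns and model choice are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

df = pd.read_csv("customers.csv")              # hypothetical input dataset
X = df.drop(columns=["churned"])               # feature matrix (assumed target column "churned")
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)                    # model training

print(classification_report(y_test, model.predict(X_test)))  # evaluation
```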
Posted 2 weeks ago
8.0 years
21 - 23 Lacs
Pune, Maharashtra, India
On-site
We’re Hiring: Power BI Architect
Experience Required: 8+ Years
Job Type: Full-Time
Responsibilities
∙ Understand business requirements in the BI context and design data models to transform raw data into meaningful insights
∙ Create dashboards and interactive visual reports using Power BI
∙ Identify key performance indicators (KPIs) with clear objectives and consistently monitor them
∙ Analyze data and present it through reports that support decision-making
∙ Convert business requirements into technical specifications and set timelines for delivery
∙ Create relationships between data and develop tabular/multidimensional models
∙ Document charts, algorithms, parameters, models, and relationships
∙ Demonstrate proficiency in Analysis Services – building Tabular & Multidimensional models (OLAP, Cubes) over DW/DM/DB
∙ Write complex DAX and MDX queries
∙ Design and guide BI-related IT architecture in existing or new landscapes, ensuring compliance with standards
∙ Provide blueprinting, gather requirements, and roll out BI solutions to end-users
∙ Design conceptual, logical, and physical data models
∙ Design, develop, test, and deploy Power BI scripts with advanced analytics
∙ Analyze current ETL processes and define improvements
∙ Contribute to data warehouse development using SSAS, SSIS, and SSRS
∙ Redefine and implement strategic improvements to current BI systems
∙ Create custom charts and calculations based on user requirements
∙ Design and deploy business intelligence solutions that meet evolving organizational needs
∙ Use SQL queries, filters, and graphs to enhance data comprehension
∙ Collaborate with users and teams at all levels to drive continuous improvement
Required Skills:
∙ Bachelor’s degree in Computer Science, Business Administration, or a related field
∙ Minimum of 6 years of experience in visual reporting and dashboard development
∙ At least 6 years of hands-on Power BI development experience
∙ Strong expertise in SQL Server
∙ Excellent proficiency in Microsoft Office, especially Excel
∙ Strong analytical, problem-solving, and organizational skills
∙ High attention to detail with the ability to manage multiple tasks and meet deadlines
If you’re ready to lead impactful BI solutions and work with a talented team, we’d love to hear from you!
Skills: data modeling, MDX, Microsoft Office, SQL Server, SQL, DAX, business intelligence solutions, SSIS, SSRS, Power BI, ETL processes, data analysis, SSAS, design, Excel
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
VAM Systems is a Business Consulting, IT Solutions and Services company. VAM Systems is currently looking for a Data Engineering Analyst for our Bahrain operations with the following skillsets and terms and conditions:
Qualifications:
· Bachelor’s Degree
· Engineer (B.E.)/MCA
· Certification in SQL/SAS
Experience: 5-8 years
Key Objectives
· Support the finance team on data and analytics activities and the Data Warehouse (DWH), based on a profound knowledge of banking, financial reporting, and data engineering.
Analytical/Technical Skills:
· Understanding of finance and risk reporting systems/workflows; previous experience participating in system implementation is desirable.
· Hands-on experience with MS Excel
· Prior project management/stakeholder management experience is desired
Responsibilities
· Coordinate and interact with the finance business partner to support daily finance data analysis, hierarchical mappings, and understanding (root cause analysis) of identified data issues.
· Exceptional comprehension of finance, risk, and data warehousing to guarantee accurate and reconciled reporting (e.g., balance-sheet exposure, profit and loss).
· Master the intersection of finance, data analysis, and data engineering.
· Conduct reviews of data quality and reconciliations for finance reports, and maintain reporting logic/programs.
· Support the finance team in ad-hoc requests and in organizing data for financial/regulatory reports, data mapping, and performing UAT.
· Ensure the consistency of the bank's data architecture, data flows, and business logic in accordance with Data Management guidelines, development standards, and data architecture, by working closely with Finance and Data Engineering teams to identify issues and develop sustainable data-driven solutions.
· Expertise in writing and documenting complex SQL queries, procedures, and functions, creating algorithms that automate important financial interactions and data controls.
· Experience in handling SAS ETL jobs, data transformation, validation, analysis, and performance tuning.
· SAS skillset, with strong experience in SAS Management Console, SAS DI, SAS Enterprise Guide, Base SAS, SAS Web Report Studio, SAS Delivery Portal, SAS OLAP Cube Studio, SAS Information Maps, SAS BI, SAS Stored Process, and SAS Datasets & Libraries
Terms and conditions
Joining time frame: 15 - 30 days
The selected candidates shall join VAM Systems – Bahrain and shall be deputed to one of the leading banks in Bahrain.
Should you be interested in this opportunity, please send your latest resume at the earliest to ashiq.salahudeen@vamsystems.com
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Modeller specializing in GCP and Cloud Databases, you will play a crucial role in designing and optimizing data models for both OLTP and OLAP systems. Your expertise in cloud-based databases, data architecture, and modeling will be essential in collaborating with engineering and analytics teams to ensure efficient operational systems and real-time reporting pipelines. You will be responsible for designing conceptual, logical, and physical data models tailored for OLTP and OLAP systems. Your focus will be on developing and refining models that support performance-optimized cloud data pipelines, implementing models in BigQuery, CloudSQL, and AlloyDB, as well as designing schemas with indexing, partitioning, and data sharding strategies. Translating business requirements into scalable data architecture and schemas will be a key aspect of your role, along with optimizing for near real-time ingestion, transformation, and query performance. You will utilize tools like DBSchema for collaborative modeling and documentation while creating and maintaining metadata and documentation around models. In terms of required skills, hands-on experience with GCP databases (BigQuery, CloudSQL, AlloyDB), a strong understanding of OLTP and OLAP systems, and proficiency in database performance tuning are essential. Additionally, familiarity with modeling tools such as DBSchema or ERWin, as well as proficiency in SQL, schema definition, and normalization/denormalization techniques, will be beneficial. Preferred skills include functional knowledge of the Mutual Fund or BFSI domain, experience integrating with cloud-native ETL and data orchestration pipelines, and familiarity with schema version control and CI/CD in a data context. In addition to technical skills, soft skills such as strong analytical and communication abilities, attention to detail, and a collaborative approach across engineering, product, and analytics teams are highly valued. Joining this role will provide you with the opportunity to work on enterprise-scale cloud data architectures, drive performance-oriented data modeling for advanced analytics, and collaborate with high-performing cloud-native data teams.
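As an illustration of the partitioning and clustering strategies this role mentions for BigQuery, the following minimal sketch uses the google-cloud-bigquery Python client to create a day-partitioned, clustered table; the project, dataset, table, and column names are assumptions for illustration only.

```python
# Illustrative sketch: a day-partitioned, clustered BigQuery table.
# Project, dataset, table and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # assumes default GCP credentials are configured

schema = [
    bigquery.SchemaField("txn_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("fund_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
    bigquery.SchemaField("txn_date", "DATE"),
]

table = bigquery.Table("my-project.analytics.transactions", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="txn_date",                      # partition pruning keeps scans and query cost down
)
table.clustering_fields = ["fund_id"]      # co-locate rows that are commonly filtered together

client.create_table(table)
```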
Posted 2 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Experience: 7 to 12 years
Minimum 8+ years of experience as a Power BI Developer
Hands-on experience in team and client handling
Expert knowledge of advanced calculations in MS Power BI Desktop and DAX functions (aggregate, date, logical, string, table, etc.)
Prior experience in connecting Power BI with on-premise and cloud computing platforms
A deep understanding of, and ability to use and explain all aspects of, relational database design, multidimensional database design, OLTP, OLAP, KPIs, scorecards, and dashboards
Very good understanding of dimensional modeling (DM) techniques for analytical data (i.e. facts, dimensions, measures)
Background in data warehouse design (e.g. dimensional modelling) and data mining
Hands-on experience in SSIS, SSRS, and SSAS is a plus
Posted 2 weeks ago
5.0 years
0 Lacs
Greater Bengaluru Area
On-site
Data Engineer: As an experienced Data Engineer, you know what it takes to deliver high-quality data solutions to an organization. You are skilled in sourcing, extracting, transforming, and loading data, and able to translate business designs into database models. You understand the value and benefit of solid data practices, enjoy exploring and learning new technologies in a collaborative environment, and implement best practices when writing code, maps, and data flows for automation. Essential Job Responsibilities: Be actively involved with the team to design and build new data solutions - OLAP and OLTP Develop a variety of data workflows, pipelines, and ETL processes using cloud platform products and internal data management tools Promote data quality and governance automation to ensure the accuracy and quality of the data through inspection, validation, processing, anomaly detection and auto-correction Translate design specifications into working modules with a hands-on approach in the iterative design and development process Write and maintain efficient and reliable code Write technical documentation Required Experience and Education: At least 5 years of experience in data management Advanced SQL scripting Knowledge of data transformation tools Knowledge of algorithms and data structures Knowledge of any object-oriented programming language Strong communication and collaboration skills Experience with agile methodologies A degree in Computer Science or a related field
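As a flavor of the pipeline and data-quality automation described above, here is a minimal, hedged extract-validate-load sketch using Pandas and SQLAlchemy; the file name, connection string, staging table, and validation rules are assumptions for illustration only.

```python
# Minimal extract -> validate -> load sketch; names, DSN and rules are illustrative assumptions.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:pass@localhost/warehouse")  # hypothetical DSN

# Extract
df = pd.read_csv("daily_orders.csv", parse_dates=["order_date"])

# Validate: basic data-quality checks before loading
assert df["order_id"].is_unique, "duplicate order_id values found"
assert df["amount"].ge(0).all(), "negative order amounts found"
df = df.dropna(subset=["customer_id"])          # simple auto-correction: drop orphan rows

# Load into a staging table
df.to_sql("stg_orders", engine, if_exists="append", index=False)
```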
Posted 2 weeks ago
50.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
About The Opportunity Job Type: Permanent Application Deadline: 30 July 2025 Job Description Title Analyst Programmer Department WPFH Location Gurgaon Level 2 Intro We’re proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our [insert name of team/ business area] team and feel like you’re part of something bigger. About Your Team The successful candidate would join the Data team. The candidate would be responsible for data integration and distribution work within the Distribution Data and Reporting team and for its consumers. The team is responsible for developing new, and supporting existing, middle-tier integration services and business services, and is committed to driving forward the development of leading-edge solutions. About Your Role This role would be responsible for liaising with the technical leads, business analysts, and various product teams to design, develop and troubleshoot the ETL jobs for various operational data stores. The role will involve understanding the technical design, development and implementation of ETL and EAI architecture using Informatica / ETL tools. The successful candidate will be able to demonstrate an innovative and enthusiastic approach to technology and problem solving, will display good interpersonal skills, and will show confidence and the ability to interact professionally with people at all levels and exhibit a high level of ownership within a demanding working environment. Key Responsibilities Work with Technical leads, Business Analysts and other subject matter experts. Understand the data model / design and develop the ETL jobs Sound technical knowledge of Informatica to take ownership of allocated development activities and work independently Working knowledge of Oracle database to take ownership of the underlying SQLs for the ETL jobs (under guidance of the technical leads) Provide development estimates Implement standards, procedures and best practices for data maintenance, reconciliation and exception management. Interact with cross-functional teams for coordinating dependencies and deliverables. Essential Skills Technical: Deep knowledge and experience of using the Informatica PowerCenter tool set (minimum 3 years). Experience in Snowflake Experience of source control tools Experience of using job scheduling tools such as Control-M Experience in UNIX scripting Strong SQL or PL/SQL experience (minimum of 2 years) Experience in Data Warehouse, Datamart and ODS concepts Knowledge of data normalisation/OLAP and Oracle performance optimisation techniques. 3+ years' experience of either Oracle or SQL Server and its utilities, coupled with experience of UNIX/Windows Functional: 3+ years' experience of working within financial organisations, and broad-based business process, application and technology architecture experience Experience with data distribution and access concepts, with the ability to utilise these concepts in realising a proper physical model from a conceptual one Business-facing, with the ability to work alongside data stewards in systems and the business Strong interpersonal, communication and client-facing skills Ability to work closely with cross-functional teams About You B.E./B.Tech/MBA/M.C.A or any other bachelor's degree.
At least 3 years of experience in Data Integration and Distribution Experience in building web services and APIs Knowledge of Agile software development life-cycle methodologies Feel rewarded For starters, we’ll offer you a comprehensive benefits package. We’ll value your wellbeing and support your development. And we’ll be as flexible as we can about where and when you work – finding a balance that works for all of us. It’s all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
Posted 2 weeks ago
50.0 years
5 - 8 Lacs
Gurgaon
On-site
About the Opportunity Job Type: Permanent Application Deadline: 29 July 2025 Job Description Title Analyst Programmer Department WPFH Location Gurgaon Level 2 Intro We’re proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our [insert name of team/ business area] team and feel like you’re part of something bigger. About your team The successful candidate would join the Data team. The candidate would be responsible for data integration and distribution work within the Distribution Data and Reporting team and for its consumers. The team is responsible for developing new, and supporting existing, middle-tier integration services and business services, and is committed to driving forward the development of leading-edge solutions. About your role This role would be responsible for liaising with the technical leads, business analysts, and various product teams to design, develop and troubleshoot the ETL jobs for various operational data stores. The role will involve understanding the technical design, development and implementation of ETL and EAI architecture using Informatica / ETL tools. The successful candidate will be able to demonstrate an innovative and enthusiastic approach to technology and problem solving, will display good interpersonal skills, and will show confidence and the ability to interact professionally with people at all levels and exhibit a high level of ownership within a demanding working environment. Key Responsibilities Work with Technical leads, Business Analysts and other subject matter experts. Understand the data model / design and develop the ETL jobs Sound technical knowledge of Informatica to take ownership of allocated development activities and work independently Working knowledge of Oracle database to take ownership of the underlying SQLs for the ETL jobs (under guidance of the technical leads) Provide development estimates Implement standards, procedures and best practices for data maintenance, reconciliation and exception management. Interact with cross-functional teams for coordinating dependencies and deliverables. Essential Skills Technical: Deep knowledge and experience of using the Informatica PowerCenter tool set (minimum 3 years). Experience in Snowflake Experience of source control tools Experience of using job scheduling tools such as Control-M Experience in UNIX scripting Strong SQL or PL/SQL experience (minimum of 2 years) Experience in Data Warehouse, Datamart and ODS concepts Knowledge of data normalisation/OLAP and Oracle performance optimisation techniques. 3+ years' experience of either Oracle or SQL Server and its utilities, coupled with experience of UNIX/Windows Functional: 3+ years' experience of working within financial organisations, and broad-based business process, application and technology architecture experience Experience with data distribution and access concepts, with the ability to utilise these concepts in realising a proper physical model from a conceptual one Business-facing, with the ability to work alongside data stewards in systems and the business Strong interpersonal, communication and client-facing skills Ability to work closely with cross-functional teams About you B.E./B.Tech/MBA/M.C.A or any other bachelor's degree.
At least 3 years of experience in Data Integration and Distribution Experience in building web services and APIs Knowledge of Agile software development life-cycle methodologies Feel rewarded For starters, we’ll offer you a comprehensive benefits package. We’ll value your wellbeing and support your development. And we’ll be as flexible as we can about where and when you work – finding a balance that works for all of us. It’s all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
Posted 2 weeks ago
4.0 years
0 Lacs
Gurgaon
On-site
A Snapshot of Your Days Your role as a Senior JEDOX Developer is to work daily with global business users who submit tickets via SharePoint or Mailbox. You will also coordinate and work with the appropriate IT development and middleware teams to find a solution that meets the agreed Operational Level Agreement and fix it within the agreed Service Level Agreement. Besides that, you will take part in the monthly closing process, where you will coordinate with end users regarding the data entered in the system and verify the same. You will also join the sprint development meetings to understand and keep up with the ongoing developments. Work closely with collaborators and senior management, expand your network and prepare yourself for future global roles at Siemens Energy. How You’ll Make an Impact Lead the design, development, and implementation of data pipelines and ETL workflows. Manage and optimize workflows to ensure reliable data processing and job scheduling. Design and implement data solutions in the database. Be creative and proactive with report design and development using little to no documented requirements Collaborate with cross-functional teams to gather requirements and translate them into scalable data architecture and process designs. Foster a culture of continuous improvement and innovation. Ensure data quality and integrity by implementing standard processes in data governance and validation. Monitor performance, troubleshoot issues, and optimize data systems for efficiency and scalability. Stay abreast of industry trends and emerging technologies to ensure continuous improvement of the data engineering practices. What You Bring You should be an experienced (6+ years) IT professional with a graduation in Engineering or another equivalent qualification (MCA). 4+ years of relevant work experience in developing and maintaining ETL workflows. 4+ years of relevant work experience in data analytics and reporting tools such as Power BI, Tableau, or SAC. 4+ years of relevant work experience in Snowflake or any cloud database, with proven knowledge of writing complex SQL queries. Good to have: experience working with EPM tools such as Jedox, Anaplan, or TM1 Good to have: experience with multidimensional database concepts such as OLAP, cubes, and dimensions Good to have: experience developing Power Automate workflows Good to have: experience with Excel features such as PivotTables and formulas like VLOOKUP Ability to learn new software and technologies quickly and adapt to an ambitious and fast-paced environment. Experience collaborating directly with business users and relevant collaborators. About the Team At Value Center Manufacturing, you will be part of a forward-thinking team that is dedicated to driving digital transformation in manufacturing. Our work is integral to the success of the DC Masterplan and the achievement of the Siemens Energy objectives and key results in manufacturing. You will have the opportunity to contribute to innovative projects that have a significant impact on our business and the industry. The Digital Core enables our Business Areas to achieve their targets by providing best-in-class services and solutions in IT, Strategy & Technology, and more. Who is Siemens Energy? At Siemens Energy, we are more than just an energy technology company. With over 100,000 dedicated employees in more than 90 countries, we develop the energy systems of the future, ensuring that the growing energy demand of the global community is met reliably and sustainably.
The technologies created in our research departments and factories drive the energy transition and provide the base for one sixth of the world’s electricity generation. Our distributed team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation. Find out how you can make a difference at Siemens Energy: http://www.siemens-energy.com/employeevideo Our Commitment to Diversity Lucky for us, we are not all the same. Through diversity, we generate power. We run on Inclusion, and our combined creative energy is fueled by over 130 nationalities. Siemens Energy celebrates character—no matter what ethnic background, gender, age, religion, identity, or disability. We energize society, all of society, and we do not discriminate based on our differences. Rewards/Benefits All employees are automatically covered under the Medical Insurance. The company pays for a considerable family floater cover, covering the employee, spouse, and 2 dependent children up to 25 years of age. Siemens Energy provides all employees the option to opt for a Meal Card, as per the terms and conditions prescribed in the company policy, as part of CTC and as a tax-saving measure. Flexi Pay empowers employees with the choice to customize the amount in some of the salary components within a defined range, thereby optimizing the tax benefits. Accordingly, each employee is empowered to decide on the best possible net income out of the same fixed individual base pay on a monthly basis. https://jobs.siemens-energy.com/jobs
Posted 2 weeks ago
5.0 years
0 Lacs
India
Remote
Job Summary Media Group is looking for a highly skilled Data Engineer to join our team on a 6-month remote contract basis. We are seeking an expert in Python data engineering who can build and maintain our data infrastructure and also develop user-friendly web interfaces for our existing Python codebase. The ideal candidate will be a proactive problem-solver, adept at bridging the gap between raw data and actionable insights through both custom applications and powerful Power BI dashboards. Main responsibilities: Design, build, and optimize scalable ETL pipelines using Python and SQL Server Integration Services (SSIS) Develop and integrate functional and intuitive web-based user interfaces for our existing Python applications using frameworks like Django, FastAPI or similar Create, deploy, and manage insightful and interactive dashboards and reports using Power BI to meet diverse business needs Ensure performance, reliability, and data integrity across systems and dashboards. Document data workflows and monitor ETL pipelines Key Requirements: Proven experience as a Data Engineer with a strong portfolio of successful data projects Expertise in Python for data engineering, including deep knowledge of libraries such as Pandas, NumPy, and SQLAlchemy Demonstrable experience developing web user interfaces with Python frameworks like Django or FastAPI Strong proficiency in Power BI dashboard development, including data modeling, DAX queries, and visualization best practices Solid experience with Microsoft SQL Server and SQL Server Integration Services Excellent communicator who can explain technical jargon in layman's terms Self-starter with a flexible, can-do attitude, capable of multitasking, structured, and works well within a team environment. Ability to work with minimal guidance or supervision in a time-critical environment. Preferred Skills – Not Mandatory: Exposure to Azure Synapse Analytics, Azure Data Factory, or other Azure data services Experience with cloud-based data architectures, data lakes, or big data platforms A strong background in data warehousing concepts and best practices (e.g., dimensional modeling, ETL/ELT patterns) is highly welcome. Understanding of OLAP cube development using SQL Server Analysis Services (SSAS) Contract Details: Type: Contract (6 months, with potential for extension) Remote: 100% remote Working Hours: Must be willing to work the standard work shift: 8 AM – 5 PM, GMT+3 (Qatar time) Qualifications Bachelor’s Degree in IT, Computer Science or a related field is desired but not mandatory Minimum 5 years’ experience in data warehousing, reporting and analytics related fields Professional certificates in analytics, data engineering or data warehousing will be a plus Other Requirements Candidate must adhere to Media Ethics and the Code of Conduct Work timings may change as per business requirements and the candidate is expected to adhere to the same. Candidate will be bound by Information Security policies Work Location: Online Project Type: One Time Expected Start Date: 17 Aug, 2025 Expected End Date: 28 Feb, 2026
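As a sketch of the kind of Python web interface over existing SQL Server data this role describes, here is a minimal FastAPI endpoint using SQLAlchemy; the connection string, table, columns, and endpoint shape are assumptions for illustration only, and the snippet presumes pyodbc and the SQL Server ODBC driver are installed.

```python
# Minimal reporting endpoint over SQL Server; DSN, table and columns are hypothetical.
from fastapi import FastAPI
from sqlalchemy import create_engine, text

app = FastAPI(title="Reporting API (sketch)")
engine = create_engine(
    "mssql+pyodbc://user:pass@server/reporting?driver=ODBC+Driver+17+for+SQL+Server"
)

@app.get("/sales/daily")
def daily_sales(days: int = 30):
    """Return aggregated daily sales for the last `days` days."""
    sql = text(
        "SELECT CAST(order_date AS DATE) AS order_day, SUM(amount) AS total "
        "FROM dbo.orders "
        "WHERE order_date >= DATEADD(day, :offset, GETDATE()) "
        "GROUP BY CAST(order_date AS DATE) ORDER BY order_day"
    )
    with engine.connect() as conn:
        rows = conn.execute(sql, {"offset": -days}).mappings().all()
    return [dict(r) for r in rows]
```

A service like this would typically be run with something like `uvicorn main:app --reload` (assuming the file is saved as main.py).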
Posted 2 weeks ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
A Snapshot of Your Days Your role as a Senior JEDOX Developer is to work daily with global business users who submit tickets via SharePoint or Mailbox. You will also coordinate and work with the appropriate IT development and middleware teams to find a solution that meets the agreed Operational Level Agreement and fix it within the agreed Service Level Agreement. Besides that, you will take part in the monthly closing process, where you will coordinate with end users regarding the data entered in the system and verify the same. You will also join the sprint development meetings to understand and keep up with the ongoing developments. Work closely with collaborators and senior management, expand your network and prepare yourself for future global roles at Siemens Energy. How You’ll Make An Impact Lead the design, development, and implementation of data pipelines and ETL workflows. Manage and optimize workflows to ensure reliable data processing and job scheduling. Design and implement data solutions in the database. Be creative and proactive with report design and development using little to no documented requirements Collaborate with cross-functional teams to gather requirements and translate them into scalable data architecture and process designs. Foster a culture of continuous improvement and innovation. Ensure data quality and integrity by implementing standard processes in data governance and validation. Monitor performance, troubleshoot issues, and optimize data systems for efficiency and scalability. Stay abreast of industry trends and emerging technologies to ensure continuous improvement of the data engineering practices. What You Bring You should be an experienced (6+ years) IT professional with a graduation in Engineering or another equivalent qualification (MCA). 4+ years of relevant work experience in developing and maintaining ETL workflows. 4+ years of relevant work experience in data analytics and reporting tools such as Power BI, Tableau, or SAC. 4+ years of relevant work experience in Snowflake or any cloud database, with proven knowledge of writing complex SQL queries. Good to have: experience working with EPM tools such as Jedox, Anaplan, or TM1 Good to have: experience with multidimensional database concepts such as OLAP, cubes, and dimensions Good to have: experience developing Power Automate workflows Good to have: experience with Excel features such as PivotTables and formulas like VLOOKUP Ability to learn new software and technologies quickly and adapt to an ambitious and fast-paced environment. Experience collaborating directly with business users and relevant collaborators. About The Team At Value Center Manufacturing, you will be part of a forward-thinking team that is dedicated to driving digital transformation in manufacturing. Our work is integral to the success of the DC Masterplan and the achievement of the Siemens Energy objectives and key results in manufacturing. You will have the opportunity to contribute to innovative projects that have a significant impact on our business and the industry. The Digital Core enables our Business Areas to achieve their targets by providing best-in-class services and solutions in IT, Strategy & Technology, and more. Who is Siemens Energy? At Siemens Energy, we are more than just an energy technology company. With over 100,000 dedicated employees in more than 90 countries, we develop the energy systems of the future, ensuring that the growing energy demand of the global community is met reliably and sustainably.
The technologies created in our research departments and factories drive the energy transition and provide the base for one sixth of the world’s electricity generation. Our distributed team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation. Find out how you can make a difference at Siemens Energy: http://www.siemens-energy.com/employeevideo Our Commitment to Diversity Lucky for us, we are not all the same. Through diversity, we generate power. We run on Inclusion, and our combined creative energy is fueled by over 130 nationalities. Siemens Energy celebrates character—no matter what ethnic background, gender, age, religion, identity, or disability. We energize society, all of society, and we do not discriminate based on our differences. Rewards/Benefits All employees are automatically covered under the Medical Insurance. The company pays for a considerable family floater cover, covering the employee, spouse, and 2 dependent children up to 25 years of age. Siemens Energy provides all employees the option to opt for a Meal Card, as per the terms and conditions prescribed in the company policy, as part of CTC and as a tax-saving measure. Flexi Pay empowers employees with the choice to customize the amount in some of the salary components within a defined range, thereby optimizing the tax benefits. Accordingly, each employee is empowered to decide on the best possible net income out of the same fixed individual base pay on a monthly basis. https://jobs.siemens-energy.com/jobs
Posted 2 weeks ago
6.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Data Schema Designer – GCP Platforms Location: Chennai (Work From Office) Experience Required: 6 to 9 Years Role Overview We are hiring a Data Schema Designer who will focus on designing clean, extensible, and high-performance schemas for GCP data platforms. This role is crucial in standardizing data design, enabling scalability, and ensuring cross-system consistency. Key Responsibilities Create and maintain unified data schema standards across BigQuery, CloudSQL, and AlloyDB Collaborate with engineering and analytics teams to identify modeling best practices Ensure schema alignment with ingestion pipelines, transformations, and business rules Develop entity relationship diagrams and schema documentation templates Assist in automation of schema deployments and version control Must-Have Skills Expert knowledge in schema design principles for GCP platforms Proficiency with schema documentation tools (e.g., DBSchema, dbt docs) Deep understanding of data normalization, denormalization, and indexing strategies Hands-on experience with OLTP and OLAP schemas Preferred Skills Exposure to CI/CD workflows and Git-based schema management Experience in metadata governance and data cataloging Soft Skills Precision and clarity in technical documentation Collaboration mindset with attention to performance and quality Why Join Be the backbone of reliable and scalable data systems Influence architectural decisions through thoughtful schema design Work with modern cloud data stacks and enterprise data teams Skills: gcp,denormalization,metadata governance,data,olap schemas,git-based schema management,ci/cd workflows,data cataloging,schema documentation tools (e.g., dbschema, dbt docs),indexing strategies,oltp schemas,collaboration,analytics,technical documentation,schema design principles for gcp platforms,data normalization,schema
Posted 2 weeks ago
6.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Data Modeler with Expertise in DBSchema & GCP Location: Chennai (Work From Office) Experience Required: 6 to 9 Years Role Overview We are hiring a Data Modeler with proven hands-on experience using DBSchema in GCP environments. This role will focus on designing highly maintainable and performance-tuned data models for OLTP and OLAP systems using modern modeling tools and practices. Key Responsibilities Develop conceptual, logical, and physical models with DBSchema for cloud environments Align schema design with application requirements and analytics consumption Ensure proper indexing, normalization/denormalization, and partitioning for performance Support schema documentation, reverse engineering, and visualization in DBSchema Review and optimize models in BigQuery, CloudSQL, and AlloyDB Must-Have Skills Expertise in DBSchema modeling tool and collaborative schema documentation Strong experience with GCP databases: BigQuery, CloudSQL, AlloyDB Knowledge of OLTP and OLAP system structures and performance tuning Proficiency in SQL and schema evolution/versioning best practices Preferred Skills Experience integrating DBSchema with CI/CD pipelines Knowledge of real-time ingestion pipelines and federated schema design Soft Skills Detail-oriented, organized, and communicative Comfortable presenting schema design to cross-functional teams Why Join Leverage industry-leading tools in modern GCP environments Improve modeling workflows and documentation quality Contribute to enterprise data architecture with visibility and impact Skills: gcp,dbschema,olap,modeling,data,cloudsql,pipelines,alloydb,sql,oltp,bigquery,schema
Posted 2 weeks ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Data Model Developer – Cloud & On-Premise Location: Chennai (Work From Office) Experience Required: 6 to 9 Years Role Overview We are looking for a versatile Data Model Developer who is proficient in designing robust data models across cloud and on-premise environments. You will collaborate with cross-functional teams to develop schemas that serve both operational systems and analytical use cases. Key Responsibilities Design and implement scalable data models for both cloud (GCP) and traditional RDBMS Support hybrid data architectures that integrate real-time and batch workflows Collaborate with engineering teams to ensure seamless schema implementation Document conceptual, logical, and physical models Assist in ETL and data pipeline alignment with schema definitions Monitor and refine performance through partitioning and indexing strategies Must-Have Skills Experience with GCP data services: BigQuery, CloudSQL, AlloyDB Proficiency in relational databases such as PostgreSQL, MySQL, or Oracle Solid grounding in OLTP/OLAP modeling principles Familiarity with schema design tools like DBSchema, ER/Studio SQL expertise for query performance optimization Preferred Skills Experience working in hybrid cloud/on-prem data architectures Functional knowledge in BFSI or asset management domains Knowledge of metadata management and schema versioning Soft Skills Adaptability to cloud and legacy tech stacks Ability to communicate clearly with engineers and analysts Strong documentation and collaboration skills Why Join Contribute to dual-mode data architecture (cloud + on-prem) Solve real-world data design challenges in regulated industries Opportunity to influence platform migration and modernization Skills: schema,oracle,postgresql,er/studio,data models,sql,data,alloydb,olap modeling,bigquery,oltp modeling,mysql,dbschema,gcp data services,gcp,cloudsql
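To make the OLTP side of this role concrete, here is a minimal, hedged sketch of a portable table definition in SQLAlchemy that would work against CloudSQL for PostgreSQL, on-premise PostgreSQL, or MySQL; the table, columns, index, and connection string are illustrative assumptions, not details from the posting.

```python
# Sketch of a portable OLTP table with a supporting index; all names are illustrative.
from sqlalchemy import Column, Date, Index, Integer, Numeric, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Trade(Base):
    __tablename__ = "trades"

    trade_id = Column(Integer, primary_key=True)
    account_id = Column(String(32), nullable=False)
    instrument = Column(String(32), nullable=False)
    quantity = Column(Numeric(18, 4), nullable=False)
    trade_date = Column(Date, nullable=False)

    # Composite index supporting the most common operational lookup (account + date range)
    __table_args__ = (Index("ix_trades_account_date", "account_id", "trade_date"),)

if __name__ == "__main__":
    # Hypothetical DSN; swap for a CloudSQL, PostgreSQL or MySQL connection string.
    engine = create_engine("postgresql+psycopg2://user:pass@localhost/oltp")
    Base.metadata.create_all(engine)
```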
Posted 2 weeks ago
6.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Functional Data Modeler – Mutual Fund Industry Preferred Location: Chennai (Work From Office) Experience Required: 6 to 9 Years Role Overview We are seeking a Functional Data Modeler with a strong background in mutual fund or BFSI domains. This role combines functional domain understanding with robust data modeling expertise to create schemas that align with operational and regulatory needs. Key Responsibilities Design data models that reflect fund structures, NAV calculations, asset allocation, and compliance workflows Collaborate with business analysts and product teams to translate functional requirements into data structures Ensure that models are compliant with data privacy, regulatory reporting, and audit requirements Build OLTP and OLAP data models to serve real-time and aggregated reporting Document metadata, lineage, and data dictionaries for business consumption Must-Have Skills Domain expertise in Mutual Fund / BFSI operations Solid data modeling experience for financial and regulatory systems Proficiency in schema design on GCP platforms (BigQuery, CloudSQL) Strong SQL and hands-on with modeling tools like DBSchema or ER/Studio Preferred Skills Experience working with fund management platforms or reconciliation engines Familiarity with financial compliance standards (e.g., SEBI, FATCA) Soft Skills Strong business acumen and documentation capabilities Ability to liaise between functional and technical teams effectively Why Join Own critical financial data architecture Influence domain-driven modeling for financial ecosystems Join a fast-paced data transformation journey in the BFSI sector Skills: strong sql,schema design on gcp platforms (bigquery, cloudsql),data modeling experience for financial and regulatory systems,modeling,modeling tools (dbschema, er/studio),domain expertise in mutual fund / bfsi operations,data,bfsi,data modeling
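Since the role centres on fund structures and NAV calculations, here is a small, hedged Pandas sketch of the standard NAV-per-unit formula, NAV = (market value of assets - liabilities) / units outstanding; the column names and figures are purely illustrative assumptions.

```python
# Illustrative NAV-per-unit calculation over hypothetical holdings and fund records.
import pandas as pd

holdings = pd.DataFrame({
    "fund_id":      ["F1", "F1", "F2"],
    "market_value": [1_200_000.0, 800_000.0, 500_000.0],
})
funds = pd.DataFrame({
    "fund_id":           ["F1", "F2"],
    "liabilities":       [50_000.0, 10_000.0],
    "units_outstanding": [150_000.0, 40_000.0],
})

assets = holdings.groupby("fund_id", as_index=False)["market_value"].sum()
nav = assets.merge(funds, on="fund_id")
nav["nav_per_unit"] = (nav["market_value"] - nav["liabilities"]) / nav["units_outstanding"]
print(nav[["fund_id", "nav_per_unit"]])   # F1 -> 13.0, F2 -> 12.25
```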
Posted 2 weeks ago
6.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Cloud Data Modeler – GCP, BigQuery, CloudSQL Location: Chennai (Work From Office) Experience Required: 6 to 9 Years Role Overview Join us as a Cloud Data Modeler focusing on GCP environments, supporting enterprise-scale analytical and transactional systems. Your role will span the design of schema architecture, creation of performance-efficient data models, and guiding teams on the best practices of cloud-based data integration and usage. Key Responsibilities Architect and implement scalable data models for cloud data warehouses and databases Model OLTP/OLAP systems optimized for reporting, analytics, and application access Support cloud data lake and warehouse architecture, ensuring schema alignment Review and optimize existing schemas for cost and performance on GCP Define documentation standards and ensure model version tracking Collaborate with DevOps and DataOps teams for deployment consistency Must-Have Skills Deep knowledge of GCP data platforms – BigQuery, CloudSQL, AlloyDB Expertise in data modeling, normalization, dimensional modeling Understanding of distributed query engines, table partitioning, and clustering Familiarity with DBSchema or similar tools Preferred Skills Prior experience in BFSI or asset management industries Working experience with Data Catalogs, lineage, and governance tools Soft Skills Collaborative and consultative mindset Strong communication and requirements gathering skills Organized and methodical approach to data architecture challenges Why Join Contribute to modern data architecture in a cloud-first enterprise Influence critical decisions around GCP-based data infrastructure Be part of a future-ready data strategy implementation team Skills: dimensional modeling,table partitioning,data,alloydb,normalization,bigquery,distributed query engines,data models,data architecture,data modeling,clustering,dbschema,gcp,cloudsql
Posted 2 weeks ago
6.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Data Modeller – GCP & Cloud Databases Location: Chennai (Work From Office) Experience Required: 6 to 9 Years Role Overview We are looking for a hands-on Data Modeller with strong expertise in cloud-based databases, data architecture, and modeling for OLTP and OLAP systems. You will work closely with engineering and analytics teams to design and optimize conceptual, logical, and physical data models, supporting both operational systems and near real-time reporting pipelines. Key Responsibilities Design conceptual, logical, and physical data models for OLTP and OLAP systems Develop and refine models that support performance-optimized cloud data pipelines Collaborate with data engineers to implement models in BigQuery, CloudSQL, and AlloyDB Design schemas and apply indexing, partitioning, and data sharding strategies Translate business requirements into scalable data architecture and schemas Optimize for near real-time ingestion, transformation, and query performance Use tools such as DBSchema or similar for collaborative modeling and documentation Create and maintain metadata and documentation around models Must-Have Skills Hands-on experience with GCP databases: BigQuery, CloudSQL, AlloyDB Strong understanding of OLTP vs OLAP systems and respective design principles Experience in database performance tuning: indexing, sharding, and partitioning Skilled in modeling tools such as DBSchema, ERWin, or similar Understanding of variables that impact performance in real-time/near real-time systems Proficient in SQL, schema definition, and normalization/denormalization techniques Preferred Skills Functional knowledge of the Mutual Fund or BFSI domain Experience integrating with cloud-native ETL and data orchestration pipelines Familiarity with schema version control and CI/CD in a data context Soft Skills Strong analytical and communication skills Detail-oriented and documentation-focused Ability to collaborate across engineering, product, and analytics teams Why Join Work on enterprise-scale cloud data architectures Drive performance-first data modeling for advanced analytics Collaborate with high-performing cloud-native data teams Skills: olap,normalization,indexing,gcp databases,sharding,olap systems,modeling,schema definition,sql,data,oltp systems,alloydb,erwin,modeling tools,bigquery,database performance tuning,databases,partitioning,denormalization,dbschema,cloudsql
Posted 2 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience. Data Modeler Job Description: Looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise include Analyze and translate business needs into long term solution data models. Evaluate existing data systems and recommend improvements. Define rules to translate and transform data across data models. Work with the development team to create conceptual data models and data flows. Develop best practices for data coding to ensure consistency within the system. Review modifications of existing systems for cross compatibility. Implement data strategies and develop physical data models. Update and optimize local and metadata models. Utilize canonical data modeling techniques to enhance data system efficiency. Evaluate implemented data systems for variances, discrepancies, and efficiency. Troubleshoot and optimize data systems to ensure optimal performance. Strong expertise in relational and dimensional modeling (OLTP, OLAP). Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner). Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL). Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures. Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI). Familiarity with ETL processes, data integration, and data governance frameworks. Strong analytical, problem-solving, and communication skills. Qualifications: Bachelor's degree in Engineering or a related field. 3 to 5 years of experience in data modeling or a related field. 4+ years of hands-on experience with dimensional and relational data modeling. Expert knowledge of metadata management and related tools. Proficiency with data modeling tools such as Erwin, Power Designer, or Lucid. Knowledge of transactional databases and data warehouses. Preferred Skills: Experience in cloud-based data solutions (AWS, Azure, GCP). Knowledge of big data technologies (Hadoop, Spark, Kafka). Understanding of graph databases and real-time data processing. Certifications in data management, modeling, or cloud data engineering. Excellent communication and presentation skills. Strong interpersonal skills to collaborate effectively with various teams.
Posted 2 weeks ago
5.0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience. Data Modeler Job Description Looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise include Analyze and translate business needs into long term solution data models. Evaluate existing data systems and recommend improvements. Define rules to translate and transform data across data models. Work with the development team to create conceptual data models and data flows. Develop best practices for data coding to ensure consistency within the system. Review modifications of existing systems for cross compatibility. Implement data strategies and develop physical data models. Update and optimize local and metadata models. Utilize canonical data modeling techniques to enhance data system efficiency. Evaluate implemented data systems for variances, discrepancies, and efficiency. Troubleshoot and optimize data systems to ensure optimal performance. Strong expertise in relational and dimensional modeling (OLTP, OLAP). Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner). Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL). Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures. Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI). Familiarity with ETL processes, data integration, and data governance frameworks. Strong analytical, problem-solving, and communication skills. Qualifications Bachelor's degree in Engineering or a related field. 5 to 9 years of experience in data modeling or a related field. 4+ years of hands-on experience with dimensional and relational data modeling. Expert knowledge of metadata management and related tools. Proficiency with data modeling tools such as Erwin, Power Designer, or Lucid. Knowledge of transactional databases and data warehouses. Preferred Skills Experience in cloud-based data solutions (AWS, Azure, GCP). Knowledge of big data technologies (Hadoop, Spark, Kafka). Understanding of graph databases and real-time data processing. Certifications in data management, modeling, or cloud data engineering. Excellent communication and presentation skills. Strong interpersonal skills to collaborate effectively with various teams.
Posted 2 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Summary We are looking for a skilled and passionate Full Stack Data Developer with hands-on experience in Apache Pinot , Cube.js , and TypeScript to join our data engineering team. The ideal candidate will help build scalable, real-time analytics platforms and APIs, enabling business users and applications to interact with high-performance datasets through rich dashboards and interfaces. Key Responsibilities Design and develop real-time OLAP data pipelines using Apache Pinot. Integrate and build analytics APIs using Cube.js, connected to various data sources. Build frontend components and dashboards using TypeScript and modern frameworks (React, Next.js, etc.). Optimize query performance and ensure the scalability of the analytics stack. Collaborate with data engineers, backend developers, and product stakeholders to deliver high-impact features. Maintain documentation and contribute to best practices around data modeling, API development, and observability. Required Qualifications Strong experience with Apache Pinot and real-time analytics use cases. Proficient in Cube.js for data modeling, API layer development, and pre-aggregations. Solid experience in TypeScript and frontend frameworks like React or Vue.js. Familiarity with data pipelines, ingestion tools (Kafka, stream processors), and columnar storage concepts. Experience building and deploying microservices, APIs, and visualization components. Good understanding of RESTful APIs, SQL, and performance tuning for analytics workloads.
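For context, a quick way to sanity-check a real-time OLAP table like the ones described above is to POST SQL to a Pinot broker's /query/sql endpoint; the minimal Python sketch below does this with requests. The broker address, table, and column names are assumptions for illustration only.

```python
# Illustrative query against a Pinot broker's SQL endpoint; host, table and columns are hypothetical.
import requests

BROKER = "http://localhost:8099"   # assumed broker address (8099 is a common default port)
sql = (
    "SELECT country, COUNT(*) AS events, SUM(amount) AS revenue "
    "FROM orders GROUP BY country ORDER BY revenue DESC LIMIT 10"
)

resp = requests.post(f"{BROKER}/query/sql", json={"sql": sql}, timeout=10)
resp.raise_for_status()
result = resp.json()["resultTable"]

columns = result["dataSchema"]["columnNames"]
for row in result["rows"]:
    print(dict(zip(columns, row)))
```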
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary We are looking for a skilled and passionate Full Stack Data Developer with hands-on experience in Apache Pinot , Cube.js , and TypeScript to join our data engineering team. The ideal candidate will help build scalable, real-time analytics platforms and APIs, enabling business users and applications to interact with high-performance datasets through rich dashboards and interfaces. Key Responsibilities Design and develop real-time OLAP data pipelines using Apache Pinot. Integrate and build analytics APIs using Cube.js, connected to various data sources. Build frontend components and dashboards using TypeScript and modern frameworks (React, Next.js, etc.). Optimize query performance and ensure the scalability of the analytics stack. Collaborate with data engineers, backend developers, and product stakeholders to deliver high-impact features. Maintain documentation and contribute to best practices around data modeling, API development, and observability. Required Qualifications Strong experience with Apache Pinot and real-time analytics use cases. Proficient in Cube.js for data modeling, API layer development, and pre-aggregations. Solid experience in TypeScript and frontend frameworks like React or Vue.js. Familiarity with data pipelines, ingestion tools (Kafka, stream processors), and columnar storage concepts. Experience building and deploying microservices, APIs, and visualization components. Good understanding of RESTful APIs, SQL, and performance tuning for analytics workloads.
Posted 2 weeks ago
10.0 - 15.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Staff Software Engineer (Java) at Walmart Global Tech in Chennai, you will play a crucial role in guiding the team in making architectural decisions and implementing best practices for building scalable applications. Your responsibilities will include driving design, development, and documentation, as well as building, testing, and deploying cutting-edge solutions that impact associates of Walmart worldwide. You will collaborate with Walmart engineering teams globally, engage with Product Management and Business to drive the agenda, and work closely with Architects and cross-functional teams to deliver solutions meeting Quality, Cost, and Delivery standards. To excel in this role, you should have a Bachelor's/Master's degree in Computer Science or a related field with a minimum of 10 years of experience in software design, development, and automated deployments. Your expertise should include delivering highly scalable Java applications, strong system design skills, knowledge of CS Fundamentals, Microservices, Data Structures, Algorithms, and proficiency in writing modular and testable code. Experience with Java, Spring Boot, Kafka, and Spark, as well as working in cloud-based solutions, is essential. You should also have a good understanding of microservices architecture, distributed concepts, design principles, and cloud native development. Additionally, your skills should encompass working with relational and NoSQL databases, caching technologies, event-based systems like Kafka, and monitoring tools like Prometheus and Splunk. Experience with containerization tools such as Docker, Helm, and Kubernetes, as well as knowledge of public cloud platforms like Azure and GCP, will be advantageous in this role. At Walmart Global Tech, you will work in an innovative environment where your contributions can impact millions of people. You will have the opportunity to grow your career, gain new skills, and collaborate with experts in the field. The company offers a flexible, hybrid work model, competitive compensation, and a range of benefits including maternity and parental leave, health benefits, and more. Walmart is committed to creating a culture of belonging where every associate feels valued and respected, fostering inclusivity and diversity across its global team. Join Walmart Global Tech to be part of a team that is shaping the future of retail, innovating at scale, and making a positive impact on the world.
Posted 2 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Purpose
We are looking for a Senior SQL Developer to join our growing team of BI & analytics experts. The hire will be responsible for expanding and optimizing our data and data queries, as well as optimizing data flow and collection for consumption by our BI & Analytics platform. The ideal candidate is an experienced query builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The SQL Developer will support our software developers, database architects, data analysts, and data scientists on data and product initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. The hire must be self-directed and comfortable supporting the data needs of multiple systems and products. The right candidate will be excited by the prospect of optimizing our company’s data architecture to support our next generation of products and data initiatives.

Job Responsibilities (Essential Functions)
Create and maintain optimal SQL queries, views, tables, and stored procedures (a brief invocation sketch follows this listing).
Work together with various business units (BI, Product, Reporting) to develop data warehouse platform vision, strategy, and roadmap.
Understand the development of physical and logical data models.
Ensure high-performance access to diverse data sources.
Encourage the adoption of the organization’s frameworks by providing documentation, sample code, and developer support.
Communicate progress on the adoption and effectiveness of the developed frameworks to department heads and managers.

Required Education And Experience
Bachelor’s or Master’s degree, or an equivalent combination of education and experience in a relevant field.
Understanding of T-SQL, data warehouses, star schema, data modeling, OLAP, SQL, and ETL.
Experience creating tables, views, and stored procedures.
Understanding of several BI and reporting platforms, with awareness of industry trends and direction in BI/reporting and their applicability to the organization’s product strategies.
Skilled in multiple database platforms, including SQL Server and MySQL.
Knowledge of source control and project management tools such as Azure DevOps, Git, and JIRA.
Familiarity with using SonarQube to enforce clean T-SQL coding practices.
Familiarity with DevOps best practices and automation of documentation, testing, build, deployment, configuration, and monitoring.
Communication skills: Applicants must have exceptional written and spoken communication skills, with the active listening abilities needed to contribute to strategic decisions and advise senior management on specialized technical issues that impact the business.
Strong team-building skills: The ability to provide direction for complex projects, mentor junior team members, and communicate the organization’s preferred technologies and frameworks across development teams.
Experience: At least 5 years working as a SQL Developer in a data warehousing position within a fast-paced and complex business environment. The candidate must also have experience developing schema data models in a data warehouse environment, experience with full implementation of the system development lifecycle (SDLC), and proven, successful experience working with concepts of data integration, consolidation, enrichment, and aggregation. A suitable candidate will also have a strong, demonstrated understanding of dimensional modeling and similar data warehousing techniques, as well as experience working with relational or multi-dimensional databases and business intelligence architectures.
Analytical skills: A candidate for the position will have passion and skill in research and analytics, as well as a passion for data management tools and technologies. The candidate must be able to perform detailed data analysis, for example, determining the content, structure, and quality of data through the examination of data samples and source systems. The hire will additionally have the ability to troubleshoot data warehousing issues and quickly resolve them.

Expected Competencies
Detail oriented with strong organizational skills
Attention to programming style and neatness
Strong English communication skills, both written and verbal
Ability to train and mentor junior colleagues with patience and tangible results

Work Timings
This is a full-time position. Days and hours of work are Monday through Friday, with flexibility to support different time zones between 12 PM IST and 9 PM IST. The work schedule may include evening hours or weekends due to client needs, per manager instructions. This role works in hybrid mode and requires at least 2 days per week in the Hyderabad office. Occasional evening and weekend work may be expected in case of job-related emergencies or client needs.

EEO Statement
Cendyn provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Cendyn complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. Cendyn expressly prohibits any form of workplace harassment based on race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, genetic information, disability, or veteran status. Improper interference with the ability of Cendyn’s employees to perform their job duties may result in discipline up to and including discharge.

Other Duties
Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of the employee for this job. Duties, responsibilities and activities may change at any time with or without notice.
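Much of the day-to-day work described above revolves around T-SQL objects (views, stored procedures) consumed by downstream applications. As a minimal, hypothetical sketch, calling such a procedure from Python via pyodbc might look like the following; the connection string, dbo.usp_LoadDailySales, and dbo.vw_SalesByRegion are placeholders, not objects from this posting.

```python
# Minimal sketch: invoking a parameterized T-SQL stored procedure via pyodbc.
# Server, database, procedure, and view names are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sql-dev.example.internal;DATABASE=AnalyticsDW;"
    "Trusted_Connection=yes;"
)
try:
    cursor = conn.cursor()
    # A warehouse load procedure might take a business date and populate a fact table.
    cursor.execute("EXEC dbo.usp_LoadDailySales @LoadDate = ?", "2024-01-31")
    conn.commit()

    # Spot-check the result through a reporting view built over the star schema.
    cursor.execute("SELECT TOP 5 * FROM dbo.vw_SalesByRegion WHERE SalesDate = ?", "2024-01-31")
    for row in cursor.fetchall():
        print(row)
finally:
    conn.close()
```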
Posted 2 weeks ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Where Data Does More. Join the Snowflake team.

We are looking for a Solutions Architect to be part of our Professional Services team to deploy cloud products and services for our customers' Global Competency Centers located in India. This person must be a hands-on self-starter who loves solving innovative problems in a fast-paced, agile environment. The ideal candidate will have the insight to connect a specific business problem and Snowflake’s solution and communicate that connection and vision to various technical and executive audiences. This person will have a broad range of skills and experience, ranging from data architecture to ETL, security, performance analysis, and analytics. He/she will have the insight to make the connection between a customer’s specific business problems and Snowflake’s solution, the customer-facing skills to communicate that connection and vision to a wide variety of technical and executive audiences, and the technical skills to not only build demos and execute proofs of concept but also provide consultative assistance on architecture and implementation.

The person we’re looking for shares our passion for reinventing the data platform and thrives in a dynamic environment. That means having the flexibility and willingness to jump in and get it done to make Snowflake and our customers successful. It means keeping up to date on the ever-evolving data and analytics technologies, and working collaboratively with a broad range of people inside and outside the company to be an authoritative resource for Snowflake and its customers.

AS A SOLUTIONS ARCHITECT AT SNOWFLAKE YOU WILL:
Be a technical expert on all aspects of Snowflake.
Present Snowflake technology and vision to executives and technical contributors at customers.
Position yourself as a Trusted Advisor to key customer stakeholders with a focus on achieving their desired Business Outcomes.
Drive project teams towards common goals of accelerating the adoption of Snowflake solutions.
Demonstrate and communicate the value of Snowflake technology throughout the engagement, from demo to proof of concept to running workshops, design sessions, and implementation with customers and stakeholders (a brief connection-and-query sketch follows this listing).
Create repeatable processes and documentation as a result of customer engagements.
Collaborate on and create industry-based solutions that are relevant to other customers in order to drive more value out of Snowflake.
Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are correctly enabled and can extend the capabilities of Snowflake on their own.
Maintain a deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them.
Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments.
Be able to position and sell the value of Snowflake professional services for ongoing delivery.

OUR IDEAL SOLUTIONS ARCHITECT WILL HAVE:
Minimum 10 years of experience working with customers in a pre-sales or post-sales technical role.
University degree in computer science, engineering, mathematics or related fields, or equivalent experience.
Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos.
Understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools.
Strong skills in databases, data warehouses, and data processing.
Extensive hands-on expertise with SQL and SQL analytics.
Proficiency in implementing data security measures, access controls, and design within the Snowflake platform.
Extensive knowledge of and experience with large-scale database technology (e.g. Netezza, Exadata, Teradata, Greenplum, etc.).
Software development experience with Python, Java, Spark, and other scripting languages.
Internal and/or external consulting experience.
Deep collaboration with Account Executives and Sales Engineers on account strategy.

BONUS POINTS FOR EXPERIENCE WITH THE FOLLOWING:
1+ years of practical Snowflake experience
Experience with non-relational platforms and tools for large-scale data processing (e.g. Hadoop, HBase)
Familiarity and experience with common BI and data exploration tools (e.g. MicroStrategy, Looker, Tableau, Power BI)
OLAP data modeling and data architecture experience
Experience and understanding of large-scale infrastructure-as-a-service platforms (e.g. Amazon AWS, Microsoft Azure, GCP, etc.)
Experience using AWS services such as S3, Kinesis, Elastic MapReduce, Data Pipeline
Experience delivering data migration projects
Expertise in a core vertical such as Financial Services, Retail, Media & Entertainment, Healthcare, Life Sciences, etc.
Hands-on experience with Python, Java or Scala.

WHY JOIN OUR PROFESSIONAL SERVICES TEAM AT SNOWFLAKE:
Unique opportunity to work on a truly disruptive software product
Get unique, hands-on experience with bleeding-edge data warehouse technology
Develop, lead and execute an industry-changing initiative
Learn from the best! Join a dedicated, experienced team of professionals.

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact?

For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
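To ground the hands-on side of the role, the snippet below is a minimal sketch of connecting to Snowflake from Python and running an aggregation, the kind of step a demo or proof of concept might start from. The account settings, warehouse, database, and ORDERS table are assumptions for illustration only.

```python
# Minimal sketch: connecting to Snowflake from Python and running an analytics query.
# Account, credentials, warehouse, and the ORDERS table are illustrative assumptions.
import os
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # assumed virtual warehouse
    database="SALES_DB",        # assumed database
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # The kind of aggregation a workshop or proof of concept might walk through.
    cur.execute(
        "SELECT region, SUM(amount) AS total_sales "
        "FROM orders GROUP BY region ORDER BY total_sales DESC"
    )
    for region, total_sales in cur.fetchall():
        print(region, total_sales)
finally:
    conn.close()
```

Keeping credentials in environment variables, as above, also mirrors the data security and access-control emphasis in the qualifications.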
Posted 2 weeks ago
5.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
On-site
Job Description
NIQ is looking for a Senior Data Engineer to join our Financial Services Engineering team. At NIQ, the Financial Services team uses alternative datasets to help global public equity investors (hedge funds, mutual funds, pension funds) make better investment decisions. We work with some of the largest hedge funds in the world. As a Senior Data Engineer, you will be at the cutting edge of the alternative data space, where you will help maintain and improve our data infrastructure, which enables us to develop market research products and deliver data to our customers. In this role, you will also get the opportunity to work with world-class big data and cloud services such as AWS, Azure, Snowflake, Databricks, dbt, Airflow, and Looker. Apply now to start taking your career to the next level.

Who we are looking for:
You have a strong entrepreneurial spirit and a thirst to solve difficult challenges through innovation and creativity, with a strong focus on results
You have a passion for data and the insights it can deliver
You are intellectually curious with a broad range of interests and hobbies
You take ownership of your deliverables
You have excellent analytical and interpersonal skills
You have excellent communication skills with both technical and non-technical audiences
You can work with distributed teams situated globally in different geographies
You want to work in a small team with a start-up mentality
You can work well under pressure, prioritize work, and stay well organized
You relish tackling new challenges, paying attention to detail, and, ultimately, growing professionally

Responsibilities:
Develop robust data flows to connect and maintain large-scale data processing systems, data for analytics, and BI systems (a brief PySpark sketch follows this listing)
5+ years of hands-on programming experience with Python and SQL, including familiarity with stored procedures, Snowflake, and dbt
5+ years of experience with PySpark to design, optimize, and scale distributed data processing pipelines
Experience working on data modeling
Familiarity with a cloud provider (AWS, GCP, Azure) and their data infrastructure services (e.g., S3, EC2)
Utilize programming languages like Python or JavaScript to build robust data pipelines and implement ETL processes
Ensure data quality and accessibility for end-users
Recommend ways to improve data reliability, efficiency, and quality
Collaborate with data scientists, business stakeholders, and IT team members on project goals

Qualifications
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
5+ years’ experience as a Data Engineer, Database Developer, or similar role
Strong knowledge and experience with SQL Server, stored procedures, and OLAP BI tools
Expertise in MDX or similar query languages for complex data analysis
Experience in building and optimizing ‘big data’ pipelines, architectures, and data sets
Strong organizational skills with an ability to manage multiple projects and priorities
Knowledge of Databricks, Spark, Snowflake, or Airflow will be a plus

Additional Information
Enjoy a flexible and rewarding work environment with peer-to-peer recognition platforms
Recharge and revitalize with the help of wellness plans made for you and your family
Plan your future with financial wellness tools
Stay relevant and upskill yourself with career development opportunities

Our Benefits
Flexible working environment
Volunteer time off
LinkedIn Learning
Employee Assistance Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
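As referenced in the responsibilities above, a large part of this role is building distributed transformation steps with PySpark. The sketch below is a minimal, hypothetical example of such a step; the S3 paths, column names, and rollup logic are assumptions, not NIQ's actual pipeline.

```python
# Minimal sketch of a batch transformation step in a PySpark pipeline.
# S3 paths and column names are hypothetical; a production job would add schema
# enforcement, data-quality checks, and orchestration (e.g. Airflow scheduling).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-transactions-rollup").getOrCreate()

# Read raw events landed by an upstream ingestion job.
raw = spark.read.parquet("s3://example-bucket/raw/transactions/")

daily = (
    raw.filter(F.col("status") == "settled")
       .withColumn("txn_date", F.to_date("event_ts"))
       .groupBy("txn_date", "merchant_id")
       .agg(
           F.sum("amount").alias("gross_amount"),
           F.countDistinct("customer_id").alias("unique_customers"),
       )
)

# Write a partitioned, analytics-ready table for BI tools to consume.
daily.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3://example-bucket/curated/daily_merchant_rollup/"
)
spark.stop()
```

A curated output like this is the kind of dataset that downstream Snowflake or Looker layers would typically consume.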
Posted 2 weeks ago