Jobs
Interviews

277 Mdx Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world. Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query – make it easy for customers to bring in, clean, shape, and join data to extract intelligence. We’re the team that developed the Mashup Engine (M) and Power Query. We already ship monthly to millions of users across Excel, Power/Pro BI, Flow, and PowerApps; but in many ways we’re just getting started.
We’re building new services, experiences, and engine capabilities that will broaden the reach of our technologies to several new areas – data “intelligence”, large-scale data analytics, and automated data integration workflows. We plan to use example-based interaction, machine learning, and innovative visualization to make data access and transformation even more intuitive for non-technical users. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Responsibilities
- Engine layer: designing and implementing components for dataflow orchestration, distributed querying, query translation, connecting to external data sources, and script parsing/interpretation
- Service layer: designing and implementing infrastructure for a containerized, microservices-based, high-throughput architecture
- UI layer: designing and implementing performant, engaging web user interfaces for data visualization/exploration/transformation/connectivity and dataflow management
- Embody our culture and values

Qualifications
Required/Minimum Qualifications
- Bachelor's Degree in Computer Science or a related technical discipline AND 6+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python, OR equivalent experience
- Experience with data integration, migrations, or ELT/ETL tooling is mandatory

Preferred/Additional Qualifications
- BS degree in Computer Science
- Engine role: familiarity with data access technologies (e.g. ODBC, JDBC, OLEDB, ADO.Net, OData), query languages (e.g. T-SQL, Spark SQL, Hive, MDX, DAX), query generation/optimization, OLAP
- UI role: familiarity with JavaScript, TypeScript, CSS, React, Redux, webpack
- Service role: familiarity with micro-service architectures, Docker, Service Fabric, Azure blobs/tables/databases, high-throughput services
- Full-stack role: a mix of the qualifications for the UX/service/backend roles

Other Requirements
Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

#azdat #azuredata #microsoftfabric #dataintegration

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
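The engine-layer work above mentions query translation: pushing an in-memory filter down to the data source as generated SQL rather than filtering client-side. A minimal, hypothetical Python sketch of that idea follows; the function and operator names are invented for illustration and do not reflect the actual Mashup Engine internals.

```python
# Hypothetical sketch of "query folding": render a simple filter description
# as a SQL WHERE clause so the source does the filtering. Purely illustrative.
OPS = {"eq": "=", "gt": ">", "lt": "<"}

def fold_filter(table, column, op, value):
    """Render a single-predicate filter as SQL pushed down to the source."""
    if op not in OPS:
        raise ValueError(f"unsupported operator: {op}")
    literal = f"'{value}'" if isinstance(value, str) else str(value)
    return f"SELECT * FROM {table} WHERE {column} {OPS[op]} {literal}"

sql = fold_filter("Sales", "Region", "eq", "APAC")
# -> SELECT * FROM Sales WHERE Region = 'APAC'
```

Real engines also handle predicate composition, dialect differences, and fallback to local evaluation when a source cannot fold the query.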

Posted 15 hours ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Company Description
Making Trade Happen. Coface is a team of 4,500 people of 78 nationalities across nearly 60 countries, all sharing a corporate culture across the world. Together, we work towards one objective: facilitating trade by helping our 50,000 corporate clients develop their businesses. With 75 years of experience, Coface is a leader in the credit insurance and risk management market. We have also developed a range of other value-added services, including factoring, debt collection, Single Risk insurance, bonding, and information services. As a close-knit, international organisation at the core of the global economy, Coface offers an enriching work experience on several levels: relational, professional, and cultural. Every day, our teams are making trade happen. Join us!

Job Description
MISSION: We are seeking an experienced and highly motivated professional to join our team in a role focused on stakeholder management, Power BI dashboard development, and data analysis. The ideal candidate will collaborate with cross-functional teams to address data needs, develop actionable insights through advanced Power BI dashboards, and manage complex data landscapes. Key responsibilities include designing and maintaining Power BI reports, ensuring data accuracy, conducting in-depth analysis to identify trends, and presenting findings to senior stakeholders. The role also requires strong communication skills to translate complex technical concepts into clear, actionable insights for non-technical stakeholders.

Main Responsibilities / Key Requirements:
- Proficiency in Power BI, data visualization, and coding
- Strong analytical skills and the ability to synthesize insights from large, complex datasets
- Experience managing stakeholder expectations and providing training on BI tools
- Transform complex data into easily understandable insights
- Create multi-dimensional data models that are well adjusted to data warehousing practices
- Implement row-level security in Power BI, with a sound understanding of application security layer models
- Data visualization using best practices, with a high end-user focus

Qualifications / Technical Skills:
- 5-10 years of overall experience in software development, with 5+ years of dedicated experience in Power BI
- Well versed in all BI and data warehousing (DWH) concepts and architecture
- Experience working with clients in the APAC region, preferably in the insurance industry
- Familiarity with the tools and technologies of the Microsoft SQL Server BI stack, including SSRS, T-SQL, Power Query, MDX, Power BI, and DAX
- Power BI technical skills: Power BI Desktop & Service, data modeling (DAX & relationships), Power Query (M language), data visualization & UI design, paginated reports (Power BI Report Builder), Power Automate integration
- Data skills: Oracle, SQL, data warehousing, data cleansing & transformation
- Business & analytical skills: requirement gathering, data storytelling, KPI & metrics development
- Administration & security: row-level security (RLS), Power BI Service administration
- Design, build, maintain, and map data models to process raw data from unrelated sources
- Proficiency in financial reporting through Power BI
- Strong knowledge of Oracle, SQL, and relational databases
- Expertise in SQL queries, SSRS, and SQL Server Integration Services (SSIS)
- In-depth understanding of the overall development process for the listed tools: data extraction from various data sources (such as SAP ERP, SAP BW, Oracle, Teradata, Snowflake)
- Knowledge of scripts to import data from databases, flat files, and log files
- Understanding of general accounting principles and financial reporting

Additional Information
- Flexible working model: after the 1st month
- Great place to work: central and brand-new offices
- Opportunities to learn: €450 budget every year for training, a languages platform, an e-learning platform, and a dedicated development program
- Career opportunities: build your career (both locally and internationally) in a large global company, one of the world leaders in its field
- Health care:
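The posting above calls for row-level security (RLS) in Power BI. Conceptually, RLS attaches a filter predicate to each role so that every query a user runs only sees the rows their role permits. The following Python sketch illustrates just that concept; the role names and sample rows are invented, and this is not how Power BI implements RLS internally.

```python
# Conceptual illustration of row-level security: each role carries a
# predicate, and every read is filtered through the caller's predicate.
ROLE_FILTERS = {
    "apac_analyst": lambda row: row["region"] == "APAC",
    "emea_analyst": lambda row: row["region"] == "EMEA",
    "admin":        lambda row: True,   # admins see everything
}

SALES = [
    {"region": "APAC", "amount": 120},
    {"region": "EMEA", "amount": 90},
    {"region": "APAC", "amount": 50},
]

def query_sales(role):
    """Return only the rows the role's RLS predicate allows."""
    keep = ROLE_FILTERS[role]
    return [row for row in SALES if keep(row)]
```

In Power BI itself the predicate is a DAX expression defined on a role (e.g. `[region] = "APAC"`), and the service applies it automatically to every visual and query for members of that role.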

Posted 19 hours ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Avaya
Avaya is an enterprise software leader that helps the world’s largest organizations and government agencies forge unbreakable connections. The Avaya Infinity™ platform unifies fragmented customer experiences, connecting the channels, insights, technologies, and workflows that together create enduring customer and employee relationships. We believe success is built through strong connections – with each other, with our work, and with our mission. At Avaya, you'll find a community that values your contributions and supports your growth every step of the way. Learn more at https://www.avaya.com

Job Information
Job Code: 00270233
Job Family: Information Technology
Job Function: Applications

About The Job
Business Intelligence developer with 5-8 years’ experience designing and developing end-to-end BI solutions using the Kimball methodology and the Microsoft BI stack (SQL Server, SSIS, SSAS, and SSRS). Knowledge of the Google data platform would be a plus.

About The Roles And Responsibilities
- Meet with business users to review requirements; propose, design, document, and implement solutions
- Support and train business users on using the solution
- Participate in code reviews and brown-bag sessions, and advocate best practices during development
- Support is required during US business hours

About The Skills And Qualifications
- Using SSIS as an ETL tool to design packages for high performance
- Can develop SSIS script components and tasks using C# or VB
- Designing multidimensional cubes in SSAS (creating calculations, actions, security)
- Developing parameterized/hierarchical reports in SSRS using SSAS as a data source
- Knowledgeable in using MDX and T-SQL to query data
- Familiar with source control systems such as TFS
- Familiar with using Excel to analyze data

Experience
5 - 8 Years of Experience

Education
Bachelor's degree or equivalent experience

Footer
Avaya is an Equal Opportunity employer and a U.S. Federal Contractor. Our commitment to equality is a core value of Avaya. All qualified applicants and employees receive equal treatment without consideration for race, religion, sex, age, sexual orientation, gender identity, national origin, disability, status as a protected veteran or any other protected characteristic. In general, positions at Avaya require the ability to communicate and use office technology effectively. Physical requirements may vary by assigned work location. This job brief/description is subject to change. Nothing in this job description restricts Avaya's right to alter the duties and responsibilities of this position at any time for any reason. You may also review the Avaya Global Privacy Policy (accessible at https://www.avaya.com/en/privacy/policy/) and the applicable Privacy Statement relevant to this job posting (accessible at https://www.avaya.com/en/documents/info-applicants.pdf).

Posted 2 days ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Company Description
Making Trade Happen. Coface is a team of 4,500 people of 78 nationalities across nearly 60 countries, all sharing a corporate culture across the world. Together, we work towards one objective: facilitating trade by helping our 50,000 corporate clients develop their businesses. With 75 years of experience, Coface is a leader in the credit insurance and risk management market. We have also developed a range of other value-added services, including factoring, debt collection, Single Risk insurance, bonding, and information services. As a close-knit, international organisation at the core of the global economy, Coface offers an enriching work experience on several levels: relational, professional, and cultural. Every day, our teams are making trade happen. Join us!

Job Description
MISSION: We are seeking an experienced and highly motivated professional to join our team in a role focused on stakeholder management, Power BI dashboard development, and data analysis. The ideal candidate will collaborate with cross-functional teams to address data needs, develop actionable insights through advanced Power BI dashboards, and manage complex data landscapes. Key responsibilities include designing and maintaining Power BI reports, ensuring data accuracy, conducting in-depth analysis to identify trends, and presenting findings to senior stakeholders. The role also requires strong communication skills to translate complex technical concepts into clear, actionable insights for non-technical stakeholders.

Main Responsibilities / Key Requirements:
- Proficiency in Power BI, data visualization, and coding
- Strong analytical skills and the ability to synthesize insights from large, complex datasets
- Experience managing stakeholder expectations and providing training on BI tools
- Transform complex data into easily understandable insights
- Create multi-dimensional data models that are well adjusted to data warehousing practices
- Implement row-level security in Power BI, with a sound understanding of application security layer models
- Data visualization using best practices, with a high end-user focus

Qualifications / Technical Skills:
- 5-10 years of overall experience in software development, with 5+ years of dedicated experience in Power BI
- Well versed in all BI and data warehousing (DWH) concepts and architecture
- Experience working with clients in the APAC region, preferably in the insurance industry
- Familiarity with the tools and technologies of the Microsoft SQL Server BI stack, including SSRS, T-SQL, Power Query, MDX, Power BI, and DAX
- Power BI technical skills: Power BI Desktop & Service, data modeling (DAX & relationships), Power Query (M language), data visualization & UI design, paginated reports (Power BI Report Builder), Power Automate integration
- Data skills: Oracle, SQL, data warehousing, data cleansing & transformation
- Business & analytical skills: requirement gathering, data storytelling, KPI & metrics development
- Administration & security: row-level security (RLS), Power BI Service administration
- Design, build, maintain, and map data models to process raw data from unrelated sources
- Proficiency in financial reporting through Power BI
- Strong knowledge of Oracle, SQL, and relational databases
- Expertise in SQL queries, SSRS, and SQL Server Integration Services (SSIS)
- In-depth understanding of the overall development process for the listed tools: data extraction from various data sources (such as SAP ERP, SAP BW, Oracle, Teradata, Snowflake)
- Knowledge of scripts to import data from databases, flat files, and log files
- Understanding of general accounting principles and financial reporting

Additional Information
- Flexible working model: after the 1st month
- Great place to work: central and brand-new offices
- Opportunities to learn: €450 budget every year for training, a languages platform, an e-learning platform, and a dedicated development program
- Career opportunities: build your career (both locally and internationally) in a large global company, one of the world leaders in its field
- Health care:

Posted 2 days ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Skills
Must have:
- 7+ years of hands-on experience in BI development with a focus on SSIS and SSAS
- Familiarity with all aspects of the SDLC
- Detailed experience with SQL Server, Analysis Services, Integration Services, Reporting Services (SSRS and Power BI), and MDX queries for cubes
- Experience with SSAS multi-cube designs
- Excellent system design skills in a SQL Server Business Intelligence environment
- Experience with source control (Git) and Jenkins

Domain Knowledge
- Knowledge of Banking and Markets/Treasury products is highly desirable
- Ability to handle the complexity and dynamic nature of the Financial Services environment, which requires applications to adapt, and to be flexible and learn quickly in a complex environment

Nice to have:
- Experience with other BI tools such as Power BI or Tableau
- Knowledge of data warehousing concepts and technologies (e.g., Azure Data Factory, Snowflake, or Google BigQuery)
- Familiarity with Agile methodologies and DevOps practices for CI/CD in BI development
- Knowledge of MDX (Multidimensional Expressions) and DAX (Data Analysis Expressions)
- Experience in automating and scheduling jobs using SQL Server Agent or third-party tools
- Exposure to cloud-based BI solutions like Azure Synapse Analytics or AWS Redshift
- Understanding of financial data and reporting requirements
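This role centers on MDX queries against SSAS cubes. An MDX SELECT essentially aggregates a measure over the members of chosen dimensions, e.g. placing `[Measures].[Sales]` on columns and `[Product].Members` on rows. The Python sketch below imitates that aggregation over a toy fact set to show the underlying idea; the dimensions and numbers are invented for illustration.

```python
# Toy "cube": facts keyed by (year, product) dimensions with a sales measure.
# Summing the measure grouped by one dimension mimics an MDX slice such as
#   SELECT [Measures].[Sales] ON COLUMNS, [Product].Members ON ROWS FROM [Cube]
FACTS = [
    (2023, "Bikes", 100),
    (2023, "Cars", 200),
    (2024, "Bikes", 150),
    (2024, "Cars", 250),
]

def slice_by(dimension_index):
    """Sum the sales measure grouped by one dimension (0 = year, 1 = product)."""
    totals = {}
    for fact in FACTS:
        key = fact[dimension_index]
        totals[key] = totals.get(key, 0) + fact[2]
    return totals
```

A real SSAS cube precomputes many of these aggregations so such slices answer in milliseconds even over billions of facts; that precomputation is what cube design and MDX tuning are about.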

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

JD for the MDx – P1 position – Junior Molecular Biologist

Main responsibilities
- Documentation of the MDx lab from sample receipt to processing, including Excel sheets for sample receipt at MDx, sample comments, preliminary processing observations, etc.
- Assist with tracking DNA (storage) and help in the timely retrieval of samples per client requests; document the retrieval appropriately
- Assist with preanalytical steps such as sample inspection and comment documentation, and TRF scanning and copying
- Conduct DNA extraction from different types of samples
- Perform sample dissection (applicable to chorionic villi and other tissue sample types)
- Carry out quantification of DNA as needed
- Agarose gel electrophoresis
- Assist with, and later carry out, other molecular tests: PCR tests, QF-PCR, MCC, NGS library preparation
- Assist in tracking and maintaining molecular lab inventory (consumables and kits)
- Learn and assist in the conduct of other new tests; independently conduct new tests as required
- Assist with validation of new tests as needed
- Assist with and conduct SOP drafting and proofreading
- Any other work allotted by the manager and Lab Director from time to time
- Assist and cross-train in other departments, such as molecular laboratory tests or biochemical genetic screening tests, as required

Required Competence
Education – Preferred: M.Sc. in Genetics/Molecular Biology, Biotechnology, or other related life sciences, or Biology/Biotechnology with at least basic knowledge of laboratory work. A Bachelor's degree in a relevant field (biotechnology, genetics) may be considered, contingent on the non-availability of candidates with the preferred degree and based on urgency.
Required Work Experience Related to the Position – Any prior molecular biology experience will be an advantage.

Special Requirements / Skills / Attributes
- DNA isolation
- DNA quantification
- Agarose gel electrophoresis
- Handling biological samples
- PCR
- Good Laboratory Practice

Additional Skills: cell/tissue culture, Microsoft Office 2010, quality control in biological assays, validating biological assays
Soft Skills: good communication; motivation to work hard and, if needed, extra hours; motivation to learn new skills

Competency Required
- Strategic Perspective: successfully complete individual goals set as part of the goal-setting process; ensure the preliminary process (DNA isolation) is always completed in a timely and quality manner to ensure sample success within the proposed TAT
- Market Focus: ensure quality is always given priority and maintained at the company's high standards
- Leadership: take charge of trainees and new staff, and ensure all documentation and preliminary processes are completed; take the initiative to increase personal effectiveness and performance; participate in discussions pertaining to new tests; present views and suggestions for new tests and for process improvement of existing tests

Posted 2 days ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be working as a Business Intelligence Engineer with insightsoftware, a global provider of comprehensive solutions for the Office of the CFO. Your primary responsibility will be analyzing business requirements to model, design, deploy, and maintain data warehouse implementations for customers across different verticals. Your work timings will follow the EST time zone (9 AM-6 PM), and you will be located in Hyderabad. Reporting to the Professional Services Manager, you will collaborate with a team to engage business stakeholders, customers, end users, and team members to develop analytics and reporting applications encompassing dashboards, scorecards, and performance analytics. Your tasks will include translating business requirements into data models; assisting in estimation and implementation planning; designing and deploying data warehouses, ETL processes using SQL Server Integration Services, and cubes for multi-dimensional analysis; and gathering requirements from customers to scope projects.

To qualify for this position, you should have:
- 6 to 9 years of experience, with 5+ years in SQL/T-SQL
- A good understanding of MDX and DAX
- 3+ years of experience with dimensional models and Microsoft SQL Server Management Studio, including SSIS and SSAS
- Experience designing and deploying ETL architectures and packages
- Proficiency working with Azure and reporting tools
- Effective communication skills with customers and internal cross-functional partners
- Strong troubleshooting and problem-solving abilities

Please note that all information will be kept confidential according to EEO guidelines. insightsoftware is unable to provide sponsorship to candidates who are not eligible to work in the country where the position is located. Background checks may be required for employment with insightsoftware, where permitted by country or state/province.
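The warehouse work described here revolves around dimensional models: conformed dimension tables plus fact tables keyed to them, loaded by an ETL process. The self-contained sketch below shows that load pattern using Python's stdlib sqlite3 as a stand-in warehouse; table names, the date-key scheme, and the sample rows are illustrative, and in practice this would be SSIS packages targeting SQL Server.

```python
import sqlite3

# Minimal dimensional-model load: a date dimension and a fact table keyed to it.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
    CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date, amount REAL);
""")

def load(rows):
    """ETL step: upsert the date dimension, then insert facts against it."""
    for iso_date, amount in rows:
        date_key = int(iso_date.replace("-", ""))  # e.g. 2024-01-05 -> 20240105
        conn.execute("INSERT OR IGNORE INTO dim_date VALUES (?, ?)",
                     (date_key, iso_date))
        conn.execute("INSERT INTO fact_sales VALUES (?, ?)", (date_key, amount))

load([("2024-01-05", 100.0), ("2024-01-05", 50.0), ("2024-01-06", 75.0)])
daily = conn.execute("""
    SELECT d.iso_date, SUM(f.amount) FROM fact_sales f
    JOIN dim_date d USING (date_key)
    GROUP BY d.iso_date ORDER BY d.iso_date
""").fetchall()
```

The dimension insert is idempotent (`INSERT OR IGNORE` on the surrogate key), which is why repeated dates in the batch don't duplicate dimension rows; reporting then joins facts back to the dimension and aggregates.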

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Essbase ETL and Application Developer position, based in Chennai, involves delivering solutions that extract data from upstream systems and developing load scripts for Oracle Essbase backend applications. As an Essbase application engineer, you will be responsible for maintaining and enhancing budgeting, forecasting, and long-term planning systems. You will serve as a Subject Matter Expert (SME) supporting multiple Planning and Consolidation applications in Oracle Essbase environments, both IaaS/PaaS and SaaS EPM Cloud. Your tasks will include performance tuning and optimization of applications, and providing system support to users. You should be comfortable working both independently and as part of a team.

Key responsibilities include conducting various testing phases and implementing partitioning, automation, optimization, and performance tuning of Essbase application data/metadata processing. You will also be involved in developing and maintaining Block Storage Option (BSO) and Aggregate Storage Option (ASO) cubes, and creating data forms, calculation scripts, automation using MaxL, batch scripts, and business rules.

The ideal candidate should have at least 5 years of experience with Oracle 19c or 21c IaaS/PaaS Essbase applications; an additional 2+ years of experience in Oracle EPM Hyperion Planning design, development, or administration is a plus. Proficiency in writing complex code in calculation scripts and business rules is essential. Experience with EPBCS components such as metadata administration, outlines, dimensions, complex calculations, and security setup is highly beneficial. You should also possess expertise in metadata upload processes and in creating automations for seamless metadata loading. Being proactive in providing alternative solutions based on best practices and application functionality is a key attribute. Financial process and functional knowledge are preferred, along with the ability to analyze business needs and develop solutions that support the business effectively.

In summary, the Essbase ETL and Application Developer role in Chennai requires a candidate with a strong background in Oracle Essbase applications and ETL processes, and a proactive approach to problem-solving and application development.
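The BSO/ASO cube work this role describes ultimately comes down to rolling up an outline: upper-level members aggregate their children, whether via a calc script (BSO) or dynamic aggregation (ASO). The following tiny Python sketch illustrates that rollup idea; the outline hierarchy and values are made up and this is not Essbase syntax.

```python
# Toy outline: parent members aggregate their children, as an Essbase
# calc/ASO rollup would. Hierarchy and numbers are invented.
OUTLINE = {
    "Total Expenses": ["Travel", "Payroll"],
    "Payroll": ["Salaries", "Benefits"],
}
LEAF_VALUES = {"Travel": 10, "Salaries": 100, "Benefits": 25}

def rollup(member):
    """Return a member's value, summing children for upper-level members."""
    if member in LEAF_VALUES:
        return LEAF_VALUES[member]
    return sum(rollup(child) for child in OUTLINE[member])
```

The BSO/ASO distinction is essentially about when this happens: BSO calc scripts materialize parent values up front, while ASO aggregates them (mostly) at query time from stored leaf-level data.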

Posted 2 days ago

Apply

4.0 - 11.0 years

0 Lacs

Haryana

On-site

You will be responsible for developing visual reports, dashboards, and KPI scorecards using Power BI Desktop. Connecting Power BI to various data sources, importing data, and transforming data per project requirements will be a key part of your role. You should be able to create data flows, including complex ones and ETL flows, over Power BI. Writing SQL queries and a strong understanding of database fundamentals, such as multidimensional and relational database design, are essential. Proficiency in creating complex DAX queries, Power Query, bookmarks, and SQL is required. Implementing row-level security on data and understanding application security layer models in Power BI are crucial aspects of the job. Additionally, developing tabular/multidimensional models that align with warehouse standards using M queries is part of your responsibilities.

As a suitable candidate, you should hold a Bachelor's/Master's degree in computer science/engineering, operations research, or related analytics areas; candidates with BA/BS degrees in the same fields from top-tier academic institutions are also encouraged to apply. 4-11 years of relevant experience in analytics is preferred. Expertise in model development for reports, data transformations, and modeling using the Power BI Query Editor is necessary. You will develop and integrate with various data sources, using varied data connectors to in-house or cloud data stores to stage and shape data for reporting and analytics solutions. Creating relationships between data and developing tabular and other multidimensional data models will be part of your daily tasks. Handling complex data queries and client requirements, with hands-on experience with the tools and systems in the MS SQL Server BI stack (including SSRS, T-SQL, Power Query, MDX, Power BI, Power Pivot, and DAX), is crucial for successfully designing, modeling, and implementing end-to-end Power BI solutions. Experience with data gateways for data refresh is expected. Investigating and troubleshooting Power BI reports and data models, including resolving data issues and performing data validation and balancing, will be part of your responsibilities. Knowledge of the insurance industry is preferred. Excellent written and verbal communication skills are essential for this role. A notice period of 15-30 days is preferable.

Posted 3 days ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Description
The IN Data Engineering & Analytics (IDEA) team is looking to hire a rock-star Data Engineer to build and manage the largest petabyte-scale data infrastructure in India for Amazon India businesses. IDEA is the central data engineering and analytics team for all A.in businesses. The team's charter includes: 1) providing Unified Data and Analytics Infrastructure (UDAI) for all A.in teams, which includes a central petabyte-scale Redshift data warehouse, analytics infrastructure, frameworks for visualizing and automating the generation of reports and insights, and self-service data applications for ingesting, storing, discovering, processing, and querying data; and 2) providing business-specific data solutions for various business streams like Payments, Finance, and Consumer & Delivery Experience. The Data Engineer will play a key role as a strong owner of our Data Platform. They will own and build data pipelines, automations, and solutions to ensure the availability, system efficiency, IMR efficiency, scaling, expansion, operations, and compliance of the data platform that serves 200+ IN businesses. The role sits at the heart of the technology and business worlds and provides opportunities for growth, high business impact, and working with seasoned business leaders. An ideal candidate will have a sound technical background in managing large data infrastructures, working with petabyte-scale data, building scalable data solutions and automations, and driving operational excellence. An ideal candidate is a self-starter who can start with a platform requirement and work backwards to conceive and devise the best possible solution; a good communicator while driving customer interactions; a passionate learner of new technology when the need arises; a strong owner of every deliverable in the team; obsessed with customer delight and business impact; and someone who gets work done in business time.

Key job responsibilities
- Design/implement automation and manage our massive data infrastructure to scale for the analytics needs of Amazon IN
- Build solutions to achieve BAA (Best At Amazon) standards for system efficiency, IMR efficiency, data availability, consistency, and compliance
- Enable efficient data exploration and experimentation on large datasets on our data platform, and implement data access control mechanisms for stand-alone datasets
- Design and implement scalable and cost-effective data infrastructure to enable non-IN (Emerging Marketplaces and WW) use cases on our data platform
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL, Amazon, and AWS big data technologies
- Possess strong verbal and written communication skills, be self-driven, and deliver high-quality results in a fast-paced environment
- Drive operational excellence strongly within the team, and build automation and mechanisms to reduce operations
- Enjoy working closely with your peers in a group of very smart and talented engineers

A day in the life
The India Data Engineering and Analytics (IDEA) team is the central data engineering team for Amazon India. Our vision is to simplify and accelerate data-driven decision making for Amazon India by providing cost-effective, easy, and timely access to high-quality data. We achieve this through UDAI (Unified Data & Analytics Infrastructure for Amazon India), which serves as a central data platform and provides data engineering infrastructure, ready-to-use datasets, and self-service reporting capabilities. Our core responsibilities towards the India marketplace include a) providing systems (infrastructure) and workflows that allow ingestion, storage, processing, and querying of data; b) building ready-to-use datasets for easy and faster access to the data; c) automating standard business analysis, reporting, and dashboarding; and d) empowering the business with self-service tools to manage data and generate insights.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI, or Datastage

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A3044196
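The posting stresses owning pipelines with availability and consistency guarantees. A common building block for that is making each load idempotent: merging keyed staging rows into the target so a rerun of the same batch changes nothing. The Python sketch below illustrates the pattern with an in-memory target; the key names and rows are invented, and this is not Amazon's actual tooling (in a warehouse this is typically a staged `MERGE`/upsert).

```python
# Idempotent load sketch: keyed staging rows are merged into the target so
# re-running the same batch leaves the target unchanged. Purely illustrative.
target = {}  # primary key -> row

def merge_batch(batch):
    """Upsert each (key, row) pair; reruns of the same batch are no-ops."""
    for key, row in batch:
        target[key] = row
    return len(target)

batch = [("order-1", {"amt": 10}), ("order-2", {"amt": 20})]
merge_batch(batch)
merge_batch(batch)  # rerun after a retry: no duplicate rows
```

Idempotence is what lets a pipeline retry failed loads safely, which in turn is what makes availability and consistency targets achievable in practice.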

Posted 3 days ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Description
The IN Data Engineering & Analytics (IDEA) team is looking to hire a rock star Data Engineer to build and manage the largest petabyte-scale data infrastructure in India for Amazon India businesses. IDEA is the central data engineering and analytics team for all A.in businesses. The team's charter includes: 1) providing Unified Data and Analytics Infrastructure (UDAI) for all A.in teams, which includes a central petabyte-scale Redshift data warehouse, analytics infrastructure and frameworks for visualizing and automating generation of reports & insights, and self-service data applications for ingesting, storing, discovering, processing & querying of the data; 2) providing business-specific data solutions for various business streams like Payments, Finance, and Consumer & Delivery Experience.
The Data Engineer will play a key role as a strong owner of our data platform. He/she will own and build data pipelines, automations and solutions to ensure the availability, system efficiency, IMR efficiency, scaling, expansion, operations and compliance of the data platform that serves 200+ IN businesses. The role sits at the heart of the technology & business worlds and provides opportunities for growth, high business impact and working with seasoned business leaders. An ideal candidate will have a sound technical background in managing large data infrastructures, working with petabyte-scale data, building scalable data solutions/automations and driving operational excellence. He/she will be a self-starter who can start with a platform requirement and work backwards to conceive and devise the best possible solution, a good communicator while driving customer interactions, a passionate learner of new technology when the need arises, a strong owner of every deliverable in the team, obsessed with customer delight and business impact, and someone who ‘gets work done’ in business time.
Key job responsibilities
- Design/implement automation and manage our massive data infrastructure to scale for the analytics needs of Amazon IN.
- Build solutions to achieve BAA (Best At Amazon) standards for system efficiency, IMR efficiency, data availability, consistency & compliance.
- Enable efficient data exploration and experimentation on large datasets on our data platform, and implement data access control mechanisms for stand-alone datasets.
- Design and implement scalable and cost-effective data infrastructure to enable non-IN (Emerging Marketplaces and WW) use cases on our data platform.
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL, Amazon and AWS big data technologies.
- Must possess strong verbal and written communication skills, be self-driven, and deliver high-quality results in a fast-paced environment.
- Drive operational excellence within the team and build automation and mechanisms to reduce operational load.
- Enjoy working closely with your peers in a group of very smart and talented engineers.

A day in the life
The India Data Engineering and Analytics (IDEA) team is the central data engineering team for Amazon India. Our vision is to simplify and accelerate data-driven decision making for Amazon India by providing cost-effective, easy & timely access to high-quality data. We achieve this by providing UDAI (Unified Data & Analytics Infrastructure for Amazon India), which serves as a central data platform and provides data engineering infrastructure, ready-to-use datasets and self-service reporting capabilities.
Our core responsibilities towards the India marketplace include: a) providing systems (infrastructure) & workflows that allow ingestion, storage, processing and querying of data; b) building ready-to-use datasets for easy and faster access to the data; c) automating standard business analysis, reporting and dashboarding; d) empowering business with self-service tools to manage data and generate insights.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI, DataStage, etc.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A3044205

Posted 3 days ago

Apply

4.0 years

6 - 8 Lacs

Chennai

On-site

Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers who take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!

Job Description Summary:
As a QA Test Engineer – Tosca, you will be an important member of Rockwell Automation's Global Quality Assurance Centre of Excellence. You will collaborate with teams to lead quality plans, implement automation strategies, and ensure best practices across the software development lifecycle. You will report to the Engineering Manager - IT and work in a hybrid capacity from our Chennai, India office.

Your Responsibilities:
- Develop and implement test strategies, automation frameworks, and test cases for functional, regression, performance, and data integrity testing.
- Collaborate with business and development teams to ensure test coverage, shift-left practices, and quality improvements.
- Provide training, workshops, and governance to promote QA best practices across teams.
- Manage test environments, tools, and data configurations to support enterprise-wide testing efforts.
- Conduct root cause analysis and code reviews, and contribute to architectural improvements for testability.

The Essentials - You Will Have:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent professional experience.
- Tosca Certifications: AS1 and AS2.
- 4+ years of experience in test automation or software development.
- Hands-on experience with test automation tools (e.g., Tosca, Selenium, UFT), test management tools (e.g., Jira, qTest), and scripting languages.
- Familiarity with Agile methodologies and DevOps practices.

The Preferred - You Might Also Have:
- Experience with Tosca DI and data integrity testing.
- Proficiency in SQL, DAX, MDX, and ETL/data pipeline testing.
- Exposure to performance testing tools such as JMeter, Gatling, or Postman.
- Knowledge of CI/CD tools like Azure DevOps and Jenkins.
- Understanding of BDD frameworks and enterprise test tool administration.

What We Offer:
Our benefits package includes:
- Comprehensive mindfulness programmes with a premium membership to Calm
- Volunteer Paid Time Off available after 6 months of employment for eligible employees
- Company volunteer and donation matching programme – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation
- Employee Assistance Program
- Personalised wellbeing programmes through our OnTrack programme
- On-demand digital course library for professional development, and other local benefits!

At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles. #LI-Hybrid #LI-SM1

Rockwell Automation’s hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.

Posted 4 days ago

Apply

3.0 - 6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Qualification
- OLAP, data engineering, data warehousing, ETL
- Hadoop ecosystem, or AWS, Azure or GCP cluster and processing
- Experience working on Hive, Spark SQL, Redshift or Snowflake
- Experience in writing and troubleshooting SQL programs or MDX queries
- Experience of working on Linux
- Experience in Microsoft Analysis Services (SSAS) or other OLAP tools
- Tableau, MicroStrategy or any BI tool
- Expertise in programming in Python, Java or shell script would be a plus

Role
- Be the front-end person of the world’s most scalable OLAP product company – Kyvos Insights.
- Interact with the senior-most technical and business people of large enterprises to understand their big data strategy and their problem statements in that area.
- Create, present, align customers with, and implement solutions around Kyvos products for the most challenging enterprise BI/DW problems.
- Be the go-to person for prospects regarding technical issues during the POV stage.
- Be instrumental in reading the pulse of the big data market and defining the roadmap of the product.
- Lead a few small but highly efficient teams of big data engineers.
- Report task status efficiently to stakeholders and customers.
- Good verbal & written communication skills.
- Be willing to work off-hours to meet timelines.
- Be willing to travel or relocate as per project requirements.

Experience
3 to 6 years

Job Reference Number
10350

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Qualification
Required:
- Proven hands-on experience in designing, developing and supporting database projects for analysis in a demanding environment.
- Proficient in database design techniques – relational and dimensional designs.
- Experience and a strong understanding of the business analysis techniques used.
- High proficiency in the use of SQL or MDX queries.
- Ability to manage multiple maintenance, enhancement and project-related tasks.
- Ability to work independently on multiple assignments and to work collaboratively within a team.
- Strong communication skills with both internal team members and external business stakeholders.

Added Advantage:
- Hadoop ecosystem, or AWS, Azure or GCP cluster and processing.
- Experience working on Hive, Spark SQL, Redshift or Snowflake will be an added advantage.
- Experience of working on Linux systems.
- Experience with Tableau, MicroStrategy, Power BI or any other BI tool will be an added advantage.
- Expertise in programming in Python, Java or shell script would be a plus.

Roles & Responsibilities:
- Be the front-end person of the world’s most scalable OLAP product company – Kyvos Insights.
- Interact with the senior-most technical and business people of large enterprises to understand their big data strategy and their problem statements in that area.
- Create, present, align customers with, and implement solutions around Kyvos products for the most challenging enterprise BI/DW problems.
- Be the go-to person for customers regarding technical issues during the project.
- Be instrumental in reading the pulse of the big data market and defining the roadmap of the product.
- Lead a few small but highly efficient teams of big data engineers.
- Report task status efficiently to stakeholders and customers.
- Good verbal & written communication skills.
- Be willing to work off-hours to meet timelines.
- Be willing to travel or relocate as per project requirements.

Experience
5 to 10 years

Job Reference Number
11078

Posted 4 days ago

Apply

3.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Qualification
Pre-Sales Solution Engineer - India

Experience Areas Or Skills:
- Pre-sales experience with software or analytics products
- Excellent verbal & written communication skills
- OLAP tools or Microsoft Analysis Services (MSAS)
- Data engineering, data warehousing or ETL
- Hadoop ecosystem, or AWS, Azure or GCP cluster and processing
- Tableau, MicroStrategy or any BI tool
- HiveQL, Spark SQL, PL/SQL or T-SQL
- Writing and troubleshooting SQL programs or MDX queries
- Working on Linux; programming in Python, Java or JavaScript would be a plus
- Filling out RFPs or questionnaires from customers
- NDA, success criteria, project closure and other documentation
- Be willing to travel or relocate as per requirements

Role:
- Act as the main point of contact for customer contacts involved in the evaluation process
- Give product demonstrations to qualified leads
- Give product demonstrations in support of marketing activity such as events or webinars
- Own RFP, NDA, PoC success criteria, PoC closure and other documents
- Secure alignment on process and documents with the customer/prospect
- Own the technical win phases of all active opportunities
- Understand the customer's domain and database schema
- Provide OLAP and reporting solutions
- Work closely with customers to understand and resolve environment, OLAP cube or reporting-related issues
- Coordinate with the solutioning team on execution of PoCs as per the success plan
- Create enhancement requests or identify requests for new features on behalf of customers or hot prospects

Experience
3 to 6 years

Job Reference Number
10771

Posted 4 days ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers who take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!

Job Description Summary
As a QA Test Engineer – Tosca, you will be an important member of Rockwell Automation's Global Quality Assurance Centre of Excellence. You will collaborate with teams to lead quality plans, implement automation strategies, and ensure best practices across the software development lifecycle. You will report to the Engineering Manager - IT and work in a hybrid capacity from our Chennai, India office.

Your Responsibilities
- Develop and implement test strategies, automation frameworks, and test cases for functional, regression, performance, and data integrity testing.
- Collaborate with business and development teams to ensure test coverage, shift-left practices, and quality improvements.
- Provide training, workshops, and governance to promote QA best practices across teams.
- Manage test environments, tools, and data configurations to support enterprise-wide testing efforts.
- Conduct root cause analysis and code reviews, and contribute to architectural improvements for testability.

The Essentials - You Will Have
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent professional experience.
- Tosca Certifications: AS1 and AS2.
- 4+ years of experience in test automation or software development.
- Hands-on experience with test automation tools (e.g., Tosca, Selenium, UFT), test management tools (e.g., Jira, qTest), and scripting languages.
- Familiarity with Agile methodologies and DevOps practices.

The Preferred - You Might Also Have
- Experience with Tosca DI and data integrity testing.
- Proficiency in SQL, DAX, MDX, and ETL/data pipeline testing.
- Exposure to performance testing tools such as JMeter, Gatling, or Postman.
- Knowledge of CI/CD tools like Azure DevOps and Jenkins.
- Understanding of BDD frameworks and enterprise test tool administration.

What We Offer
Our benefits package includes:
- Comprehensive mindfulness programmes with a premium membership to Calm
- Volunteer Paid Time Off available after 6 months of employment for eligible employees
- Company volunteer and donation matching programme – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation
- Employee Assistance Program
- Personalised wellbeing programmes through our OnTrack programme
- On-demand digital course library for professional development, and other local benefits!

At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles.

Rockwell Automation’s hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.

Posted 4 days ago

Apply

4.0 years

0 Lacs

Delhi, India

On-site

Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers who take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!

Job Description Summary
As a QA Test Engineer – Tosca, you will be an important member of Rockwell Automation's Global Quality Assurance Centre of Excellence. You will collaborate with teams to lead quality plans, implement automation strategies, and ensure best practices across the software development lifecycle. You will report to the Engineering Manager - IT and work in a hybrid capacity from our Chennai, India office.

Your Responsibilities
- Develop and implement test strategies, automation frameworks, and test cases for functional, regression, performance, and data integrity testing.
- Collaborate with business and development teams to ensure test coverage, shift-left practices, and quality improvements.
- Provide training, workshops, and governance to promote QA best practices across teams.
- Manage test environments, tools, and data configurations to support enterprise-wide testing efforts.
- Conduct root cause analysis and code reviews, and contribute to architectural improvements for testability.

The Essentials - You Will Have
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent professional experience.
- Tosca Certifications: AS1 and AS2.
- 4+ years of experience in test automation or software development.
- Hands-on experience with test automation tools (e.g., Tosca, Selenium, UFT), test management tools (e.g., Jira, qTest), and scripting languages.
- Familiarity with Agile methodologies and DevOps practices.

The Preferred - You Might Also Have
- Experience with Tosca DI and data integrity testing.
- Proficiency in SQL, DAX, MDX, and ETL/data pipeline testing.
- Exposure to performance testing tools such as JMeter, Gatling, or Postman.
- Knowledge of CI/CD tools like Azure DevOps and Jenkins.
- Understanding of BDD frameworks and enterprise test tool administration.

What We Offer
Our benefits package includes:
- Comprehensive mindfulness programmes with a premium membership to Calm
- Volunteer Paid Time Off available after 6 months of employment for eligible employees
- Company volunteer and donation matching programme – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation
- Employee Assistance Program
- Personalised wellbeing programmes through our OnTrack programme
- On-demand digital course library for professional development, and other local benefits!

At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles.

Rockwell Automation’s hybrid policy is that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

You are an experienced Oracle EPM Planning Consultant responsible for supporting the implementation, enhancement, and optimization of Oracle EPM solutions. Your expertise lies in Oracle Planning and Budgeting Cloud Service (PBCS) / Enterprise Planning and Budgeting Cloud Service (EPBCS), financial modeling, forecasting, and integration with other enterprise applications. Strong skills in Hyperion Planning, Essbase, data management, and scripting (Groovy, MDX, SQL) are preferred. Your role involves hands-on configuration, stakeholder collaboration, and troubleshooting to ensure seamless planning and reporting processes.

Your responsibilities include leading the design, configuration, and implementation of Oracle EPBCS solutions. You will develop and customize EPBCS modules such as Financials, Workforce, Capital, Projects, and Strategic Modeling. You will collaborate with finance teams to understand planning, budgeting, and forecasting needs, and design and implement driver-based planning, rolling forecasts, and scenario analysis. You will also develop and optimize business rules, Groovy scripts, and calculation scripts for dynamic planning processes.

Data integration and management tasks include configuring Data Management (DM/FDMEE) for seamless data loads from ERP and other systems, developing ETL processes, and ensuring data accuracy and consistency across financial planning models. Performance optimization and troubleshooting are key aspects of your role: you will optimize EPBCS models, Essbase cubes, and calculation scripts for improved performance. Additionally, you will conduct training sessions, create user documentation, and provide post-implementation support to ensure smooth user adoption and compliance with best practices. Gathering business requirements, translating them into EPBCS functional designs, and collaborating with finance, IT, and business teams to align EPBCS with enterprise financial processes are also part of your responsibilities. You will manage project deliverables, timelines, and risk mitigation strategies to ensure successful project outcomes.

To qualify for this position, you should have a bachelor's degree in finance, accounting, business, computer science, information systems, or a related field, and 3 to 6 years of hands-on experience implementing Oracle EPBCS/PBCS solutions. Strong technical skills in configuring EPBCS modules and designing business rules, Groovy scripts, and calculation scripts are required, as is proficiency in Smart View, Forms, Task Lists, Data Management (DM/FDMEE), and Essbase cube optimization. Furthermore, you should possess deep functional and domain knowledge in financial planning, budgeting, forecasting, and variance analysis, along with experience in full-cycle EPBCS implementations and project management methodologies. Strong analytical, communication, stakeholder management, and problem-solving skills are essential for this role.

Inoapps focuses on delivering innovative Oracle On-Premises and Oracle Cloud applications and technology solutions to clients. By choosing Inoapps, you will receive support throughout your Oracle journey, working in partnership to deliver superior solutions with lasting value.

Posted 5 days ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

As an o9 Solutions Consultant, you will play a crucial role in transforming enterprise planning processes through the AI-first approach of the o9 platform. Your primary responsibility will be to ensure the seamless integration of planning capabilities for global enterprises, thereby unlocking significant value and enhancing operational efficiency. By resolving customer issues promptly, validating data accuracy, and refining end-to-end workflows, you will contribute to optimizing supply chains and driving better outcomes for businesses and the planet. Your expertise in industry best practices, technical architecture, and o9 solutions will be put to use as you configure the platform based on change requests and deliver solutions to complex operational and supply chain challenges. You will also collaborate with other consultants to address design issues, conduct workflow and data analytics tests, and provide training to end-users globally. Additionally, your role will involve actively participating in the enhancement of internal processes and product features based on customer feedback. To excel in this position, you must possess a minimum of 2 years of experience in implementing planning applications and hold a degree in Btech/BE/MCA/Mtech. Proficiency in languages such as SQL, MDX, T-SQL, as well as statistical, optimization, and simulation skills, will be advantageous. A deep understanding of supply chain planning concepts, strong leadership abilities, effective communication skills, and the capacity to analyze and prioritize data are essential characteristics for success in this role. At o9, we value teamwork, transparency, and continuous communication. You can expect a competitive salary, supportive work environment, opportunities for growth, and a diverse international culture. 
Join us in our mission to digitally transform planning and decision-making for enterprises worldwide, and be a part of a high-energy, values-driven organization committed to being the most valuable partner to our clients. o9 Solutions is a rapidly growing enterprise SaaS company with a mission to drive digital transformation in enterprise decision-making. With the o9 Digital Brain as our premier AI-powered platform, we are at the forefront of enabling major global enterprises to achieve groundbreaking transformations. As a part of our team, you will have the opportunity to work with industry leaders and contribute to shaping the future of enterprise planning. If you are looking to be part of a dynamic and innovative organization that values diversity, inclusion, and continuous improvement, consider joining o9 Solutions on our journey towards AI-powered management and 10x improvements in enterprise decision-making. Experience the excitement of driving profitable growth, reducing inefficiencies, and creating lasting value with us.

Posted 5 days ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

We are seeking a skilled and detail-oriented SSAS and Azure Analyst with 8 to 10 years of relevant experience to design, develop, and maintain data models using Azure Analysis Services and SQL Server Analysis Services (SSAS), both OLAP and Tabular. Collaborating closely with data teams and business stakeholders, you will be responsible for delivering scalable, accurate, and insightful analytics solutions.

Key Responsibilities:
- Design and develop Tabular models in Azure Analysis Services and/or SQL Server Analysis Services (SSAS) based on business requirements.
- Translate reporting needs into functional and scalable data models by collaborating with business analysts, data engineers, and reporting teams.
- Design, implement, and optimize ETL processes feeding into SSAS/AAS models to ensure high availability and data freshness.
- Refine model architecture, indexing, partitions, and DAX measures to optimize cube performance.
- Conduct model validation, unit testing, and data integrity checks to ensure the accuracy and reliability of analytical outputs.
- Enhance existing SSAS models to meet evolving business needs, and provide support for Power BI, Excel, and other BI tools.
- Document technical specifications, DAX measures, KPIs, metadata, and data flow processes, and monitor and troubleshoot data refresh failures and performance bottlenecks.

Must-Have Skills:
- Hands-on experience with SSAS Tabular and/or Multidimensional (OLAP) models.
- Proficiency in DAX, MDX, data modeling principles, and BI best practices.
- Experience with Azure Analysis Services (AAS), SQL Server Analysis Services (SSAS), and ETL workflows.
- Strong SQL skills for query optimization, and familiarity with Power BI and Excel.

Nice-to-Have Skills:
- Familiarity with Azure Data Factory, Azure Synapse, or other cloud data tools.
- Basic understanding of CI/CD processes for deploying data models, and exposure to Agile delivery methodologies and version control.

Soft Skills:
- Excellent communication and collaboration skills for working with cross-functional teams.
- Strong problem-solving and analytical thinking; detail-oriented with a commitment to data accuracy and model performance.

You will be joining a team that values your expertise in Tabular models in Azure Analysis Services, OLAP and/or Tabular models in SQL Server Analysis Services, and ETL. The hiring process will involve Screening (HR Round), Technical Round 1, Technical Round 2, and a Final HR Round.

Posted 5 days ago

Apply

4.0 - 9.0 years

15 - 25 Lacs

Kolkata

Work from Office

Inviting applications for the role of Senior Principal Consultant - Power BI Developer!

Responsibilities:
• Work within a team to identify, design and implement a reporting/dashboarding user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices
• Gather query data from tables of the industry cognitive model/data lake and build data models with BI tools
• Apply requisite business logic using data transformation and DAX
• Understanding of Power BI data modelling and various in-built functions
• Knowledge of report sharing through Workspace/App, access management, dataset scheduling and Enterprise Gateway
• Understanding of static and dynamic row-level security
• Ability to create wireframes based on user stories and business requirements
• Basic understanding of ETL and data warehousing concepts
• Conceptualize and develop industry-specific insights in the form of dashboards/reports/analytical web applications to deliver pilots/solutions following best practices

Qualifications we seek in you!
Minimum Qualifications:
• Graduate
• Proficient in Power BI report development and data modeling
• Strong analytical skills and ability to work independently
• Experience in developing and implementing solutions in Power BI
• Expertise in creating data models for report development in Power BI
• Strong SQL skills and ability to interpret data
• Proficient in overall testing of code and functionality
• Optional: knowledge of Snowflake
• Preferred: experience in finance projects / financial system knowledge

Posted 1 week ago

Apply

1.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Description Want to participate in building the next generation of online payment system that supports multiple countries and payment methods? Amazon Payment Services (APS) is a leading payment service provider in MENA region with operations spanning across 8 countries and offers online payment services to thousands of merchants. APS team is building robust payment solution for driving the best payment experience on & off Amazon. Over 100 million customers send tens of billions of dollars moving at light-speed through our systems annually. We build systems that process payments at an unprecedented scale with accuracy, speed and mission-critical availability. We innovate to improve customer experience, with support for currency of choice, in-store payments, pay on delivery, credit and debit card payments, seller disbursements and gift cards. Many new exciting & challenging ideas are in the works. Key job responsibilities Key job responsibilities Data Engineers focus on managing data requests, maintaining operational excellence, and enhancing core infrastructure. You will be collaborating closely with both technical and non-technical teams to design and execute roadmaps Basic Qualifications 1+ years of data engineering experience Experience with data modeling, warehousing and building ETL pipelines Experience with one or more query language (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) Experience with one or more scripting language (e.g., Python, KornShell) Preferred Qualifications Experience with big data technologies such as: Hadoop, Hive, Spark, EMR Experience with any ETL tool like, Informatica, ODI, SSIS, BODI, Datastage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. 
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI MAA 15 SEZ - K20 Job ID: A3042356
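The ETL-pipeline experience called for above follows a standard extract-transform-load shape: pull raw records, clean and normalise them, then load them into a queryable store. A minimal sketch using Python's built-in sqlite3 (table, column, and country values are illustrative, not from any real APS system):

```python
import sqlite3

def run_etl(raw_rows):
    """Tiny ETL sketch: raw (country, amount_in_cents) tuples go in,
    a per-country total in dollars comes out of a warehouse-style query."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE payments (country TEXT, amount_usd REAL)")
    # Transform: trim/upper-case country codes, convert cents to dollars.
    cleaned = [(c.strip().upper(), cents / 100.0) for c, cents in raw_rows]
    # Load the cleaned rows, then aggregate as a reporting query would.
    conn.executemany("INSERT INTO payments VALUES (?, ?)", cleaned)
    return dict(conn.execute(
        "SELECT country, SUM(amount_usd) FROM payments GROUP BY country"))

totals = run_etl([(" ae", 1250), ("SA", 900), ("ae ", 750)])
print(totals)
```

Real pipelines add incremental loads, schema validation, and failure handling, but the extract/transform/load separation is the same.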

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Coimbatore

Work from Office

• Should have at least 3 years of experience with Power BI
• Should have at least 3 years of experience with SSAS and OLAP, with strong knowledge of MDX queries
• Should have strong knowledge of working with SSRS and web reporting
• Should have at least 3 years of experience with SQL Server or Oracle
• Should have strong knowledge of data marts and data warehouses
• Should have experience working with SSIS and the DTS process
• Should have excellent communication and interpersonal skills

Envision Software Engineering offers excellent pay with benefits, excellent growth opportunities, and good working conditions with a challenging job profile.
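For context on the MDX requirement: a basic MDX statement against an SSAS cube has the shape `SELECT {measures} ON COLUMNS, {members} ON ROWS FROM [Cube] WHERE (slicer)`. A small Python helper sketching that query construction (the cube, measure, and dimension names below are hypothetical examples, not tied to any real cube):

```python
def build_mdx(measures, rows_set, cube, slicer=None):
    """Assemble a basic MDX SELECT statement as a string.
    measures: list of measure references, rows_set: a set expression,
    cube: cube name, slicer: optional WHERE-clause tuple expression."""
    query = (
        "SELECT {" + ", ".join(measures) + "} ON COLUMNS, "
        + rows_set + " ON ROWS FROM [" + cube + "]"
    )
    if slicer:
        query += " WHERE (" + slicer + ")"
    return query

# Example: yearly sales sliced to one country (names are illustrative).
q = build_mdx(["[Measures].[Sales Amount]"],
              "[Date].[Calendar Year].MEMBERS",
              "Adventure Works",
              slicer="[Geography].[Country].&[India]")
print(q)
```

In practice such a query would be sent to SSAS via a client library or SSMS; the helper only shows the statement's anatomy.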

Posted 1 week ago

Apply

6.0 - 10.0 years

15 - 18 Lacs

Gurugram

Work from Office

Job Title: BI Architect
Location: Gurgaon, Haryana

Job Summary: We are looking for an experienced Business Intelligence (BI) Architect to lead the design and implementation of scalable BI solutions across the enterprise. The ideal candidate will have a strong background in data architecture, analytics platforms, and cloud technologies, with proven experience integrating SAP, Salesforce, and Azure-based systems.

Key Responsibilities:
• Design and implement enterprise-grade BI architecture that supports business objectives.
• Define and enforce BI standards, governance, and best practices.
• Lead integration efforts across SAP, Salesforce, and other enterprise systems.
• Develop optimized data models (Qlik/Snowflake schemas) for reporting and analytics.
• Oversee ETL/ELT pipelines and ensure seamless data flow across platforms.
• Architect and manage BI solutions on Azure Cloud, including services like Azure Synapse, Data Factory, and Power BI.
• Ensure high availability, scalability, and performance of cloud-based BI systems.
• Develop dashboards and reports using tools such as Power BI and Qlik.
• Translate complex data into actionable insights for stakeholders.
• Work closely with cross-functional teams including data engineers, analysts, and business stakeholders.
• Mentor junior BI professionals and contribute to team development.
• Ensure data accuracy, consistency, and security across BI platforms.
• Implement data governance policies and compliance standards.

Required Skills & Qualifications:
• Proven experience in designing and deploying enterprise-scale BI solutions.
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• 6–9 years of experience in BI architecture and development.
• Strong proficiency in SQL, DAX, MDX, and data visualization tools.
• Hands-on experience with SAP BW/HANA, Salesforce data models, and Azure cloud services.
• Excellent problem-solving, communication, and stakeholder management skills.
• Familiarity with Agile/Scrum methodologies.

Posted 1 week ago

Apply

2.0 - 6.0 years

2 - 10 Lacs

Pune, Maharashtra, India

On-site

To be successful as a Senior Reference Data Developer, you should have experience with:

Basic/Essential Qualifications:
• Educational qualification: BE/B.Tech/MCA.
• Solid IT experience.
• Skillset: GoldenSource and Java are a must.
• Reference data SME with the essential knowledge to build a reference data platform using the GoldenSource product.
• Development experience in MDX, business rules, publishing, and UI in GS 8.8.
• Database: Oracle/PostgreSQL.

Desirable skillsets/good to have:
• Cloud: experience in AWS/GCP is preferred.
• Domain knowledge: fixed income/equities/listed derivatives.
• Python and shell scripting are good to have.

This role will be based out of Pune.

Purpose of the role

To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities
• Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools. Ensuring that code is scalable, maintainable, and optimized for performance.
• Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
• Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
• Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth.
• Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
• Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Assistant Vice President Expectations

To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes.

If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others.

OR, for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet the required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership of managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy.

Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information; complex information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies