3.0 - 8.0 years
45 - 50 Lacs
Bengaluru
Work from Office
Amazon's Consumer Payments organization is seeking a highly quantitative, experienced Data Engineer to drive growth through analytics, automation of data pipelines, and enhancement of self-serve experiences. You will succeed in this role if you are an organized self-starter who can learn new technologies quickly and excel in a fast-paced environment. In this position, you will be a key contributor and sparring partner, developing analytics and insights that global executive management teams and business leaders will use to define global strategies and deep dive businesses. You will be part of the team focused on acquiring new merchants from around the world for payments around the world. The position is based in India but will interact with global leaders and teams in Europe, Japan, the US, and other regions. You should be highly analytical, resourceful, customer focused, team oriented, and able to work independently under time constraints to meet deadlines. You will be comfortable thinking big and diving deep. A proven track record of taking end-to-end ownership and successfully delivering results in a fast-paced, dynamic business environment is strongly preferred. Responsibilities include, but are not limited to: Design, develop, implement, test, and operate large-scale, high-volume, high-performance data structures for analytics and reporting. Implement data structures using best practices in data modeling, ETL/ELT processes, SQL, AWS Redshift, and OLAP technologies. Model data and metadata for ad hoc and pre-built reporting. Work with product tech teams to build robust and scalable data integration (ETL) pipelines using SQL, Python, and Spark. Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Interface with business customers, gathering requirements and delivering complete reporting solutions.
Collaborate with Analysts, Business Intelligence Engineers, and Product Managers to implement algorithms that exploit rich data sets for statistical analysis and machine learning. Participate in strategic and tactical planning discussions, including annual budget processes. Communicate effectively with product, business, and tech teams and other data teams. Basic qualifications: 3+ years of data engineering experience. Experience with data modeling, warehousing, and building ETL pipelines. Experience with AWS technologies such as Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions. Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases).
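The pipeline duties described above (loading raw data into pre-built reporting structures with SQL and Python) can be sketched at toy scale. This is a minimal, hypothetical illustration using Python's built-in sqlite3 as a stand-in for a warehouse such as Redshift; all table and merchant names are invented:

```python
import sqlite3

# Hypothetical source table of raw payment events (names invented for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_payments (merchant TEXT, amount_cents INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO raw_payments VALUES (?, ?, ?)",
    [("acme", 1200, "settled"), ("acme", 800, "failed"), ("globex", 500, "settled")],
)

# Transform + load: aggregate settled volume per merchant into a reporting table,
# the kind of pre-built structure the posting describes for analytics and reporting.
conn.execute("CREATE TABLE merchant_volume (merchant TEXT PRIMARY KEY, settled_cents INTEGER)")
conn.execute(
    """
    INSERT INTO merchant_volume
    SELECT merchant, SUM(amount_cents)
    FROM raw_payments
    WHERE status = 'settled'
    GROUP BY merchant
    """
)

rows = dict(conn.execute("SELECT merchant, settled_cents FROM merchant_volume"))
print(rows)  # {'acme': 1200, 'globex': 500}
```

At production scale the same extract-aggregate-load shape would run as a Spark job or a scheduled SQL transformation rather than in-process SQLite.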
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
By clicking the “Apply” button, I understand that my employment application process with Takeda will commence and that the information I provide in my application will be processed in line with Takeda’s Privacy Notice and Terms of Use. I further attest that all information I submit in my employment application is true to the best of my knowledge. Job Description: The Future Begins Here: At Takeda, we are leading digital evolution and global transformation. By building innovative solutions and future-ready capabilities, we are meeting the needs of patients, our people, and the planet. Bengaluru, India’s epicenter of innovation, has been selected as home to Takeda’s recently launched Innovation Capability Center. We invite you to join our digital transformation journey. In this role, you will have the opportunity to boost your skills and become the heart of an innovative engine that is contributing to global impact and improvement. At Takeda’s ICC we Unite in Diversity: Takeda is committed to creating an inclusive and collaborative workplace, where individuals are recognized for the backgrounds and abilities they bring to our company. We are continuously improving our collaborators’ journey at Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team. The Opportunity: As a Data Scientist, you will have the opportunity to apply your analytical skills and expertise to extract meaningful insights from vast amounts of data. We are currently seeking a talented and experienced individual to join our team and contribute to our data-driven decision-making process. Objectives: Collaborate with business users, mainly in Supply Chain/Manufacturing, to understand the current state and identify opportunities to transform the business into a data-driven organization.
Translate processes and requirements into analytics solutions and metrics with an effective data strategy, data quality, and data accessibility for decision making. Operationalize decision support solutions and drive user adoption, gathering feedback and Voice of Customer metrics in order to improve analytics services. Understand the analytics drivers and the data to be modeled, apply the appropriate quantitative techniques to provide the business with actionable insights, and ensure analytics models and data are accessible to end users to evaluate “what-if” scenarios and support decision making. Evaluate the data, analytical models, and experiments periodically to validate hypotheses, ensuring they continue to provide business value as requirements and objectives evolve. Accountabilities: Collaborates with business partners in identifying analytical opportunities and developing BI-related goals and projects that will create strategically relevant insights. Works with internal and external partners to develop an analytics vision and programs to advance BI solutions and practices. Understands data and sources of data. Strategizes with the IT development team and develops a process to collect, ingest, and deliver data along with proper data models for analytical needs. Interacts with business users to define pain points, problem statements, scope, and the analytics business case. Develops solutions with recommended data models and business intelligence technologies including data warehouses, data marts, OLAP modeling, dashboards/reporting, and data queries. Works with DevOps and database teams to ensure proper design of system databases and appropriate integration with other enterprise applications. Collaborates with the Enterprise Data and Analytics Team to design data model and visualization solutions that synthesize complex data for data mining and discovery. Assists in defining requirements and facilitates workshops and prototyping sessions.
Develops and applies technologies such as machine-learning and deep-learning algorithms to enable advanced analytics product functionality. EDUCATION, BEHAVIOURAL COMPETENCIES AND SKILLS: Bachelor’s degree from an accredited institution in Data Science, Statistics, Computer Science, or a related field. 3+ years of experience with statistical modeling such as clustering, segmentation, multivariate analysis, regression, etc., and analytics tools such as R, Python, Databricks, etc., required. Experience in developing and applying predictive and prescriptive modeling, deep learning, or other machine learning techniques a plus. Hands-on development of AI solutions that comply with industry standards and government regulations. Strong numerical and analytical skills, as well as working knowledge of Python analytics packages (Pandas, scikit-learn, statsmodels). Ability to build and maintain scalable and reliable data pipelines that collect, transform, manipulate, and load data from internal and external sources. Ability to use statistical tools to conduct data analysis and identify data quality issues throughout the data pipeline. Experience with BI and visualization tools (e.g., Qlik, Power BI), ETL, NoSQL, and proven design skills a plus. Excellent written and verbal communication skills, including the ability to interact effectively with multifunctional teams. Experience working with agile teams. WHAT TAKEDA CAN OFFER YOU: Takeda is certified as a Top Employer, not only in India but also globally. No investment we make pays greater dividends than taking good care of our people. At Takeda, you take the lead on building and shaping your own career. Joining the ICC in Bengaluru will give you access to high-end technology, continuous training, and a diverse and inclusive network of colleagues who will support your career growth. BENEFITS: It is our priority to provide competitive compensation and a benefits package that bridges your personal life with your professional career.
Amongst our benefits are: Competitive salary + performance annual bonus Flexible work environment, including hybrid working Comprehensive healthcare insurance plans for self, spouse, and children Group Term Life Insurance and Group Accident Insurance programs Health & wellness programs including annual health screening and weekly health sessions for employees Employee Assistance Program 3 days of leave every year for voluntary service, in addition to humanitarian leave Broad variety of learning platforms Diversity, Equity, and Inclusion programs Reimbursements – home internet & mobile phone Employee Referral Program Leaves – Paternity Leave (4 weeks), Maternity Leave (up to 26 weeks), Bereavement Leave (5 calendar days) ABOUT ICC IN TAKEDA: Takeda is leading a digital revolution. We’re not just transforming our company; we’re improving the lives of millions of patients who rely on our medicines every day. As an organization, we are committed to our cloud-driven business transformation and believe the ICCs are the catalysts of change for our global organization. Locations: IND - Bengaluru Worker Type: Employee Worker Sub-Type: Regular Time Type: Full time
Posted 1 month ago
3.0 - 4.0 years
10 - 12 Lacs
Bengaluru
Work from Office
The opportunity: We are seeking a highly skilled and experienced Analytics Specialist to design, develop, and deliver robust data-driven solutions using Power BI, Power Apps, and related Microsoft technologies. The ideal candidate will have strong analytical skills, hands-on experience in AI projects, and a deep understanding of business intelligence tools and data modeling. How you'll make an impact: Design and develop Power BI reports, dashboards, and data models to meet business requirements. Manage Power BI/Power Apps/AI projects independently and work with global stakeholders. Administer the Power BI service and integrate reports with other business applications. Create and manage OLAP cubes and tabular models compatible with data warehouse standards. Perform advanced DAX calculations and build efficient data models. Ensure security compliance through implementation of row-level security and access controls. Collaborate with cross-functional teams to understand reporting needs and deliver actionable insights. Maintain documentation and provide knowledge transfer to stakeholders. Contribute to AI-based analytics projects and drive automation using APIs and embedded analytics. Manage and deliver QO monthly performance reports with high accuracy and timeliness. Continuously validate, automate, and improve reporting quality to ensure data integrity and actionable insights. Manage multiple stakeholders across functions and business lines, requiring strong influencing skills. Lead projects independently with limited supervision; strong ownership and accountability needed. Integrate data from multiple systems and maintain reporting consistency. Communicate insights effectively to senior leaders and diverse teams; ability to simplify complex data. Drive and manage analytics/reporting projects end-to-end, including scope, timelines, delivery, and stakeholder engagement.
Capture business requirements and transform them into efficient Power BI dashboards, KPI scorecards, and reports. Build and maintain Analysis Services reporting models and develop scalable data models aligned with BI best practices. Interact with BU teams to identify improvement opportunities and implement enhancement strategies. Seek user feedback for enhancements and stay updated on trends in performance and analytics. Responsible for ensuring compliance with applicable external and internal regulations, procedures, and guidelines. Living Hitachi Energy's core values of safety and integrity, which means taking responsibility for your own actions while caring for your colleagues and the business. Your background: Graduate/Postgraduate in Engineering, Finance, Business Management, Data Science, Statistics, Mathematics, or a similar quantitative field. Minimum 7 years of experience. Power BI (development, DAX, publishing, and scheduling). Hands-on experience in Power Apps, SQL Data Warehouse, SSAS, OLAP cubes, Microsoft Azure, Visual Studio. Exposure to AI and automation projects. Microsoft DA-100 certification preferred. Proficiency in both spoken and written English is required. Qualified individuals with a disability may request a reasonable accommodation if you are unable or limited in your ability to use or access the Hitachi Energy career site as a result of your disability. You may request reasonable accommodations by completing a general inquiry form on our website. Please include your contact information and specific details about your required accommodation to support you during the job application process.
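For readers unfamiliar with the DAX measures mentioned above, this is a hypothetical sketch (in Python, for illustration only) of what a time-intelligence measure such as `TOTALYTD(SUM(Fact[Sales]), 'Date'[Date])` computes under a filter context; the fact rows and region names are invented:

```python
from datetime import date

# Invented fact rows: (date, region, sales). In Power BI these would live in a
# fact table, and the region filter would come from the report's filter context.
facts = [
    (date(2024, 1, 15), "EMEA", 100),
    (date(2024, 2, 10), "EMEA", 150),
    (date(2024, 2, 20), "APAC", 200),
    (date(2024, 3, 5),  "EMEA", 50),
]

def ytd_sales(as_of, region):
    """Python equivalent of a year-to-date sales measure under a region filter:
    sum all rows for the region from Jan 1 of as_of's year through as_of."""
    return sum(s for d, r, s in facts
               if r == region and d.year == as_of.year and d <= as_of)

print(ytd_sales(date(2024, 2, 28), "EMEA"))  # 250
```

Row-level security, also listed above, is conceptually the same idea applied automatically: the engine injects a per-user filter (e.g. on region) into every such computation.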
Posted 1 month ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position – Assistant Manager, FP&A CoE Location – Bangalore Role Profile This role is responsible for financial and commercial support under Infrastructure and Cloud, assisting senior management in achieving budgeted and strategic financial goals, and providing analytical and financial modelling support across a variety of key financial indicators and associated analysis and reporting. Job description: Enabling senior management to achieve targeted financial results by providing adequate business and commercial support Working with local & global counterparts to build and provide tools/reports & analysis that effectively address the information requirements of the business finance organization as well as other business counterparts Reviewing work adjacencies to identify and highlight scope for using in-house financial information analytics capabilities to provide fast, actionable & accurate analysis Essential Day-to-Day Responsibilities: Run month-end processes including transaction tagging, generating reporting, variance analysis, and commentary to forecast and budget across all FSVs Understanding and explaining key variances to budget and forecast in a succinct manner to finance/non-finance colleagues Understanding how contracts should be recognised, ensuring our actuals and forecast match the recognition, and challenging contract assumptions where appropriate Working closely with controllership to identify any issues with accounting recognition of contracts Ensuring appropriate recognition of 3rd-party contracts takes place in the GL & advising Engineering teams appropriately Working closely with Procurement/Vendor Management on commercial and financial support of contract renewals Building an understanding of Engineering including the key supplier contracts Analysing contract recognition, and working with the business on maintaining a monthly/FY forecast, highlighting risks and opportunities where appropriate Producing and maintaining reporting including
vendor analysis Ensuring actuals align with our internal Profit Centre/Cost Centre structure – working closely with I&C FP&A where appropriate Producing ad-hoc analysis where appropriate Additional Information: Time-zone overlap with global counterparts, as warranted Qualification: Postgraduate - MBA, CA/ICWA Preferably 5+ years post-qualification in FP&A and/or management reporting/decision support, finance partnering, and business support domains Required Skills: Inquisitive mentality – the desire to understand the reasoning behind variances: analysing data, taking a step back to understand why variances exist, and digging deeper to proactively find out more Ability to analyse and provide insight in a way that is easy to communicate and follow Proactivity – finding out information and data to aid business decisions Forward-looking mentality, using data, analysis, and insights to form conclusions Continuous improvement, changing processes to make them more efficient Displays curiosity to understand & partner with the business/technology better Experienced professional with in-depth knowledge of a technical team or specialism Prerequisite - hands-on experience with planning/reporting systems – Oracle, Business Intelligence, SAP Business Objects, Hyperion, and/or other OLAP tools Hands-on experience with ERPs like SAP and Oracle Strong background in data analytics and providing actionable insights Proven track record of working with multiple partners across time zones Excellent knowledge of MS Excel, MS Word, PowerPoint, and MS Access; familiarity with Visual Basic programming for Excel/Access and Power BI would be a plus Strong interpersonal skills – both written and verbal Excellent analytical and problem-solving skills Self-starter with the ability to learn quickly and work without oversight Great teammate, result-oriented and well organized Ability to handle meaningful priorities systematically A curious mentality with an ability to understand business context
and envisage business flows Desired Skills: Assessment of workflow to mitigate risk & ability to identify & articulate vertical & horizontal adjacencies Proven ability to quickly adapt to the following: understanding the business context, including an understanding of the products and services offerings and their user profiles; understanding the environment, challenges to the business, and business strategies; understanding current priorities, areas of focus, and initiatives/internal transformations that impact the business LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. We will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer.
This means that we do not discriminate on the basis of anyone’s race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, and how it’s obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
Posted 1 month ago
4.0 - 9.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Greetings from Tata Consultancy Services! Thank you for expressing your interest in exploring a career possibility with the TCS family. Role: ETL Test Engineer Experience: 4 to 10 years Interview Location: Bangalore Job description: Minimum 4 to 10 years of experience in ETL testing. 1. SQL - Expert-level knowledge of core SQL concepts and querying. 2. Lead and mentor a team of ETL testers, providing technical guidance, training, and support in ETL tools, SQL, and test automation frameworks. 3. Create and review complex test cases, test scripts, and test data for ETL processes. 4. ETL automation - Experience in Datagaps; good to have experience in tools like Informatica, Talend, and Ab Initio. 5. Execute test cases, validate data transformations, and ensure data accuracy and consistency across source and target systems. 6. Experience in query optimization, stored procedures/views, and functions. 7. Strong familiarity with data warehouse projects and data modeling. 8. Understanding of BI concepts - OLAP vs. OLTP - and deploying applications on cloud servers. 9. Preferably a good understanding of design, development, and enhancement of SQL Server DW using tools (SSIS, SSMS, Power BI/Cognos/Informatica, etc.). 10. Develop and maintain ETL test automation frameworks to enhance testing efficiency and coverage. 11. Integrate automated tests into the CI/CD pipeline to ensure continuous validation of ETL processes. 12. Azure DevOps/JIRA - Hands-on experience with test management tools, preferably ADO or JIRA. 13. Agile concepts - Good experience in understanding agile methodology (Scrum, Lean, etc.). 14. Communication - Good communication skills to understand and collaborate with all the stakeholders within the project.
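Point 5 above (validating data accuracy and consistency across source and target systems) often comes down to row-count, column-sum, and set-difference checks. A minimal, hypothetical sketch using sqlite3 with invented tables; a real ETL test suite would run equivalent queries against the actual source and target databases from its automation framework:

```python
import sqlite3

# Hypothetical source and target tables; in a real ETL test these would live in
# different systems loaded by the ETL job under test.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE src (id INTEGER, amount REAL)")
db.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
db.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.0), (2, 20.5), (3, 30.0)])
db.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10.0), (2, 20.5), (3, 30.0)])

def reconcile(table_a, table_b):
    """Row-count, sum, and set-difference reconciliation between two tables."""
    count_a, sum_a = db.execute(f"SELECT COUNT(*), SUM(amount) FROM {table_a}").fetchone()
    count_b, sum_b = db.execute(f"SELECT COUNT(*), SUM(amount) FROM {table_b}").fetchone()
    # Rows present in A but missing (or differing) in B.
    missing = db.execute(
        f"SELECT COUNT(*) FROM (SELECT id, amount FROM {table_a} "
        f"EXCEPT SELECT id, amount FROM {table_b})").fetchone()[0]
    return count_a == count_b and sum_a == sum_b and missing == 0

print(reconcile("src", "tgt"))  # True
```

Tools like Datagaps or Informatica DVO automate exactly this pattern of source-vs-target comparison at scale.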
Posted 1 month ago
5.0 years
5 - 10 Lacs
Noida
On-site
Country/Region: IN Requisition ID: 26208 Work Model: Position Type: Salary Range: Location: INDIA - NOIDA - BIRLASOFT OFFICE Title: Technical Specialist - Data Engineering Description: Area(s) of responsibility: Must have at least 5+ years of working experience with Informatica ETL tools. Responsible for designing, developing, and maintaining complex data integration solutions using Informatica PowerCenter 10.x/9.5: PowerCenter client tools - Mapping Designer, Repository Manager, Workflow Manager/Monitor - and server tools - Informatica Server, Repository Server Manager. Strong experience in Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect as ETL tools on Oracle and SQL Server databases. Hands-on experience in IICS and IDQ. Extensive experience in developing stored procedures, views, and complex SQL queries using SQL Server and Oracle PL/SQL. Gather requirements from the business, create detailed technical design documents, and model data flows for complex ETL processes using Informatica PowerCenter. Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions. Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation. Proficient in the integration of various data sources with multiple relational databases like Oracle 11g/10g/9i, MS SQL Server, XML, and flat files into the staging area, ODS, Data Warehouse, and Data Mart. Strong experience in the analysis, design, development, testing, and implementation of business intelligence solutions using Data Warehouse/Data Mart design, ETL, and OLAP.
Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, together with project scope, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support. Ability and experience in managing/coordinating with onshore-offshore teams. Skills with an M/O flag are part of the specialization: Programming/Software Development - PL3 (Functional); Systems Integration and Build - PL3 (Functional); Help the Tribe - PL2 (Behavioural); Think Holistically - PL2 (Behavioural); Database Design - PL1 (Functional); Win the Customer - PL2 (Behavioural); Data Visualisation - PL2 (Functional); One Birlasoft - PL2 (Behavioural); Data Management - PL2 (Functional); Results Matter - PL2 (Behavioural); Data Governance - PL1 (Functional); Get Future Ready - PL2 (Behavioural); Requirements Definition and Management - PL2 (Functional); Test Execution - PL2 (Functional); Data Engineering - PL3 (Functional); Data Modelling and Design - PL2 (Functional); MS SQL - PL3 (Mandatory); Python - PL3 (Mandatory); Informatica IICS - PL3 (Mandatory); Spark - PL2 (Optional); Oracle SQL - PL2 (Optional); Oracle PL/SQL - PL2 (Optional); Informatica PowerCenter - PL3 (Mandatory); Unix Shell Scripting - PL2 (Optional); Unix - PL2 (Optional)
Posted 1 month ago
3.0 years
15 - 20 Lacs
Noida
On-site
Data Engineer II At Personify Health, we value and celebrate diversity and are committed to creating an inclusive environment for all employees. We believe in creating teams made up of individuals with various backgrounds, experiences, and perspectives. Why? Because diversity inspires innovation and collaboration and challenges us to produce better solutions. But more than this, diversity is our strength, and a catalyst in our ability to #changelivesforgood. Job Summary As a Business Intelligence developer, you'll understand the schema layer to build complex BI reports and dashboards with a keen focus on the healthcare and wellbeing industry. Your SQL skills will play a significant role in data manipulation and delivery, and your experience with MicroStrategy will be vital for creating BI tools and reports. This role will help migrate and build new analytics products based on MicroStrategy to support teams with their internal and external reporting for Health Comp data. Essential Functions/Responsibilities/Duties Work closely with the Senior Business Intelligence engineer and BI architect to understand the schema objects and build BI reports and dashboards Participate in sprint refinement, planning, and kick-off to understand the Agile process and sprint priorities Develop the necessary transformations and aggregate tables required for reporting/dashboard needs Understand the schema layer in MicroStrategy and business requirements Develop complex reports and dashboards in MicroStrategy Investigate and troubleshoot issues with dashboards and reports Proactively research new technologies and propose improvements to processes and the tech stack Create test cases and scenarios to validate the dashboards and maintain data accuracy Education and Experience 3 years of experience in Business Intelligence and Data Warehousing 3+ years of experience in MicroStrategy reports and dashboard development 2 years of experience in SQL Bachelor's or Master's degree in IT or Computer
Science or ECE. Nice to have – any MicroStrategy certifications. Required Knowledge, Skills, and Abilities Strong at writing complex SQL, including aggregate functions, subqueries, and complex date calculations, and able to teach these concepts to others. Detail-oriented and able to examine data and code for quality and accuracy. Self-starter – takes initiative when inefficiencies or opportunities are seen. Good understanding of modern relational and non-relational models and the differences between them Good understanding of data warehouse concepts, snowflake & star schema architecture, and SCD concepts Good understanding of MicroStrategy schema objects Develop public objects such as metrics, filters, prompts, derived objects, custom groups, and consolidations in MicroStrategy Develop complex reports and dashboards using OLAP and MTDI cubes Create complex dashboards with data blending Understand VLDB settings and report optimization Understand security filters and connection mappings in MicroStrategy Work Environment At Personify Health, we value and celebrate diversity and are committed to creating an inclusive environment for all employees. We believe in creating teams made up of individuals with various backgrounds, experiences, and perspectives. Diversity inspires innovation and collaboration and challenges us to produce better solutions. But more than this, diversity is our strength and a catalyst in our ability to change lives for good. Physical Requirements Constantly operates a computer and other office productivity machinery, such as a copy machine, computer printer, calculator, etc.
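The SCD concepts listed above can be illustrated with a small example. This is a hypothetical sketch of a Type 2 slowly changing dimension update in Python: each attribute change closes the current row and appends a new version, preserving history. All field names and values are invented:

```python
from datetime import date

# Invented dimension rows: each version of a member's plan gets its own row,
# with validity dates and a current-row flag (classic SCD Type 2 layout).
dim_member = [
    {"member_id": 7, "plan": "basic", "valid_from": date(2023, 1, 1),
     "valid_to": None, "current": True},
]

def scd2_update(rows, member_id, new_plan, change_date):
    """Close the member's open row and append the new version (SCD Type 2)."""
    for row in rows:
        if row["member_id"] == member_id and row["current"]:
            if row["plan"] == new_plan:
                return  # attribute unchanged; nothing to version
            row["valid_to"], row["current"] = change_date, False
    rows.append({"member_id": member_id, "plan": new_plan,
                 "valid_from": change_date, "valid_to": None, "current": True})

scd2_update(dim_member, 7, "premium", date(2024, 6, 1))
print(len(dim_member), dim_member[1]["plan"])  # 2 premium
```

Type 1, by contrast, would simply overwrite `plan` in place and lose the history, which is why reporting-heavy warehouses usually prefer Type 2 for analysis over time.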
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY Consulting - Cloud Testing: Staff The opportunity As a Cloud Test Engineer, you will be responsible for testing cloud solutions on a cloud platform and ensuring the quality of deliverables. You will work closely with the Test Lead for the projects under test. Testing proficiency in the cloud and knowledge of at least one cloud platform (AWS/Azure/GCP) are required for this position. Experience with CI/CD platforms, cloud foundations, and cloud data platforms is an added advantage. Skills And Attributes For Success Delivery of testing needs for cloud projects. Ability to communicate effectively with team members across geographies Experience in cloud infrastructure testing. Sound cloud concepts and the ability to suggest options Knowledge of any of the cloud platforms (AWS/Azure/GCP). Knowledge of Azure DevOps / Jenkins / pipelines Thorough understanding of requirements and ability to provide feedback on them. Develop test strategies for cloud projects covering aspects such as platform testing, application testing, integration testing, and UAT as needed. Provide inputs for test planning aligned with the test strategy. Perform test case design and identify opportunities for test automation. Develop test cases, both manual and automation scripts, as required. Ensure test readiness (test environment, test data, tool licenses, etc.) Perform test execution and report progress. Report defects and liaise with development & other relevant teams for defect resolution. Prepare test reports and provide inputs to the Test Lead for test sign-off/closure. Provide support in project meetings/calls with the client for status reporting.
Provide inputs on test metrics to the Test Lead. Support analysis of metric trends and implement improvement actions as necessary. Handle changes and conduct regression testing Generate test summary reports Coordinate test team members and the development team Interact with client-side people to solve issues and update status Actively take part in providing Analytics and Advanced Analytics testing trainings in the company To qualify for the role, you must have BE/BTech/MCA/M.Sc Overall 2 to 6 years of experience in testing cloud solutions, with a minimum of 2 years of experience in any of the cloud solutions built on Azure/AWS/GCP Certifications in the cloud area are desirable. Exposure to Spark SQL / HiveQL testing is desirable. Exposure to data migration projects from on-premise to cloud platforms is desirable. Understanding of business intelligence concepts, architecture & building blocks in areas such as ETL processing, data warehouses, dashboards, and analytics. Working experience in scripting languages such as Python, JavaScript, and Java. Testing experience in more than one of these areas - cloud foundation, DevOps, data quality, ETL, OLAP, reports Exposure to SQL Server or Oracle databases and proficiency with SQL scripting. Exposure to backend testing of enterprise applications/systems built on different platforms, including Microsoft .NET and SharePoint technologies Exposure to ETL testing using commercial ETL tools is desirable. Knowledge/experience in SSRS (SQL Server Reporting Services), Spotfire, and SSIS is desirable. Exposure to data transformation projects, database design concepts & white-box testing is desirable.
Ideally, you’ll also have Experience/exposure to test automation; scripting experience in Perl & shell is desirable. Experience with test management and defect management tools, preferably HP ALM or JIRA. Able to contribute as an individual contributor and, when required, lead a small team. Able to create a test strategy & test plan for testing cloud applications/solutions that are moderate to complex/high-risk systems. Design test cases and test data, and perform test execution & reporting. Should be able to perform test management for small projects as and when required. Participate in defect triaging and track defects to resolution/conclusion. Good communication skills (both written & verbal). Good understanding of the SDLC, and the test process in particular. Good analytical & problem-solving or troubleshooting skills. Good understanding of the project life cycle and the test life cycle. Exposure to CMMi and process improvement frameworks is a plus. Should have excellent communication skills & be able to articulate concisely & clearly. Should be ready to take on an individual contributor as well as a team leader role. What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. 
Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
4.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Experience : 4.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Office (Ahmedabad) Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Attri) What do you need for this opportunity? Must have skills required: Python, JS, MLOps, Micro services, SQL, DevOps, Generative AI, AWS, Azure Attri is Looking for: Summary: Our engineering team is growing and we are looking to bring on board a Senior Backend Developer who can help us transition to the next phase of the company. You will be pivotal in refining our system architecture, ensuring the various tech stacks play well with each other, and smoothing the DevOps process. A well-versed understanding of software paradigms is a must, along with the curiosity to carve out designs for varying ML, MLOps, and LLMOps problem statements. You will be determined to lead your team in the right direction through to the very end of implementation for each project. By joining our team, you will get exposure to working across a swath of modern technologies while building an enterprise-grade ML platform in a most promising area. Responsibilities: Be the bridge between engineering and product teams. Understand the long-term product roadmap and architect a system design that will scale with our plans. Take ownership of converting product insights into detailed engineering requirements. Break down work among the team, and orchestrate the development of components for each sprint. Be well versed in solution design and documentation (HLD/LLD). Develop "Zero Defect Software" with extreme efficiency by utilizing modern cutting-edge tools (ChatGPT, Copilot, etc.). Adopt, and impart, the mindset to build units of software that are secured, instrumented, and resilient. Author high-quality, high-performance, and unit-tested code running in a distributed environment using containers. 
Continually evaluate and improve DevOps processes for a cloud-native codebase. Strong design skills in defining API data contracts / OOAD / microservices / data models and concurrency concepts. An ardent leader with an obsession for quality, refinement, innovation, and empowering leadership. Qualifications: Work Experience 5-7 years of experience, with hands-on experience in the development of full-fledged systems/microservices using Python or JS. 3+ years of experience with senior engineering responsibilities. 3+ years of people mentorship/leadership experience — managing engineers, preferably with good exposure to leading multiple development teams. 3+ years of experience in object-oriented design and agile development methodologies. Basic experience in developing/deploying cloud-native software using GCP / AWS / Azure. Proven track record of building large-scale, product-grade (high-throughput, low-latency, and scalable) systems. A well-versed understanding of, and design skills for, SQL/NoSQL/OLAP databases. Up to date with modern cutting-edge technologies to boost the efficiency and delivery of the team. (Bonus: an understanding of Generative AI frameworks/libraries such as RAG, LangChain, LlamaIndex, etc.) Skills Strong documentation skills. As a team, we heavily rely on elaborate documentation for everything we are working on. Ability to take authoritative decisions and hold accountability. Ability to motivate, lead, and empower others. Strong independent contributor as well as a team player. Working knowledge of ML and familiarity with the concepts of MLOps. You will excel in this role if You have a product mindset. You understand, care about, and can relate to our customers. You take ownership, collaborate, and follow through to the very end. You love solving difficult problems, stand your ground, and get what you want from engineers. You resonate with our core values of innovation, curiosity, accountability, trust, fun, and social good. How to apply for this opportunity? 
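The "API data contracts" design skill called out above can be illustrated with a minimal sketch. Everything here is hypothetical — the `InferenceRequest` type, its fields, and the validation rules are illustrative assumptions, not part of any actual Attri API:

```python
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class InferenceRequest:
    """Hypothetical, versioned contract for a model-serving endpoint.

    Freezing the dataclass keeps the contract immutable once constructed,
    and validation in __post_init__ rejects malformed payloads early.
    """
    model_id: str
    inputs: list
    version: str = "v1"

    def __post_init__(self):
        if not self.model_id:
            raise ValueError("model_id is required")
        if not self.inputs:
            raise ValueError("inputs must be non-empty")

# Construct a valid request and serialize it to a plain dict for the wire.
req = InferenceRequest(model_id="churn-classifier", inputs=[[0.1, 0.4]])
payload = asdict(req)
```

A consumer and producer that both depend on this one type cannot silently drift apart, which is the core point of a data contract.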
Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Experience: 5+ Years Role Overview: Responsible for designing, building, and maintaining scalable data pipelines and architectures. This role requires expertise in SQL, ETL frameworks, big data technologies, cloud services, and programming languages to ensure efficient data processing, storage, and integration across systems. Requirements: • Minimum 5+ years of experience as a Data Engineer or in a similar data-related role. • Strong proficiency in SQL for querying databases and performing data transformations. • Experience with data pipeline frameworks (e.g., Apache Airflow, Luigi, or custom-built solutions). • Proficiency in at least one programming language such as Python, Java, or Scala for data processing tasks. • Experience with cloud-based data services and data lakes (e.g., Snowflake, Databricks, AWS S3, GCP BigQuery, or Azure Data Lake). • Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka). • Experience with ETL tools (e.g., Talend, Apache NiFi, SSIS, etc.) and data integration techniques. • Knowledge of data warehousing concepts and database design principles. • Good understanding of NoSQL and big data technologies like MongoDB, Cassandra, Spark, Hadoop, and Hive. • Experience with data modeling and schema design for OLAP and OLTP systems. • Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes). Educational Qualification: Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.
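The extract-transform-load flow at the heart of this role can be sketched in plain Python. The record fields (`id`, `amount`, `region`) and the in-memory "warehouse" dict are hypothetical stand-ins for a real source system and target store, not anything specified in the posting:

```python
def extract(rows):
    """Simulate pulling raw records from a source system."""
    return list(rows)

def transform(records):
    """Cleanse and reshape records: drop incomplete rows, normalize values."""
    cleaned = []
    for r in records:
        if r.get("id") is None or r.get("amount") is None:
            continue  # incomplete record: exclude rather than load bad data
        cleaned.append({
            "id": r["id"],
            "amount": round(float(r["amount"]), 2),
            "region": r.get("region", "unknown").lower(),
        })
    return cleaned

def load(records, warehouse):
    """Upsert cleaned records into an in-memory 'warehouse' keyed by id."""
    for r in records:
        warehouse[r["id"]] = r
    return warehouse

raw = [
    {"id": 1, "amount": "19.992", "region": "EMEA"},
    {"id": 2, "amount": None},          # will be dropped by transform
    {"id": 3, "amount": "7.5"},         # no region -> defaults to "unknown"
]
warehouse = load(transform(extract(raw)), {})
```

In a real pipeline each stage would be a task in an orchestrator such as Airflow, with the warehouse replaced by an actual database or object store.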
Posted 1 month ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
The Database Test and Tools Development for Linux/Unix OS platforms team is looking for bright and talented engineers to work on the Linux on zSeries platform. It is an opportunity to demonstrate your skills as a Test Development Engineer. The team has the unique opportunity to make significant contributions to Oracle database technology stack testing across different vendor platforms like zLinux and LoP. Detailed Description and Job Requirements The team works on upcoming releases of the Oracle Database - XML/XDB, Real Application Clusters, Flashback, Oracle Storage Appliance, Automatic Storage Management, Data Access, Data Warehouse, Transaction Management, Optimization, Parallel Query, ETL, OLAP, Replication/Streams, Advanced Queuing/Messaging, Oracle Text, Backup/Recovery, High Availability, and more functional areas. The team has good opportunities to learn, and to identify and work on initiatives to improve productivity, quality, testing infrastructure, and tools for automation. We are looking for engineers who meet the following requirements: B.E / B.Tech in CS or equivalent with a consistently good academic record and 4+ years of experience. Strong in Oracle SQL, PL/SQL and database concepts. Experience with the UNIX operating system. Good grasp of UNIX operating system concepts, commands and services. Knowledge of C/C++ or Java. Experience with shell scripting, Perl and Python, with proficiency in any one or two. Good communication skills. Good debugging skills.
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About tsworks: tsworks is a leading technology innovator, providing transformative products and services designed for the digital-first world. Our mission is to provide domain expertise, innovative solutions and thought leadership to drive exceptional user and customer experiences. Demonstrating this commitment, we have a proven track record of championing digital transformation for industries such as Banking, Travel and Hospitality, and Retail (including e-commerce and omnichannel), as well as Distribution and Supply Chain, delivering impactful solutions that drive efficiency and growth. We take pride in fostering a workplace where your skills, ideas, and attitude shape meaningful customer engagements. About This Role: tsworks Technologies India Private Limited is seeking driven and motivated Senior Data Engineers to join its Digital Services Team. You will get hands-on experience with projects employing industry-leading technologies. This would initially be focused on the operational readiness and maintenance of existing applications and would transition into a build-and-maintain role in the long run. Requirements Position: Data Engineer II Experience: 3 to 10+ Years Location: Bangalore, India Mandatory Required Qualification Strong proficiency in Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Storage, etc. Expertise in DevOps and CI/CD implementation Good knowledge of SQL Excellent communication skills In This Role, You Will Design, implement, and manage scalable and efficient data architecture on the Azure cloud platform. Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes. Perform complex data transformations and processing using Azure Data Factory, Azure Databricks, Snowflake's data processing capabilities, or other relevant tools. 
Develop and maintain data models within Snowflake and related tools to support reporting, analytics, and business intelligence needs. Collaborate with cross-functional teams to understand data requirements and design appropriate data integration solutions. Integrate data from various sources, both internal and external, ensuring data quality and consistency. Ensure data models are designed for scalability, reusability, and flexibility. Implement data quality checks, validations, and monitoring processes to ensure data accuracy and integrity across Azure and Snowflake environments. Adhere to data governance standards and best practices to maintain data security and compliance. Handling performance optimization in ADF and Snowflake platforms Collaborate with data scientists, analysts, and business stakeholders to understand data needs and deliver actionable insights Provide guidance and mentorship to junior team members to enhance their technical skills. Maintain comprehensive documentation for data pipelines, processes, and architecture within both Azure and Snowflake environments including best practices, standards, and procedures. Skills & Knowledge Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 3 + Years of experience in Information Technology, designing, developing and executing solutions. 3+ Years of hands-on experience in designing and executing data solutions on Azure cloud platforms as a Data Engineer. Strong proficiency in Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Storage, etc. Familiarity with Snowflake data platform would be an added advantage. Hands-on experience in data modelling, batch and real-time pipelines, using Python, Java or JavaScript and experience working with Restful APIs are required. Expertise in DevOps and CI/CD implementation. Hands-on experience with SQL and NoSQL databases. 
Hands-on experience in the data modelling, implementation, and management of OLTP and OLAP systems. Experience with data modelling concepts and practices. Familiarity with data quality, governance, and security best practices. Knowledge of big data technologies such as Hadoop, Spark, or Kafka. Familiarity with machine learning concepts and the integration of ML pipelines into data workflows. Hands-on experience working in an Agile setting. Is self-driven, naturally curious, and able to adapt to a fast-paced work environment. Can articulate, create, and maintain technical and non-technical documentation. Public cloud certifications are desired.
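The data quality checks and validations this role mentions can be sketched as a small rule-based validator. The column names and rules here (`order_id` uniqueness, non-negative `amount`) are illustrative assumptions only, not a tsworks specification:

```python
def run_quality_checks(rows, required=("order_id", "amount"), unique_key="order_id"):
    """Return a dict mapping check name -> list of offending row indices."""
    failures = {"missing_required": [], "duplicate_key": [], "negative_amount": []}
    seen = set()
    for i, row in enumerate(rows):
        # Completeness: every required column must be present and non-null.
        if any(row.get(col) is None for col in required):
            failures["missing_required"].append(i)
            continue
        # Uniqueness: the business key must not repeat across the batch.
        if row[unique_key] in seen:
            failures["duplicate_key"].append(i)
        seen.add(row[unique_key])
        # Validity: amounts are expected to be non-negative.
        if row["amount"] < 0:
            failures["negative_amount"].append(i)
    return failures

rows = [
    {"order_id": "A1", "amount": 10.0},
    {"order_id": "A1", "amount": 12.0},   # duplicate business key
    {"order_id": "A2", "amount": -3.0},   # invalid negative amount
    {"order_id": None, "amount": 5.0},    # missing required column
]
report = run_quality_checks(rows)
```

In production, such checks usually run as a pipeline step (e.g., in Azure Data Factory or Databricks) and feed a monitoring dashboard rather than an in-memory report.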
Posted 1 month ago
7.0 - 11.0 years
17 - 32 Lacs
Chennai
Work from Office
Location : Chennai (Mandatory) @ Customer location - 5 days from office Exp. : 8-10 Years JD: Data Modeller Should have real-time data modelling experience across 2-3 projects. Hands-on experience in data modelling for both OLTP and OLAP systems. In-depth knowledge of conceptual, logical, and physical data modelling. Strong understanding of indexing, partitioning, and data sharding, with practical experience. Experience in identifying and addressing factors affecting database performance for near-real-time reporting and application interaction. Proficiency with at least one data modelling tool (preferably DBSchema). Functional knowledge of the mutual fund industry is a plus. Familiarity with GCP databases like AlloyDB, CloudSQL, and BigQuery. Willingness to work from Chennai (office presence is mandatory). Exp: - Current CTC: - Expected CTC: - NP: -
Posted 1 month ago
7.0 years
20 - 25 Lacs
Ahmedabad, Gujarat, India
On-site
Experience : 7.00 + years Salary : INR 2000000-2500000 / year (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Hybrid (Ahmedabad) Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Inferenz) What do you need for this opportunity? Must have skills required: Snowflake, ETL, Cloud (AWS and GCP) Inferenz is Looking for: Position: Lead Data Engineer (Snowflake) Location: Pune, Ahmedabad Required Experience: 7 to 10 years Preferred: Immediate Joiners Job Overview: We are looking for a highly skilled Lead Data Engineer (Snowflake) to join our team. The ideal candidate will have extensive experience with Snowflake and cloud platforms, along with a strong understanding of ETL processes, data warehousing concepts, and programming languages. If you have a passion for working with large datasets, designing scalable database schemas, and solving complex data problems, we would love to hear from you. Key Responsibilities: Design, develop, and optimize data pipelines using Snowflake and ELT/ETL tools. Architect, implement, and maintain data warehouse solutions while ensuring high performance and scalability. Design and develop efficient database schemas and data models to support business needs. Write and optimize complex SQL queries for data processing and reporting. Work with Python, C#, or Java to develop data transformation scripts and automate processes. Ensure data integrity, security, and governance throughout the data lifecycle. Analyze, troubleshoot, and resolve data-related issues at tactical, functional, and strategic levels. Collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions. Qualifications: Strong experience with Snowflake. Deep understanding of transactional databases, OLAP, and data warehousing concepts. Experience in designing database schemas and data models. Proficiency in one programming language (Python, C#, or Java). 
Strong problem-solving and analytical skills. Preferred Skills: SnowPro Core or SnowPro Advanced certification. Experience with cost/performance optimization. Client-facing experience with the ability to understand business needs. Ability to work collaboratively in a team environment. How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview!
Posted 1 month ago
10.0 - 15.0 years
13 - 18 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
As a Sr. Staff Engineer on the Data Engineering Team you'll be working on some of the hardest problems in the field of Data, Cloud and Security, with a mission to achieve the highest standards of customer success. You will be building blocks of technology that will define Netskope's future. You will leverage open source technologies around OLAP, OLTP, Streaming, Big Data and ML models. You will help design and build an end-to-end system to manage the data and infrastructure used to improve security insights for our global customer base. You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics. Your contributions will have a major impact on our global customer base and across the industry through our market-leading products. You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Conceiving and building services used by Netskope products to validate, transform, load and perform analytics on large amounts of data using distributed systems with cloud scale and reliability. Helping other teams architect their applications using services from the Data team while following best practices and sound designs. Evaluating many open source technologies to find the best fit for our needs, and contributing to some of them. Working with the Application Development and Product Management teams to scale their underlying services. Providing easy-to-use analytics of usage patterns, anticipating capacity issues and helping with long-term planning. Learning about and designing large-scale, reliable enterprise services. Working with great people in a fun, collaborative environment. 
Creating scalable data mining and data analytics frameworks using cutting-edge tools and techniques Required skills and experience 10+ years of industry experience building highly scalable distributed data systems Programming experience in Python, Java or Golang Excellent data structure and algorithm skills Proven use of good development practices, such as automated testing and measuring code coverage. Proven experience developing complex data platforms and solutions using technologies like Kafka, Kubernetes, MySQL, Hadoop, BigQuery and other open source databases Experience designing and implementing large, fault-tolerant and distributed systems around columnar data stores. Excellent written and verbal communication skills Bonus points for contributions to the open source community Education BSCS or equivalent required, MSCS or equivalent strongly preferred
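One building block behind the streaming analytics this posting describes is windowed aggregation. A minimal sketch of tumbling-window event counts in plain Python follows; the event tuples and the 60-second window size are illustrative assumptions, not Netskope internals:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Aggregate event counts per key into fixed, non-overlapping time windows.

    Each event is a (timestamp_seconds, key) pair; the window an event falls
    into is identified by the window's start timestamp.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

# Two logins land in the [0, 60) window; one login and one upload in [60, 120).
events = [(0, "login"), (30, "login"), (61, "login"), (75, "upload")]
result = tumbling_window_counts(events)
```

Streaming engines such as Flink or Spark Structured Streaming provide the same semantics incrementally and fault-tolerantly, rather than over an in-memory list.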
Posted 1 month ago
8.0 - 10.0 years
25 - 30 Lacs
Mumbai
Work from Office
Job Description: As a Backend (Java) Engineer, you would be part of the team consisting of Scrum Master, Cloud Engineers, AI/ML Engineers, and UI/UX Engineers to build end-to-end Data to Decision Systems. Mandatory: 8+ years of demonstrable experience designing, building, and working as a Java Developer for enterprise web applications Ideally, this would include the following: o Expert-level proficiency with Java o Expert-level proficiency with SpringBoot Familiarity with common databases (RDBMS such as MySQL & NoSQL such as MongoDB) and data warehousing concepts (OLAP, OLTP) Understanding of REST concepts and building/interacting with REST APIs Deep understanding of core backend concepts: o Develop and design RESTful services and APIs o Develop functional databases, applications, and servers to support websites on the back end o Performance optimization and multithreading concepts o Experience with deploying and maintaining high traffic infrastructure (performance testing is a plus) In addition, the ideal candidate would have great problem-solving skills, and familiarity with code versioning tools such as GitHub Good to have: Familiarity with Microsoft Azure Cloud Services (particularly Azure Web App, Storage and VM), or familiarity with AWS (EC2 containers) or GCP Services. Experience with Microservices, Messaging Brokers (e.g., RabbitMQ) Experience with fine-tuning reverse proxy engines such as Nginx, Apache HTTPD
Posted 1 month ago
6.0 - 11.0 years
11 - 15 Lacs
Chennai
Work from Office
Design and develop complex T-SQL queries and stored procedures for data extraction, transformation, and reporting. Build and manage ETL workflows using SSIS to support data integration across multiple sources. Create interactive dashboards and reports using Tableau and SSRS for business performance monitoring. Develop and maintain OLAP cubes using SSAS for multidimensional data analysis. Collaborate with business and data teams to understand reporting requirements and deliver scalable BI solutions. Apply strong data warehouse architecture and modeling concepts to build efficient data storage and retrieval systems. Perform performance tuning and query optimization for large datasets and improve system responsiveness. Ensure data quality, consistency, and integrity through robust validation and testing processes. Maintain documentation for data pipelines, ETL jobs, and reporting structures. Stay updated with emerging Microsoft BI technologies and best practices to continuously improve solutions. Skills Must have At least 6 years of experience with T-SQL and SQL Server (SSIS and SSRS exposure is a must). Proficient with Tableau, preferably with at least 4 years of experience creating dashboards. Experience working with businesses and delivering dashboards to senior management. Experience working within a data warehouse architecture is a must. Exposure to Microsoft BI. Nice to have N/A Other Languages English: C1 Advanced Seniority Senior Chennai, India Req. VR-114509 Other Scripting Languages BCM Industry 23/05/2025 Req. 
VR-114509 Apply for MS SQL Server Developer with Tableau in Chennai *
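The query optimization work this role asks for rests on ideas like index-backed lookups. A minimal sketch using SQLite (Python's stdlib engine, standing in for SQL Server, whose tooling and plan output differ) shows how adding an index changes the query plan from a full scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("south", float(i)) for i in range(1000)],
)

query = "SELECT * FROM sales WHERE region = 'south'"

# Without an index on region, the filter forces a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]

conn.execute("CREATE INDEX idx_sales_region ON sales(region)")

# With the index, the planner switches to an index-backed search.
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]
```

On SQL Server the equivalent investigation would use the graphical or `SET SHOWPLAN` execution plans, but the underlying scan-versus-seek distinction is the same.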
Posted 1 month ago
6.0 - 11.0 years
11 - 15 Lacs
Gurugram
Work from Office
Design and develop complex T-SQL queries and stored procedures for data extraction, transformation, and reporting. Build and manage ETL workflows using SSIS to support data integration across multiple sources. Create interactive dashboards and reports using Tableau and SSRS for business performance monitoring. Develop and maintain OLAP cubes using SSAS for multidimensional data analysis. Collaborate with business and data teams to understand reporting requirements and deliver scalable BI solutions. Apply strong data warehouse architecture and modeling concepts to build efficient data storage and retrieval systems. Perform performance tuning and query optimization for large datasets and improve system responsiveness. Ensure data quality, consistency, and integrity through robust validation and testing processes. Maintain documentation for data pipelines, ETL jobs, and reporting structures. Stay updated with emerging Microsoft BI technologies and best practices to continuously improve solutions. Skills Must have At least 6 years of experience with T-SQL and SQL Server (SSIS and SSRS exposure is a must). Proficient with Tableau, preferably with at least 4 years of experience creating dashboards. Experience working with businesses and delivering dashboards to senior management. Experience working within a data warehouse architecture is a must. Exposure to Microsoft BI. Nice to have N/A Other Languages English: C1 Advanced Seniority Senior Gurugram, India Req. VR-114509 Other Scripting Languages BCM Industry 23/05/2025 Req. 
VR-114509 Apply for MS SQL Server Developer with Tableau in Gurugram *
Posted 1 month ago
6.0 - 11.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Design and develop complex T-SQL queries and stored procedures for data extraction, transformation, and reporting. Build and manage ETL workflows using SSIS to support data integration across multiple sources. Create interactive dashboards and reports using Tableau and SSRS for business performance monitoring. Develop and maintain OLAP cubes using SSAS for multidimensional data analysis. Collaborate with business and data teams to understand reporting requirements and deliver scalable BI solutions. Apply strong data warehouse architecture and modeling concepts to build efficient data storage and retrieval systems. Perform performance tuning and query optimization for large datasets and improve system responsiveness. Ensure data quality, consistency, and integrity through robust validation and testing processes. Maintain documentation for data pipelines, ETL jobs, and reporting structures. Stay updated with emerging Microsoft BI technologies and best practices to continuously improve solutions. Skills Must have At least 6 years of experience with T-SQL and SQL Server (SSIS and SSRS exposure is a must). Proficient with Tableau, preferably with at least 4 years of experience creating dashboards. Experience working with businesses and delivering dashboards to senior management. Experience working within a data warehouse architecture is a must. Exposure to Microsoft BI. Nice to have N/A Other Languages English: C1 Advanced Seniority Senior Bengaluru, India Req. VR-114509 Other Scripting Languages BCM Industry 23/05/2025 Req. 
VR-114509 Apply for MS SQL Server Developer with Tableau in Bengaluru *
Posted 1 month ago
10.0 - 15.0 years
35 - 40 Lacs
Mumbai
Work from Office
It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business. Job Description: As a Backend (Java) Engineer, you would be part of a team consisting of a Scrum Master, Cloud Engineers, AI/ML Engineers, and UI/UX Engineers to build end-to-end Data to Decision Systems. Mandatory: 8+ years of demonstrable experience designing, building, and working as a Java Developer for enterprise web applications Ideally, this would include the following: o Expert-level proficiency with Java o Expert-level proficiency with SpringBoot Familiarity with common databases (RDBMS such as MySQL & NoSQL such as MongoDB) and data warehousing concepts (OLAP, OLTP) Understanding of REST concepts and building/interacting with REST APIs Deep understanding of core backend concepts: o Develop and design RESTful services and APIs o Develop functional databases, applications, and servers to support websites on the back end o Performance optimization and multithreading concepts o Experience with deploying and maintaining high traffic infrastructure (performance testing is a plus) In addition, the ideal candidate would have great problem-solving skills, and familiarity with code versioning tools such as GitHub Good to have: Familiarity with Microsoft Azure Cloud Services (particularly Azure Web App, Storage and VM), or familiarity with AWS (EC2 containers) or GCP Services. Experience with Microservices, Messaging Brokers (e.g., RabbitMQ) Experience with fine-tuning reverse proxy engines such as Nginx, Apache HTTPD If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest!
Posted 1 month ago
8.0 years
0 Lacs
India
On-site
Staff Software Engineer, Data Ingestion The Staff Software Engineer, Data Ingestion will be a critical individual contributor responsible for designing, developing, and maintaining robust and scalable data pipelines. This role is at the heart of our data ecosystem, delivering new analytical software solutions that provide timely, accurate, and complete data for insights, products, and operational efficiency. Key Responsibilities: Design, develop, and maintain high-performance, fault-tolerant data ingestion pipelines using Python. Integrate with diverse data sources (databases, APIs, streaming platforms, cloud storage, etc.). Implement data transformation and cleansing logic during ingestion to ensure data quality. Monitor and troubleshoot data ingestion pipelines, identifying and resolving issues promptly. Collaborate with database engineers to optimize data models for fast consumption. Evaluate and propose new technologies or frameworks to improve ingestion efficiency and reliability. Develop and implement self-healing mechanisms for data pipelines to ensure continuity. Define and uphold SLAs and SLOs for data freshness, completeness, and availability. Participate in an on-call rotation as needed for critical data pipeline issues. Key Skills: 8+ years of experience, ideally with an engineering background, working in software product companies. Extensive Python Expertise: Extensive experience in developing robust, production-grade applications with Python. Data Collection & Integration: Proven experience collecting data from various sources (REST APIs, OAuth, GraphQL, Kafka, S3, SFTP, etc.). Distributed Systems & Scalability: Strong understanding of distributed systems concepts, designing for scale, performance optimization, and fault tolerance. Cloud Platforms: Experience with major cloud providers (AWS or GCP) and their data-related services (e.g., S3, EC2, Lambda, SQS, Kafka, Cloud Storage, GKE). 
Database Fundamentals: Solid understanding of relational databases (SQL, schema design, indexing, query optimization). OLAP database experience is a plus (e.g., Hadoop). Monitoring & Alerting: Experience with monitoring tools (e.g., Prometheus, Grafana) and setting up effective alerts. Version Control: Proficiency with Git. Containerization (Plus): Experience with Docker and Kubernetes. Streaming Technologies (Plus): Experience with real-time data processing using Kafka, Flink, or Spark Streaming.
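The fault tolerance this posting describes often comes down to retrying transient source failures and cleansing records before load. As an illustration only (the `fetch_with_retry` and `cleanse` helpers and the record shape below are hypothetical, not taken from the posting), a minimal Python sketch:

```python
import time


def fetch_with_retry(fetch, max_attempts=3, base_delay=0.01):
    """Call fetch() with exponential backoff, re-raising after max_attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))


def cleanse(records):
    """Drop records missing an 'id' and strip whitespace from string fields."""
    clean = []
    for rec in records:
        if rec.get("id") is None:
            continue
        clean.append({k: v.strip() if isinstance(v, str) else v
                      for k, v in rec.items()})
    return clean


if __name__ == "__main__":
    calls = {"n": 0}

    def flaky_source():
        # Simulated source that fails once before succeeding.
        calls["n"] += 1
        if calls["n"] < 2:
            raise ConnectionError("transient failure")
        return [{"id": 1, "name": " alice "}, {"id": None, "name": "bad"}]

    rows = cleanse(fetch_with_retry(flaky_source))
    print(rows)  # [{'id': 1, 'name': 'alice'}]
```

In production code, the bare `except Exception` would typically be narrowed to transient error types, and the backoff parameters would come from configuration.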
Posted 1 month ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY - Strategy and Transactions - SaT – DnA Associate Manager EY’s Data n’ Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors. The opportunity We’re looking for an Associate Manager - Data Engineering. The main objective of the role is to support cloud and on-prem platform analytics and data engineering projects initiated across engagement teams. The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for its clients. This role will work closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects, and is also on-shore facing. This role will be instrumental in designing, developing, and evolving modern data warehousing solutions and data integration build-outs using cutting-edge tools and platforms for both on-prem and cloud architectures. In this role you will produce design specifications and documentation, develop data migration mappings and transformations for a modern data warehouse/data mart setup, and define robust ETL processing to collect and scrub both structured and unstructured data, providing self-serve (OLAP) capabilities to create impactful decision analytics reporting.
Your Key Responsibilities Evaluating and selecting data warehousing tools for business intelligence, data population, data management, metadata management and warehouse administration for both on-prem and cloud-based engagements Strong working knowledge across the technology stack including ETL, ELT, data analysis, metadata, data quality, audit and design Design, develop, and test in an ETL tool environment (GUI/canvas-driven tools to create workflows) Experience in design documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.) Provide technical leadership to a team of data warehouse and business intelligence developers Coordinate with other technology users to design and implement matters of data governance, data harvesting, cloud implementation strategy, privacy, and security Adhere to ETL/Data Warehouse development best practices Responsible for data orchestration, ingestion, ETL and reporting architecture for both on-prem and cloud (MS Azure/AWS/GCP) Assisting the team with performance tuning for ETL and database processes Skills And Attributes For Success Minimum of 7 years of total experience with 3+ years in the Data Warehousing/Business Intelligence field Solid hands-on 3+ years of professional experience with the creation and implementation of data warehouses on client engagements, and helping create enhancements to a data warehouse Strong knowledge of data architecture for staging and reporting schemas, data models and cutover strategies using industry-standard tools and technologies Architecture design and implementation experience with medium to complex on-prem to cloud migrations with any of the major cloud platforms (preferably AWS/Azure/GCP) Minimum 3+ years’ experience in Azure database offerings [Relational, NoSQL, Data Warehouse] 2+ years’ hands-on experience in various Azure services preferred – Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services & Databricks Minimum of 3 years of hands-on database design, modeling and integration experience with relational data sources, such as SQL Server databases, Oracle/MySQL, Azure SQL and Azure Synapse Strong in PySpark and SparkSQL Knowledge and direct experience using business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS Cubes, etc.) Strong creative instincts related to data analysis and visualization. Aggressive curiosity to learn the business methodology, data model and user personas. Strong understanding of BI and DWH best practices, analysis, visualization, and latest trends. Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management Willingness to mentor team members Solid analytical, technical and problem-solving skills Excellent written and verbal communication skills To qualify for the role, you must have a Bachelor’s or equivalent degree in computer science or a related field.
Advanced degree or equivalent business experience preferred Fact-driven and analytically minded with excellent attention to detail Hands-on experience with data engineering tasks such as building analytical data records, and experience manipulating and analyzing large volumes of data Relevant work experience of a minimum of 6 to 8 years in a Big 4 or technology/consulting setup Ideally, you’ll also have Ability to think strategically/end-to-end with a result-oriented mindset Ability to build rapport within the firm and win the trust of clients Willingness to travel extensively and to work on client sites/practice office locations Experience in Snowflake What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide Opportunities to work with EY SaT practices globally with leading businesses across a range of industries What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
9.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY - Strategy and Transactions - SaT – DnA Manager EY’s Data n’ Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors. The opportunity We’re looking for a Manager - Data Engineering. The main objective of the role is to support cloud and on-prem platform analytics and data engineering projects initiated across engagement teams. The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for its clients. This role will work closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects, and is also on-shore facing. This role will be instrumental in designing, developing, and evolving modern data warehousing solutions and data integration build-outs using cutting-edge tools and platforms for both on-prem and cloud architectures. In this role you will produce design specifications and documentation, develop data migration mappings and transformations for a modern data warehouse/data mart setup, and define robust ETL processing to collect and scrub both structured and unstructured data, providing self-serve (OLAP) capabilities to create impactful decision analytics reporting.
Your Key Responsibilities Evaluating and selecting data warehousing tools for business intelligence, data population, data management, metadata management and warehouse administration for both on-prem and cloud-based engagements Strong working knowledge across the technology stack including ETL, ELT, data analysis, metadata, data quality, audit and design Design, develop, and test in an ETL tool environment (GUI/canvas-driven tools to create workflows) Experience in design documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.) Provide technical leadership to a team of data warehouse and business intelligence developers Coordinate with other technology users to design and implement matters of data governance, data harvesting, cloud implementation strategy, privacy, and security Adhere to ETL/Data Warehouse development best practices Responsible for data orchestration, ingestion, ETL and reporting architecture for both on-prem and cloud (MS Azure/AWS/GCP) Assisting the team with performance tuning for ETL and database processes Skills And Attributes For Success 9-11 years of total experience with 5+ years in the Data Warehousing/Business Intelligence field Solid hands-on 5+ years of professional experience with the creation and implementation of data warehouses on client engagements, and helping create enhancements to a data warehouse Strong knowledge of data architecture for staging and reporting schemas, data models and cutover strategies using industry-standard tools and technologies Architecture design and implementation experience with medium to complex on-prem to cloud migrations with any of the major cloud platforms (preferably AWS/Azure/GCP) Minimum 3+ years’ experience in Azure database offerings [Relational, NoSQL, Data Warehouse] 3+ years’ hands-on experience in various Azure services preferred – Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services & Databricks Minimum of 5 years of hands-on database design, modeling and integration experience with relational data sources, such as SQL Server databases, Oracle/MySQL, Azure SQL and Azure Synapse Strong in PySpark and SparkSQL Knowledge and direct experience using business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS Cubes, etc.) Strong creative instincts related to data analysis and visualization. Aggressive curiosity to learn the business methodology, data model and user personas. Strong understanding of BI and DWH best practices, analysis, visualization, and latest trends. Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management Willingness to mentor team members Solid analytical, technical and problem-solving skills Excellent written and verbal communication skills To qualify for the role, you must have a Bachelor’s or equivalent degree in computer science or a related field.
Advanced degree or equivalent business experience preferred Fact-driven and analytically minded with excellent attention to detail Hands-on experience with data engineering tasks such as building analytical data records, and experience manipulating and analyzing large volumes of data Relevant work experience of a minimum of 9 to 11 years in a Big 4 or technology/consulting setup Ideally, you’ll also have Ability to think strategically/end-to-end with a result-oriented mindset Ability to build rapport within the firm and win the trust of clients Willingness to travel extensively and to work on client sites/practice office locations Experience with Snowflake What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide Opportunities to work with EY SaT practices globally with leading businesses across a range of industries What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
8.0 - 13.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Job Overview: We are looking for a BI & Visualization Developer who will be part of our Analytics Practice and will be expected to work actively in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is to support the design, development and maintenance of business intelligence and analytics solutions. Responsibilities: Develop reports, dashboards, and advanced visualizations. Work closely with product managers, business analysts, clients, etc. to understand needs/requirements and develop the visualizations needed. Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality. Learn and develop new visualization techniques as required to keep up with contemporary visualization design and presentation. Review the solution requirements and architecture to ensure selection of appropriate technology, efficient use of resources and integration of multiple systems and technologies. Collaborate in design reviews and code reviews to ensure standards are met. Recommend new standards for visualizations. Build and reuse templates/components/web services across multiple dashboards. Support presentations to customers and partners. Advise on new technology trends and possible adoption to maintain competitive advantage. Mentor Associates. Experience Needed: 8+ years of related experience is required. A Bachelor's or Master's degree in Computer Science or a related technical discipline is required. Highly skilled in data visualization tools like Power BI, Tableau, QlikView, etc. Very good understanding of the Power BI Tabular Model/Azure Analysis Services using large datasets. Strong SQL coding experience with performance optimization experience for data queries.
Understands different data models like normalized, de-normalized, star, and snowflake models. Worked in big data environments, cloud data stores, different RDBMS and OLAP solutions. Experience in design, development, and deployment of BI systems. Candidates with ETL experience preferred. Is familiar with the principles and practices involved in the development and maintenance of software solutions and architectures and in service delivery. Has a strong technical background and remains evergreen with technology and industry developments. Additional: Demonstrated ability to have successfully completed multiple, complex technical projects Prior experience with application delivery using an Onshore/Offshore model Experience with business processes across multiple master data domains in a services-based company Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality. Demonstrates high standards of professional behavior in dealings with clients, colleagues and staff. Strong written communication skills. Is effective and persuasive in both written and oral communication. Experience with gathering end-user requirements and writing technical documentation Time management and multitasking skills to effectively meet deadlines under time-to-market pressure May require occasional travel
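For readers unfamiliar with the star model named above: a star schema keeps measures in a fact table and descriptive attributes in dimension tables, and BI queries join and aggregate across them. A minimal illustration (the table names and data are invented for the example, using an in-memory SQLite database):

```python
import sqlite3

# Minimal star schema: one fact table keyed to one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER REFERENCES dim_product,
                              amount REAL);
""")
con.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "books"), (2, "games")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 10.0), (1, 5.0), (2, 7.5)])

# Typical BI rollup: revenue by category via a fact-to-dimension join.
rows = con.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('books', 15.0), ('games', 7.5)]
```

A snowflake model differs only in that dimensions are themselves normalized into further tables (e.g., `dim_product` referencing a separate `dim_category`), trading wider joins for less redundancy.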
Posted 1 month ago
8.0 - 13.0 years
7 - 12 Lacs
Noida
Work from Office
Job Overview We are looking for a Data Engineer who will be part of our Analytics Practice and will be expected to work actively in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is the acquisition, transformation, loading and processing of data from a multitude of disparate data sources, including structured and unstructured data, for advanced analytics and machine learning in a big data environment. Responsibilities: Engineer a modern data pipeline to collect, organize, and process data from disparate sources. Perform data management tasks, such as conducting data profiling, assessing data quality, and writing SQL queries to extract and integrate data. Develop efficient data collection systems and sound strategies for getting quality data from different sources. Consume and analyze data from the data pool to support inference, prediction and recommendation of actionable insights to support business growth. Design and develop ETL processes using tools and scripting. Troubleshoot and debug ETL processes. Performance tuning and optimization of the ETL processes. Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality. Collaborate in design reviews and code reviews to ensure standards are met. Recommend new standards for visualizations. Learn and develop new ETL techniques as required to keep up with contemporary technologies. Review the solution requirements and architecture to ensure selection of appropriate technology, efficient use of resources and integration of multiple systems and technologies. Support presentations to customers and partners. Advise on new technology trends and possible adoption to maintain competitive advantage. Experience Needed: 8+ years of related experience is required.
A BS or Master's degree in Computer Science or a related technical discipline is required ETL experience with data integration to support data marts, extracts and reporting Experience connecting to varied data sources Excellent SQL coding experience with performance optimization for data queries. Understands different data models like normalized, de-normalized, star, and snowflake models. Worked with transactional, temporal, time series, and structured and unstructured data. Experience with Azure Data Factory and Azure Synapse Analytics Worked in big data environments, cloud data stores, different RDBMS and OLAP solutions. Experience in cloud-based ETL development processes. Experience in deployment and maintenance of ETL jobs. Is familiar with the principles and practices involved in the development and maintenance of software solutions and architectures and in service delivery. Has a strong technical background and remains evergreen with technology and industry developments. At least 3 years of demonstrated success in software engineering, release engineering, and/or configuration management. Highly skilled in scripting languages like PowerShell. Substantial experience in the implementation and execution of CI/CD processes. Additional: Demonstrated ability to have successfully completed multiple, complex technical projects Prior experience with application delivery using an Onshore/Offshore model Experience with business processes across multiple master data domains in a services-based company Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality. Demonstrates high standards of professional behavior in dealings with clients, colleagues and staff. Is able to make sound and far-reaching decisions alone on major issues and to take full responsibility for them on a technical basis. Strong written communication skills. Is effective and persuasive in both written and oral communication.
Experience with gathering end-user requirements and writing technical documentation Time management and multitasking skills to effectively meet deadlines under time-to-market pressure
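The data profiling and quality assessment this posting lists can be as simple as per-column null and distinct counts over a sample of rows. A small illustrative sketch (the `profile` helper and the sample records are hypothetical, not part of the role):

```python
def profile(rows):
    """Per-column profile: null count and distinct non-null value count."""
    columns = {}
    for row in rows:
        for col, val in row.items():
            stats = columns.setdefault(col, {"nulls": 0, "values": set()})
            if val is None:
                stats["nulls"] += 1
            else:
                stats["values"].add(val)
    return {col: {"nulls": s["nulls"], "distinct": len(s["values"])}
            for col, s in columns.items()}


if __name__ == "__main__":
    sample = [
        {"id": 1, "region": "EU"},
        {"id": 2, "region": None},
        {"id": 3, "region": "EU"},
    ]
    print(profile(sample))
    # {'id': {'nulls': 0, 'distinct': 3}, 'region': {'nulls': 1, 'distinct': 1}}
```

At warehouse scale the same counts would normally be pushed down to the database (`COUNT(*) - COUNT(col)` and `COUNT(DISTINCT col)` in SQL) rather than computed row by row in Python.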
Posted 1 month ago