12.0 - 16.0 years
0 Lacs
noida, uttar pradesh
On-site
Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers to levels they cannot achieve anywhere else. This is a world of more possibilities, more innovation, and more openness, enabled by the cloud. The Business & Industry Copilots group is a rapidly growing organization that is responsible for the Microsoft Dynamics 365 suite of products, Power Apps, Power Automate, Dataverse, AI Builder, Microsoft Industry Solutions, and more. Microsoft is considered one of the leaders in Software as a Service in the world of business applications, and this organization is at the heart of how business applications are designed and delivered. This is an exciting time to join our group, Customer Experience (CXP), and work on something highly strategic to Microsoft. The goal of CXP Engineering is to build the next generation of our applications running on Dynamics 365, AI, Copilot, and several other Microsoft cloud services to drive AI transformation across the Marketing, Sales, Services, and Support organizations within Microsoft. We innovate quickly and collaborate closely with our partners and customers in an agile, high-energy environment. Leveraging the scalability and value of Azure & Power Platform, we ensure our solutions are robust and efficient. Our organization's implementation acts as a reference architecture for large companies and helps drive product capabilities. If the opportunity to collaborate with a diverse engineering team on enabling end-to-end business scenarios using cutting-edge technologies, and to solve challenging problems for large-scale 24x7 business SaaS applications, excites you, please come and talk to us! We are hiring a passionate Principal SW Engineering Manager to lead a team of highly motivated and talented software developers building highly scalable data platforms and delivering services and experiences that empower Microsoft's customer, seller, and partner ecosystem to be successful. 
This is a unique opportunity to use your leadership skills and experience in building core technologies that will directly affect the future of Microsoft on the cloud. In this position, you will be part of a fun-loving, diverse team that seeks challenges, loves learning, and values teamwork. You will collaborate with team members and partners to build high-quality, innovative data platforms with full-stack data solutions using the latest technologies in a dynamic and agile environment, and you will have opportunities to anticipate the future technical needs of the team and provide technical leadership that keeps raising the bar ahead of the competition. We use industry-standard technology: C#, JavaScript/TypeScript, HTML5, ETL/ELT, data warehousing, and/or Business Intelligence development. Responsibilities As a leader of the engineering team, you will be responsible for the following: - Build and lead a world-class data engineering team. - Be passionate about technology and obsessed with customer needs. - Champion data-driven decisions for feature identification, prioritization, and delivery. - Manage multiple projects, including timelines, customer interaction, feature tradeoffs, etc. - Deliver on an ambitious product and services roadmap, including building new services on top of the vast amounts of data collected by our batch and near-real-time data engines. - Design and architect internet-scale, reliable services. - Leverage knowledge of machine learning (ML) models to select appropriate solutions for business objectives. - Communicate effectively and build relationships with our partner teams and stakeholders. - Help shape our long-term architecture and technology choices across the full client and services stack. - Understand the talent needs of the team and help recruit new talent. - Mentor and grow other engineers to improve efficiency and productivity. - Experiment with and recommend new technologies that simplify or improve the tech stack. 
- Work to help build an inclusive working environment. Qualifications Basic Qualifications: - Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field. - 12+ years of experience building high-scale enterprise Business Intelligence and data engineering solutions. - 3+ years of management experience leading a high-performance engineering team. - Proficient in designing and developing distributed systems on cloud platforms. - Must be able to plan work and work to a plan, adapting as necessary in a rapidly evolving environment. - Experience using a variety of data stores, including ETL/ELT, data warehouses, RDBMS, in-memory caches, and document databases. - Experience using ML, anomaly detection, predictive analysis, and exploratory data analysis. - A strong understanding of the value of data, data exploration, and the benefits of a data-driven organizational culture. - Strong communication skills and proficiency with executive communications. - Demonstrated ability to effectively lead and operate in a cross-functional, global organization. Preferred Qualifications - Prior experience as an engineering site leader is a strong plus. - Proven success in recruiting and scaling engineering organizations effectively. - Demonstrated ability to provide technical leadership to teams, with experience managing large-scale data engineering projects. - Hands-on experience working with large data sets using tools such as SQL, Databricks, PySparkSQL, Synapse, Azure Data Factory, or similar technologies. - Expertise in one or more of the following areas: AI and Machine Learning. - Experience with Business Intelligence or data visualization tools, particularly Power BI, is highly beneficial.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be joining our dynamic team as an Azure Data Engineer - L3 with 5-7 years of experience, based in either Hyderabad or Bangalore, working shift timings of 2PM-11PM IST. Your responsibilities will include: - Utilizing your expertise in Azure Data Factory, Databricks, Azure Data Lake, and Azure SQL Server. - Developing ETL/ELT processes using SSIS and/or Azure Data Factory. - Building complex pipelines and dataflows with Azure Data Factory. - Designing and implementing data pipelines using Azure Data Factory (ADF). - Enhancing the functionality and performance of existing data pipelines. - Fine-tuning processes dealing with very large data sets. - Configuring and deploying ADF packages. - Working proficiently with ARM templates, Key Vault, and integration runtimes. - Adapting to ETL frameworks and standards. - Demonstrating strong analytical and troubleshooting skills to identify root causes and find solutions. - Proposing innovative and feasible solutions for business requirements. - Knowledge of Azure technologies/services such as Blob Storage, ADLS, Logic Apps, Azure SQL, and WebJobs. - Expertise in ServiceNow, incident management, and JIRA. - Exposure to agile methodology. - Proficiency in understanding and building Power BI reports using the latest methodologies. Your key skills should include: - Azure - Azure Data Factory - Databricks - Migration project experience Qualifications: - Engineering graduate Certifications: - Preferable: Azure certification, Databricks Join us and be a part of our exciting journey as we continue to provide end-to-end solutions in various industry verticals with a global presence and a track record of successful project deliveries for Fortune 500 companies.
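The ETL/ELT work described above typically boils down to cleansing and reshaping records between pipeline stages. Here is a minimal pure-Python sketch of such a transformation step; field names are hypothetical, and in a real pipeline this logic would live inside an ADF dataflow or a Databricks notebook rather than plain Python:

```python
from datetime import date

def transform_orders(raw_rows):
    """Cleanse and reshape raw order records before loading to a target table.

    Mirrors a typical dataflow step: filter out bad rows, normalise types,
    and round measures. Purely illustrative, not a specific ADF activity.
    """
    cleaned = []
    for row in raw_rows:
        # Drop rows missing a primary key, as a dataflow filter would
        if not row.get("order_id"):
            continue
        cleaned.append({
            "order_id": int(row["order_id"]),
            "amount": round(float(row["amount"]), 2),
            "order_date": date.fromisoformat(row["order_date"]),
        })
    return cleaned

rows = [
    {"order_id": "1", "amount": "19.999", "order_date": "2024-01-05"},
    {"order_id": None, "amount": "5.00", "order_date": "2024-01-06"},  # filtered out
]
print(transform_orders(rows))
```

The same filter-normalise-derive shape recurs whether the engine is SSIS, ADF, or Databricks; only the execution environment changes.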
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
nagpur, maharashtra
On-site
We are looking for enthusiastic and honest individuals driven by a passion for creating impeccable experiences for customers. Whatever your main role is, you will have the flexibility and responsibility to explore your ideas. Employ your enthusiasm, drive, and innovation to assist some of the biggest and most successful organizations in changing their businesses. A positive work culture is the most important drawing card for high-demand talent at BizTranSights. Our supportive team welcomes each member with zeal. We have a diverse and inclusive work environment based on mutual trust, respect, and dedication to top performance. We appreciate new visions, acknowledge our employees' abilities and interests, and give them room to grow. Join our team and be a part of a dynamic environment. Why work with us: 1. Learn: - Things You'll Learn: Challenge yourself daily, solve real-time problems, think strategically, and help each other grow. 2. Growth: - Experience Growth: Gain new experiences and skills to positively impact your current position and any role you are likely to pursue. 3. Work-Life Balance: - Strike The Balance: Achieve your goals and priorities while maintaining a harmonious work-life balance. 4. Latest Technology: - Access to the latest technology for increased productivity and efficiency. We are Hiring! We are looking for top-notch individuals who are problem solvers, driven, dedicated, and team players with big goals. If you are seeking a team that will welcome you, collaborate with you, challenge you, and help develop your skills to the next level, we want to hear from you. Position: MSBI Developer Ideal Candidate: - 2+ years of experience in ETL/ELT projects using SSIS. - Strong experience in Microsoft SQL Server, database design, and database modeling. 
- Experience in Power BI development, including designing & developing dashboards, reports, and data visualization tools. - Experience designing integration scripts using C#.NET. - Experience with troubleshooting & debugging. - Good understanding of the Software Development Life Cycle process. - Excellent communication, documentation, and interpersonal skills, with problem-solving abilities. - Proactive, highly motivated attitude, detail-oriented, and a quick learner. - Experience with software testing and JIRA bug tracking. - General understanding of, or development exposure to, Data Science & Cloud Computing (Azure/AWS) is a plus. - Experience with Agile software development methodology is a plus. - Minimum Qualification: ME, MTech, BE, BTech, MCA, MSc (Comp. Science). Exciting work environment in a fast-growing company with ample opportunities to thrive in your career working on the latest technologies alongside a passionate team of senior developers in India and the USA. We are looking to add another member to our team who is dependable, detail-oriented, has good communication skills, and is available to conduct calls with multiple USA-based stakeholders/team members. The location for this position is Nagpur.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
The Data Platform Team is responsible for building critical platforms that abstract lower-level complexities, enabling engineers to focus solely on business logic. Platforms like the Events Choreography Platform and the Configurable Ingestion Platform have a significant impact across multiple teams; therefore, we maintain high standards for stability, availability, and scalability. As a member of the Data Platform Team, your responsibilities will include defining and driving technical strategy in collaboration with managers to prioritize and shape the team's roadmap. You will lead high-impact projects, handling ambiguous problem statements and delivering scalable solutions. Owning system design and architecture will be crucial to ensuring reliability, observability, and adherence to best practices. Additionally, mentoring and leading engineers to foster growth, technical excellence, and high-quality execution will be part of your role. You will bridge cross-functional gaps by coordinating between stakeholders to align on technical decisions. Regularly reviewing designs, documents, and PRs to maintain high code and system quality is also expected. Developing a deep understanding of the platforms to ensure comprehensive ownership, and being willing to get hands-on with coding and deep dives, are essential for success in this role. To be considered for this position, you should have 7-9 years of experience in software engineering, with at least 4 years of experience leading small to medium-sized teams. Expertise in scalable, distributed systems with strong problem-solving skills is required. Hands-on experience with observability, developer experience, and performance optimizations is highly valued. A strong ability to navigate ambiguity, break down complex problems, and drive solutions is crucial. You should measure success by the impact you create rather than the number of lines of code you ship. 
You should be proactive, taking charge as a human orchestrator who connects the dots between stakeholders and team members when required. We are looking for individuals who can function both independently and collaboratively. Experience with data-intensive applications (ETL/ELT) is a plus. If you are passionate about building critical platforms, driving technical strategy, leading high-impact projects, and fostering a culture of technical excellence, we encourage you to apply for this position at CommerceIQ.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As a Snowflake Data Engineer at our organization, you will play a vital role in designing, developing, and maintaining our data infrastructure. Your responsibilities will include ingesting, transforming, and distributing data using Snowflake and AWS technologies. You will collaborate with various stakeholders to ensure efficient data pipelines and secure data operations. Your key responsibilities will involve designing and implementing data pipelines using Snowflake and AWS technologies. You will leverage tools like SnowSQL, Snowpipe, NiFi, Matillion, and DBT to ingest, transform, and automate data integration processes. Implementing role-based access controls and managing AWS resources will be crucial for ensuring data security and supporting Snowflake operations. Additionally, you will be responsible for optimizing Snowflake queries and data models for performance and scalability. To excel in this role, you should have strong proficiency in SQL and Python, along with hands-on experience with Snowflake and AWS services. Understanding ETL/ELT tools, data warehousing concepts, and data quality techniques will be essential. Your analytical skills, problem-solving abilities, and excellent communication skills will enable you to collaborate effectively with data analysts, data scientists, and other team members. Preferred skills include experience with data virtualization, machine learning and AI concepts, data governance, and data security best practices. Staying updated with the latest advancements in Snowflake and AWS technologies will also be important. If you are a passionate and experienced Snowflake Data Engineer with 5 to 7 years of experience, we invite you to apply and be a part of our team. This is a full-time position based in Gurgaon, with a hybrid work mode accommodating India, UK, and US work shifts.
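The Snowpipe ingestion mentioned above is defined as a pipe wrapping a COPY INTO statement, so that new files landing on a stage load automatically. A pure-Python sketch that renders that DDL, with hypothetical object names (not this team's actual setup):

```python
def snowpipe_ddl(pipe, stage, table, file_format="(TYPE = 'JSON')"):
    """Render CREATE PIPE DDL for auto-ingesting staged files into a table.

    Sketch of the Snowpipe pattern: a COPY INTO wrapped in a PIPE with
    AUTO_INGEST = TRUE, so files arriving on the stage load without
    manual intervention. Object names here are illustrative only.
    """
    return (
        f"CREATE OR REPLACE PIPE {pipe} AUTO_INGEST = TRUE AS "
        f"COPY INTO {table} FROM @{stage} FILE_FORMAT = {file_format};"
    )

ddl = snowpipe_ddl("raw.events_pipe", "raw.events_stage", "raw.events")
print(ddl)
```

In practice the pipe definition is executed once via SnowSQL or a deployment script; the cloud provider's event notifications (e.g., S3 events) then trigger each load.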
Posted 1 month ago
15.0 - 20.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Technical Program Manager specialized in Data Engineering & Analytics with 16-25 years of relevant experience, you will be responsible for leading large-scale and intricate initiatives related to Data, Business Intelligence (BI), and Artificial Intelligence/Machine Learning (AI/ML). Your role will involve managing complex programs, collaborating with clients, providing technical leadership, ensuring compliance with service level agreements and key performance indicators, and continuously improving processes for efficient service delivery. You will play a pivotal role in designing, developing, and implementing Data Engineering & Analytics solutions utilizing technologies such as Teradata, Google Cloud Data Platform (GCP), AI/ML, Qlik, and Tableau. Your responsibilities will include preparing operational and strategic reports, monitoring service performance, identifying areas for improvement, and proactively addressing issues and risks with appropriate mitigation plans. Moreover, you will be instrumental in fostering a culture of automation and innovation within the team to enhance service delivery performance. In addition to managing multi-functional teams and client relationships, you will also be involved in creating Statements of Work (SOW), proposals, solutions, and estimations for Data Analytics projects. Your expertise in E2E solution lifecycle management, modern data architecture, Cloud data modeling, SDLC, Agile methodologies, and team leadership will be essential for the success of Data Engineering and Management initiatives. Furthermore, your qualifications should include 15-20 years of experience in Data Warehousing, BI & Analytics, Data Management projects, with a focus on ETL, reporting, big data, and analytics. Experience in architecting, designing, and developing Data Engineering and Business Intelligence solutions, as well as working with data management tools like Data Quality, Metadata, Master Data, and Governance, will be highly valued. 
Additionally, experience in Cloud Data migration programs, value-driven innovation, and stakeholder management will be crucial for your role as a Technical Program Manager in Data Engineering & Analytics.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
We are looking for a skilled and experienced Spark Scala Developer with strong expertise in AWS cloud services and SQL to join our data engineering team. Your primary responsibility will be to design, build, and optimize scalable data processing systems that support our data platform. Your key responsibilities will include: - Developing and maintaining large-scale distributed data processing pipelines using Apache Spark with Scala. - Working with AWS services (S3, EMR, Lambda, Glue, Redshift, etc.) to build and manage data solutions in the cloud. - Writing complex SQL queries for data extraction, transformation, and analysis. - Optimizing Spark jobs for performance and cost-efficiency. - Collaborating with data scientists, analysts, and other developers to understand data requirements. - Building and maintaining data lake and data warehouse solutions. - Implementing best practices in coding, testing, and deployment. - Ensuring data quality and consistency across systems. To be successful in this role, you should have: - Strong hands-on experience with Apache Spark (preferably using Scala). - Proficiency in the Scala programming language. - Solid experience with SQL, including complex joins, window functions, and performance tuning. - Working knowledge of AWS services such as S3, EMR, Glue, Lambda, Athena, and Redshift. - Experience building and maintaining ETL/ELT pipelines. - Familiarity with data modeling and data warehousing concepts. - Experience with version control (e.g., Git) and CI/CD pipelines (a plus). - Strong problem-solving and communication skills.
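The window functions called out in the SQL requirements above have semantics that are easy to show in plain Python. This sketch mimics ROW_NUMBER() OVER (PARTITION BY dept ORDER BY salary DESC); the column names are hypothetical:

```python
from itertools import groupby
from operator import itemgetter

def row_number_per_partition(rows, partition_key, order_key):
    """Pure-Python analogue of SQL's
    ROW_NUMBER() OVER (PARTITION BY <partition_key> ORDER BY <order_key> DESC).
    """
    out = []
    # groupby needs its input sorted by the grouping key first
    rows = sorted(rows, key=lambda r: (r[partition_key], -r[order_key]))
    for _, group in groupby(rows, key=itemgetter(partition_key)):
        for n, row in enumerate(group, start=1):
            out.append({**row, "row_number": n})
    return out

emps = [
    {"dept": "eng", "salary": 100},
    {"dept": "eng", "salary": 120},
    {"dept": "ops", "salary": 90},
]
print(row_number_per_partition(emps, "dept", "salary"))
```

In Spark the same result comes from `row_number().over(Window.partitionBy("dept").orderBy(desc("salary")))`; the numbering restarts at 1 within each partition either way.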
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
Genpact is a global professional services and solutions firm that is committed to delivering outcomes that shape the future. With over 125,000 employees in 30+ countries, we are fueled by our innate curiosity, entrepreneurial agility, and the desire to create lasting value for our clients. Our purpose, the relentless pursuit of a world that works better for people, drives us to serve and transform leading enterprises, including the Fortune Global 500, by leveraging our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. We are currently seeking applications for the position of Principal Consultant - Snowflake Data Modeller. As a Data Engineer, you will be expected to demonstrate strong expertise in data analysis, data integration, data transformation, and the ETL/ELT skills necessary to excel in this role. Additionally, relevant domain experience in Investment Banking and exposure to Cloud, preferably AWS, are desired qualifications. Responsibilities: - Possess hands-on experience in relational, dimensional, and/or analytic work using RDBMS, dimensional data platform technologies, ETL, and data ingestion. - Demonstrate experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts. - Exhibit strong communication and presentation skills. - Assist the team in implementing business and IT data requirements through new data strategies and designs across all data platforms and tools. - Collaborate with business and application/solution teams to implement data strategies and develop conceptual/logical/physical data models. - Define and enforce data modeling and design standards, tools, best practices, and related development for enterprise data models. - Engage in hands-on modeling and mappings between source system data models and data warehouse data models. 
- Proactively address project requirements and articulate issues/challenges to reduce project delivery risks regarding modeling and mappings. - Showcase hands-on experience in writing complex SQL queries. - Experience in data modeling for NoSQL objects is good to have. Qualifications we seek in you: Minimum Qualifications: - Bachelor's Degree in Computer Science, Mathematics, or Statistics. - Relevant experience in the field. - 8+ years of experience in metadata management, data modeling, and related tools (Erwin, ER Studio, or others), with 10+ years of overall IT experience. If you are passionate about leveraging your skills and experience in data modeling and analysis to drive impactful results, we invite you to join us as a Principal Consultant at Genpact. This is a full-time position based in Bangalore, India. Please note that the job posting date is October 7, 2024, and the unposting date is October 12, 2024. We are looking for individuals with a strong digital skill set to contribute to our dynamic team.
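Dimensional modeling work like the source-to-warehouse mappings above often involves Type-2 slowly changing dimensions, where history is kept by versioning rows. A minimal pure-Python sketch of the SCD2 merge idea, with illustrative column names (not any particular team's implementation):

```python
from datetime import date

def scd2_merge(dim_rows, incoming, key, tracked, today):
    """Sketch of a Type-2 slowly changing dimension merge.

    When a tracked attribute changes, the current dimension row is
    expired (end_date set) and a new current row is appended, preserving
    history. Column names are purely illustrative.
    """
    current = {r[key]: r for r in dim_rows if r["end_date"] is None}
    for src in incoming:
        cur = current.get(src[key])
        if cur and all(cur[c] == src[c] for c in tracked):
            continue  # no change: keep the existing current row
        if cur:
            cur["end_date"] = today  # expire the old version
        dim_rows.append({key: src[key], **{c: src[c] for c in tracked},
                         "start_date": today, "end_date": None})
    return dim_rows

dim = [{"cust_id": 1, "city": "Pune", "start_date": date(2023, 1, 1), "end_date": None}]
scd2_merge(dim, [{"cust_id": 1, "city": "Delhi"}], "cust_id", ["city"], today=date(2024, 6, 1))
print(dim)
```

In a warehouse this logic is usually expressed as a MERGE statement or an ELT tool's SCD component rather than application code, but the versioning rule is the same.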
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
You will be responsible for leading the design and architecture of highly scalable, robust, and efficient data solutions utilizing Snowflake as the primary data platform. This includes developing and executing enterprise-level data architecture strategies, blueprints, and roadmaps aligned with business objectives. You will architect and manage data solutions, optimizing performance for complex analytical workloads and high-volume reporting requirements. Collaboration with cross-functional teams such as analytics, engineering, and business stakeholders is essential to deeply understand business needs and translate them into well-defined, robust data architectures and technical specifications. In the realm of data engineering and implementation, your role will involve designing, implementing, and optimizing end-to-end data pipelines, data warehouses, and data lakes to ensure efficient, reliable, and automated data ingestion, transformation (ELT/ETL), and loading processes into Snowflake. You will be required to develop and maintain advanced SQL queries, stored procedures, and data models within Snowflake for complex data manipulation and analysis while ensuring data quality, consistency, and integrity across all data solutions. Regarding data governance and security, you will develop and enforce best practices in data governance, data security, access control, auditing, and compliance within the cloud-based data environment, specifically Snowflake. This will involve implementing data masking, encryption, and other security measures to safeguard sensitive data. Furthermore, you will evaluate, recommend, and integrate third-party tools, technologies, and services to enhance the Snowflake ecosystem, optimize data workflows, and support the overall data strategy. Staying informed about new Snowflake features, industry trends, and data technologies is crucial, allowing you to recommend their adoption where beneficial. 
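The data masking mentioned above is implemented in Snowflake as a dynamic masking policy attached to sensitive columns, so the value a user sees depends on their role. A pure-Python sketch that renders such a policy; all names are hypothetical, and Snowflake's documentation covers the full syntax:

```python
def masking_policy_ddl(policy, col_type, allowed_role):
    """Render a Snowflake dynamic masking policy (illustrative names only).

    Users whose CURRENT_ROLE() is not the allowed role see a redacted
    value instead of the raw column contents.
    """
    return (
        f"CREATE OR REPLACE MASKING POLICY {policy} AS (val {col_type}) "
        f"RETURNS {col_type} -> CASE WHEN CURRENT_ROLE() = '{allowed_role}' "
        f"THEN val ELSE '***MASKED***' END;"
    )

print(masking_policy_ddl("pii.email_mask", "STRING", "PII_READER"))
```

The policy is then bound to a column with `ALTER TABLE ... MODIFY COLUMN ... SET MASKING POLICY`, keeping the masking rule centralized rather than duplicated in views.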
Your role will also encompass providing technical leadership and mentorship to data engineers and other team members, fostering a culture of best practices and continuous improvement. You will facilitate effective communication and collaboration across global teams. To be successful in this role, you should possess 5-7+ years of progressive experience in data architecture and data engineering roles. You must have proven, in-depth hands-on experience as a data architect with extensive knowledge of Snowflake's architecture, features, and best practices. Exceptional proficiency in advanced SQL, strong experience in designing and implementing scalable data warehouses and data lakes, hands-on experience with ETL/ELT processes, and familiarity with cloud data platforms are essential skills. Strong analytical, problem-solving, and critical thinking skills, excellent communication abilities, and the capacity to work independently in a remote setting are also required. A Bachelor's degree in Computer Science, Engineering, Information Technology, or a related quantitative field is necessary, while a Master's degree is considered a plus. Candidates who can start immediately or on a short notice period are strongly preferred.
Posted 1 month ago
12.0 - 16.0 years
0 Lacs
delhi
On-site
As a Data Architect in our organization, you will play a crucial role in defining the data architecture for key domains within the Data Products Portfolio. Your responsibilities will include evaluating data-related tools and technologies, recommending implementation patterns, and standard methodologies to ensure our Data ecosystem remains modern. Collaborating with Enterprise Data Architects, you will establish and adhere to enterprise standards while conducting PoCs to ensure their implementation. Your expertise will be instrumental in providing technical guidance and mentorship to Data Engineers and Data Analysts, developing and maintaining processes, standards, policies, guidelines, and governance to ensure consistency across the company. You will create and maintain conceptual/logical data models, work with business and IT teams to understand data requirements, and maintain a data dictionary with table and column definitions. Additionally, you will review data models with technical and business audiences and lead the design/build of new models to deliver financial results efficiently to senior management. This role is primarily technical, requiring you to function as an individual contributor (80%) while also demonstrating leadership capabilities (20%). Your key responsibilities include designing, documenting, and training the team on overall processes and process flows for Data architecture, resolving technical challenges in critical situations, developing relationships with external stakeholders, reviewing work from other tech team members, and implementing Data Architecture and Data security policies aligned with governance objectives and regulatory requirements. **Essential Education** - A Bachelor's degree in information science, data management, computer science, or a related field is preferred. **Experience & Qualifications** - Bachelor's degree or equivalent combination of education and experience. 
- 12+ years of IT experience with a major focus on data warehouse/database-related projects. - Expertise in cloud databases like Snowflake/Redshift, data catalogs, MDM, etc. - Proficiency in SQL, database procedures, data modeling (conceptual, logical, and physical), and documenting architecture-related work. - Hands-on experience in data storage, ETL/ELT, data analytics tools and technologies, Data Warehousing design/development, and BI/Analytical systems. - Experience with Cloud Big Data technologies such as AWS, Azure, GCP, and Snowflake. - Experience with Python is preferable. - Strong hands-on experience with data and analytics data architecture, solution design, and engineering. - Experience working with Agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams. - Strong communication and presentation skills for presenting architecture, features, and solution recommendations. You will work closely with global functional portfolio technical leaders, product owners, functional area teams, Global Data portfolio Management & teams, and consulting and internal Data Tribe teams across the organization.
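The data dictionary responsibility mentioned above (table and column definitions maintained alongside the models) can be illustrated by flattening model metadata into dictionary rows. A pure-Python sketch with an illustrative structure; real catalogs would come from tools like Erwin or a cloud data catalog:

```python
def build_data_dictionary(tables):
    """Flatten model metadata into data-dictionary rows:
    one row per (table, column) with its type and business description.
    The nested-dict input format is an assumption for this sketch.
    """
    rows = []
    for table, cols in tables.items():
        for col, (dtype, desc) in cols.items():
            rows.append({"table": table, "column": col,
                         "type": dtype, "description": desc})
    return rows

model = {"dim_customer": {"customer_id": ("NUMBER", "Surrogate key"),
                          "email": ("STRING", "Customer contact email")}}
print(build_data_dictionary(model))
```

Keeping the dictionary generated from model metadata, rather than hand-edited, is what keeps definitions consistent as models evolve.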
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a member of the Google Cloud Consulting Professional Services team, you will have the opportunity to contribute to the success of businesses by guiding them through their cloud journey and leveraging Google's global network, data centers, and software infrastructure. Your role will involve assisting customers in transforming their businesses by utilizing technology to connect with customers, employees, and partners. Your responsibilities will include interacting with stakeholders to understand customer requirements and providing recommendations for solution architectures. You will collaborate with technical leads and partners to lead migration and modernization projects to Google Cloud Platform (GCP). Additionally, you will design, build, and operationalize data storage and processing infrastructure using Cloud-native products, ensuring data quality and governance procedures are in place to maintain accuracy and reliability. In this role, you will work on data migrations and modernization projects, and design data processing systems optimized for scaling. You will troubleshoot platform/product tests, understand data governance and security controls, and travel to customer sites to deploy solutions and conduct workshops to educate and empower customers. Furthermore, you will be responsible for translating project requirements into goals and objectives, and creating work breakdown structures to manage internal and external stakeholders effectively. You will collaborate with Product Management and Product Engineering teams to drive excellence in products and contribute to the digital transformation of organizations across various industries. By joining this team, you will play a crucial role in shaping the future of businesses of all sizes and assisting them in leveraging Google Cloud to accelerate their digital transformation journey.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced Snowflake Data Engineer looking to support a Proof of Concept (POC) project and potentially scale it into a broader production environment. This is an exciting opportunity for you to contribute to shaping a modern data platform with real-time streaming capabilities using Snowflake's native features. Your key responsibilities will include leading or contributing to a POC for Snowflake-based data solutions, designing and developing real-time data ingestion pipelines using Snowpipe/Streams & Tasks, handling and processing streaming data within Snowflake, writing and optimizing complex SQL queries, stored procedures, and UDFs for data transformation, working with cloud platforms (AWS, Azure, GCP) to manage data ingestion and compute resources, collaborating with stakeholders to translate business needs into data solutions, proactively identifying improvements in data architecture, performance, and process automation, and contributing to building and mentoring a scalable data engineering team. You should have at least 4+ years of experience in data engineering, hands-on work with Snowflake development, performance tuning, and architecture, strong proficiency in SQL including analytical functions, CTEs, and complex logic, experience with Snowpipe, Streams, and Tasks, familiarity with streaming platforms like Kafka, Pulsar, or Confluent, experience with Python or Scala for scripting or data manipulation, comfort working in cloud-native environments (AWS, Azure, GCP), experience with ETL/ELT development, orchestration, and monitoring tools, and Git or version control experience. Preferred qualifications include prior experience delivering POC projects, knowledge of CI/CD practices for data pipeline deployment, understanding of data security, role-based access control, and governance in Snowflake, and strong communication skills with a collaborative mindset. 
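Snowflake Streams, listed in the requirements above, expose only the rows changed since the last time the stream was consumed, then advance their offset. A toy pure-Python analogue of that offset semantics for the append-only case (real streams also track updates and deletes):

```python
def consume_stream(table_rows, offset):
    """Sketch of Snowflake Streams semantics for appends: return only rows
    added since the last consumed offset, plus the new offset to resume from.
    """
    delta = table_rows[offset:]
    return delta, len(table_rows)

events = ["e1", "e2", "e3"]
delta, offset = consume_stream(events, 0)       # first read sees everything
events.append("e4")
delta2, offset = consume_stream(events, offset)  # next read sees only the new row
print(delta2)
```

In Snowflake the offset advances when a DML statement (typically an INSERT ... SELECT or MERGE inside a Task) reads the stream, which is what makes Streams plus Tasks a natural incremental-processing loop.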
If you are a self-starter who can take initiative, contribute ideas, and help seed a high-performing team, this role at Conglomerate IT could be the right fit for you.
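The Snowpipe/Streams & Tasks pattern and the analytical SQL this listing calls for can be illustrated with a small, runnable sketch. It uses Python's built-in sqlite3 in place of Snowflake (the table, columns, and data are made up), but the CTE + ROW_NUMBER() query is the same latest-row-per-key dedupe a Streams & Tasks MERGE would typically perform:

```python
import sqlite3

# Hypothetical landing table standing in for a Snowflake table fed by Snowpipe;
# sqlite3 is used here only so the SQL is runnable locally.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_events (device_id TEXT, reading REAL, ingested_at TEXT);
INSERT INTO raw_events VALUES
  ('dev-1', 20.5, '2024-01-01T10:00:00'),
  ('dev-1', 21.0, '2024-01-01T10:05:00'),
  ('dev-2', 18.2, '2024-01-01T10:01:00');
""")

# CTE + window function: keep only the most recent row per device.
LATEST_PER_DEVICE = """
WITH ranked AS (
  SELECT device_id, reading, ingested_at,
         ROW_NUMBER() OVER (
           PARTITION BY device_id ORDER BY ingested_at DESC
         ) AS rn
  FROM raw_events
)
SELECT device_id, reading FROM ranked WHERE rn = 1 ORDER BY device_id;
"""

rows = conn.execute(LATEST_PER_DEVICE).fetchall()
print(rows)  # [('dev-1', 21.0), ('dev-2', 18.2)]
```

In Snowflake the same dedupe would usually run inside a scheduled Task consuming a Stream, but the SQL shape carries over almost unchanged.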
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
The Microsoft Cloud Data Engineer role is a great opportunity for a talented and motivated individual to design, construct, and manage cloud-based data solutions using Microsoft Azure technologies. Your primary responsibility will be to build robust, scalable, and secure data pipelines and support analytics workloads that drive business insights and data-based decision-making. You will design and deploy ETL/ELT pipelines using Azure Data Factory, Azure Synapse Analytics, Azure Databricks, and Azure Data Lake Storage. Additionally, you will be responsible for developing and overseeing data integration workflows to bring in data from various sources such as APIs, on-prem systems, and cloud services. It will also be important to optimize and maintain SQL-based data models, views, and stored procedures in Azure SQL, SQL MI, or Synapse SQL Pools. Collaboration with analysts, data scientists, and business teams will be crucial to gather data requirements and provide reliable and high-quality datasets. You will need to ensure data quality, governance, and security by implementing robust validation, monitoring, and encryption mechanisms. Supporting infrastructure automation using Azure DevOps, ARM templates, or Terraform for resource provisioning and deployment will also be part of your responsibilities. You will also play a role in troubleshooting, performance tuning, and the continuous improvement of the data platform. To qualify for this position, you should have a Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field. A minimum of 3 years of experience in data engineering with a focus on Microsoft Azure data services is required. Hands-on experience with Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake is a must. Strong proficiency in SQL and data modeling is essential, along with experience in Python, PySpark, or .NET for data processing.
Understanding of data warehousing, data lakes, and ETL/ELT best practices is important, as well as familiarity with DevOps tools and practices in an Azure environment. Knowledge of Power BI or similar visualization tools is also beneficial. Additionally, holding the Microsoft Certified: Azure Data Engineer Associate certification or its equivalent is preferred.
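As a rough illustration of the ETL/ELT work described above, here is a minimal extract-validate-load sketch in plain Python, with sqlite3 standing in for Azure SQL; the source rows, table, and validation rule are invented for the example, not taken from any Azure Data Factory pipeline:

```python
import sqlite3

# Extract: records as they might arrive from an upstream API or file drop
# (field names and values are made up for this sketch).
source_rows = [
    {"order_id": 1, "amount": "250.00", "region": "west"},
    {"order_id": 2, "amount": "bad", "region": "east"},   # fails validation
    {"order_id": 3, "amount": "99.50", "region": "east"},
]

def transform(row):
    """Validate and normalise one record; return None to reject it."""
    try:
        amount = float(row["amount"])
    except ValueError:
        return None
    return (row["order_id"], amount, row["region"].upper())

# Load: only rows that pass validation reach the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)")
clean = [t for t in map(transform, source_rows) if t is not None]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)

total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 349.5)
```

In a real pipeline the rejected rows would typically be routed to a quarantine table and surfaced through monitoring rather than silently dropped.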
Posted 1 month ago
1.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Your journey at Crowe starts here with the opportunity to build a meaningful and rewarding career. At Crowe, you are trusted to deliver results and make an impact while having the flexibility to balance work with life moments. Your well-being is cared for, and your career is nurtured in an inclusive environment where everyone has equitable access to opportunities for growth and leadership. With over 80 years of history, Crowe has excelled in delivering excellent service through innovation across its audit, tax, and consulting groups. As a Data Engineer at Crowe, you will provide critical integration infrastructure for analytical support and solution development for the broader Enterprise using market-leading tools and methodologies. Your expertise in API integration, pipelines or notebooks, programming languages (Python, Spark, T-SQL), dimensional modeling, and advanced data engineering techniques will be key in creating and delivering robust solutions and data products. You will be responsible for designing, developing, and maintaining the Enterprise Analytics Platform to support data-driven decision-making across the organization. Success in this role requires a strong interest in and passion for data analytics, ETL/ELT best practices, critical thinking, and problem-solving, as well as excellent interpersonal, communication, listening, and presentation skills. The Data team strives for an unparalleled client experience and will look to you to promote success and enhance the firm's image firmwide. To qualify for this role, you should have a Bachelor's degree in Computer Science, Data Analytics, Data/Information Science, Information Systems, Mathematics, or a related field, along with experience in SQL, data warehousing concepts, programming languages, managing projects, and tools such as Microsoft Power BI, Delta Lake, or Apache Spark. Hands-on experience or certification with Microsoft Fabric is preferred.
Upholding Crowe's values of Care, Trust, Courage, and Stewardship is essential in this position, as we expect all team members to act ethically and with integrity at all times. Crowe offers a comprehensive benefits package to its employees and provides an inclusive culture that values diversity. You will have the opportunity to work with a Career Coach who will guide you in your career goals and aspirations. Crowe, a subsidiary of Crowe LLP (U.S.A.), a public accounting, consulting, and technology firm, is part of Crowe Global, one of the largest global accounting networks in the world. Crowe does not accept unsolicited candidates, referrals, or resumes from any staffing agency or third-party paid service. Referrals, resumes, or candidates submitted without a pre-existing agreement will be considered the property of Crowe.
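The dimensional modeling mentioned in this listing can be sketched as a toy star schema: one fact table joined to one dimension. The tables, names, and figures below are purely illustrative (not from any Crowe system), and sqlite3 is used only so the SQL runs anywhere:

```python
import sqlite3

# Toy star schema: a fact table of engagements pointing at a client dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_client (
  client_key INTEGER PRIMARY KEY, client_name TEXT, industry TEXT);
CREATE TABLE fact_engagement (client_key INTEGER, hours REAL, revenue INTEGER);
INSERT INTO dim_client VALUES
  (1, 'Acme', 'Manufacturing'),
  (2, 'Globex', 'Retail');
INSERT INTO fact_engagement VALUES
  (1, 120, 30000), (1, 40, 9000), (2, 80, 20000);
""")

# Typical dimensional query: roll the fact up by a dimension attribute.
summary = conn.execute("""
SELECT d.industry, SUM(f.revenue)
FROM fact_engagement f JOIN dim_client d USING (client_key)
GROUP BY d.industry ORDER BY d.industry;
""").fetchall()
print(summary)  # [('Manufacturing', 39000), ('Retail', 20000)]
```

The same fact/dimension split is what makes BI tools like Power BI efficient at slicing measures by arbitrary attributes.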
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Data & Integration Architect at Techwave, you will play a key role in designing, implementing, and optimizing data management and integration frameworks. With over 12 years of experience in the field, you will be responsible for architecting scalable, secure, and high-performance solutions that align with the enterprise-wide data strategy and business objectives. Your primary responsibilities will include developing ETL/ELT pipelines, designing real-time and batch data integration solutions, defining data governance policies, and optimizing data storage and retrieval performance. You will collaborate closely with clients, business teams, and developers to translate business needs into technical solutions, lead data migration projects, and stay updated with the latest data technologies and best practices. To excel in this role, you should have a Bachelor's or Master's degree in Computer Science or a related field, along with 7+ years of experience in data architecture, integration, and enterprise data management. You must possess expertise in data modeling, database design, and data warehousing, as well as strong knowledge of ETL/ELT tools, API design, and cloud data platforms such as AWS, Azure, and Google Cloud. Additionally, proficiency in big data technologies, event-driven architectures, data governance frameworks, containerization, orchestration, CI/CD pipelines, and problem-solving skills will be essential for success in this position. If you are a proactive problem solver with excellent analytical and communication skills, we invite you to join our team of dreamers and doers at Techwave and be part of our journey towards pushing the boundaries of what's possible. Apply now at https://techwave.net/join-us/ to explore this exciting opportunity.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Database Designer / Senior Data Engineer at VE3, you will be responsible for architecting and designing modern, scalable data platforms on AWS and/or Azure, ensuring best practices for security, cost optimization, and performance. You will develop detailed data models and document data dictionaries and lineage to support data solutions. Additionally, you will build and optimize ETL/ELT pipelines using languages such as Python, SQL, Scala, and services like AWS Glue, Azure Data Factory, and open-source frameworks like Spark and Airflow. Collaboration is key in this role as you will work closely with data analysts, BI teams, and stakeholders to translate business requirements into data solutions and dashboards. You will also partner with DevOps/Cloud Ops to automate CI/CD for data code and infrastructure, ensuring governance, security, and compliance standards such as GDPR and ISO27001 are met. Monitoring, alerting, and data quality frameworks will be implemented to maintain data integrity. As a mentor, you will guide junior engineers and stay updated on emerging big data and streaming technologies to enhance our toolset. The ideal candidate should have a Bachelor's degree in Computer Science, Engineering, IT, or similar field with at least 3 years of hands-on experience in a Database Designer / Data Engineer role within a cloud environment. Technical skills required include expertise in SQL, proficiency in Python or Scala, and familiarity with cloud services like AWS (Glue, S3, Kinesis, RDS) or Azure (Data Factory, Data Lake Storage, SQL Database). Strong communication skills are essential, along with an analytical mindset to address performance bottlenecks and scaling challenges. A collaborative attitude in agile/scrum settings is highly valued. Nice to have qualifications include certifications in AWS or Azure data analytics, exposure to data science workflows, experience with containerized workloads, and familiarity with DataOps practices and tools. 
At VE3, we are committed to fostering a diverse and inclusive environment where every voice is heard, and every idea can contribute to tomorrow's breakthrough.
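The pipeline orchestration this role touches on (Airflow and similar tools) boils down to running tasks in dependency order over a DAG. A minimal sketch using Python's standard-library graphlib, with made-up task names, shows the idea; Airflow adds scheduling, retries, and distributed execution on top:

```python
from graphlib import TopologicalSorter

# Record execution order so we can see the DAG being respected.
ran = []

def make_task(name):
    def run():
        # A real task would extract, transform, or load data here.
        ran.append(name)
    return run

tasks = {name: make_task(name) for name in
         ["extract", "transform", "load", "report"]}

# Each task maps to the set of tasks that must finish before it starts.
deps = {"transform": {"extract"}, "load": {"transform"}, "report": {"load"}}

# static_order() yields a valid topological ordering of the DAG.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(ran)  # ['extract', 'transform', 'load', 'report']
```

graphlib ships with Python 3.9+; for a fan-out DAG the sorter would also expose which tasks can run in parallel at each step.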
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Apply Digital is a global digital transformation partner specializing in Business Transformation Strategy, Product Design & Development, Commerce, Platform Engineering, Data Intelligence, and Change Management. Our mission is to assist change agents in modernizing their organizations and delivering impactful results to their business and customers. Whether our clients are at the beginning, accelerating, or optimizing stage, we help them integrate composable technology as part of their digital transformation journey. Leveraging our extensive experience in developing intelligent products and utilizing AI tools, we drive value for our clients. Founded in 2016 in Vancouver, Canada, Apply Digital has expanded to nine cities across North America, South America, the UK, and Europe. We are excited to announce the launch of a new office in Delhi NCR, India, as part of our ongoing expansion. At Apply Digital, we advocate for a "One Team" approach, where we operate within a "pod" structure that combines senior leadership, subject matter experts, and cross-functional skill sets. This structure is supported by agile methodologies like scrum and sprint cadences, ensuring seamless collaboration and progress towards desired outcomes. Our team embodies our SHAPE values (smart, humble, active, positive, and excellent) to create a safe, empowered, respectful, and enjoyable work environment where everyone can connect, grow, and make a difference together. Apply Digital is a hybrid-friendly organization with remote work options available. The preferred candidate for this position should be based in or near the Delhi/NCR region of India, with working hours overlapping with Eastern Standard Time (EST). The ideal candidate for this role will be responsible for designing, building, and maintaining scalable data pipelines and architectures to support analytical and operational workloads.
Key responsibilities include optimizing ETL/ELT pipelines, integrating data pipelines into cloud-native applications, managing cloud data warehouses, ensuring data governance and security, collaborating with analytics teams, and maintaining data documentation. The candidate should have strong proficiency in English, experience in data engineering, expertise in SQL and Python, familiarity with cloud data platforms, and knowledge of ETL/ELT frameworks and workflow orchestration tools. Apply Digital offers a comprehensive benefits package that includes private healthcare coverage, Provident Fund contributions, and a gratuity bonus after five years of service. We prioritize work-life balance with flexible personal time off policies and provide opportunities for skill development through training budgets, certifications, workshops, mentorship, and peer support. Apply Digital is committed to fostering an inclusive workplace where individual differences are celebrated, and equal opportunities are provided to all team members.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Technical Lead with over 8 years of experience in Data Engineering, Analytics, and Python development, including at least 3 years in a Technical Lead / Project Management role, you will play a crucial role in driving data engineering and analytics projects for our clients. Your client-facing skills will be essential in ensuring successful project delivery and effective communication between technical and business stakeholders. Your responsibilities will include designing and implementing secure, scalable data architectures on cloud platforms such as AWS, Azure, or GCP. You will lead the development of cloud-based data engineering solutions covering data ingestion, transformation, and storage while defining best practices for integrating diverse data sources securely. Overseeing security aspects of integrations and ensuring compliance with organizational and regulatory requirements will be part of your role. In addition, you will develop and manage robust ETL/ELT pipelines using Python, SQL, and modern orchestration tools, as well as integrate real-time streaming data using technologies like Apache Kafka, Spark Structured Streaming, or cloud-native services. Collaborating with data scientists to integrate AI models into production pipelines and cloud infrastructure will also be a key aspect of your responsibilities. Furthermore, you will work on advanced data analysis to generate actionable insights for business use cases, design intuitive Tableau dashboards and data visualizations, and define data quality checks and validation frameworks to ensure high-integrity data pipelines. Your expertise in REST API development, backend services, and integrating APIs securely will be crucial in developing and deploying data products and integrations. 
To excel in this role, you must have deep hands-on experience with cloud platforms, expertise in Python, SQL, Spark, Kafka, and streaming integration, proven ability with data warehousing solutions like BigQuery, Snowflake, and Redshift, and a strong understanding of integration security principles. Proficiency in data visualization with Tableau, REST API development, and AI/ML integration will also be essential. Preferred qualifications include prior experience managing enterprise-scale data engineering projects, familiarity with DevOps practices, and understanding of regulatory compliance requirements for data handling. Your ability to lead technical teams, ensure project delivery, and drive innovation in data engineering and analytics will be key to your success in this role.
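The streaming integration mentioned above often reduces to windowed aggregation over an event stream. A pure-Python sketch of tumbling-window averages follows; the events, sensor names, and 60-second window are invented, and Kafka or Spark Structured Streaming would perform the same grouping at much larger scale:

```python
from collections import defaultdict

# Simulated event stream: timestamps in seconds, made up for this sketch.
events = [
    {"ts": 5,  "sensor": "a", "value": 10},
    {"ts": 42, "sensor": "a", "value": 14},
    {"ts": 65, "sensor": "a", "value": 20},
    {"ts": 70, "sensor": "b", "value": 7},
]

WINDOW = 60  # tumbling window size in seconds

# Bucket each event by (window index, sensor); integer division assigns
# every event to exactly one non-overlapping window.
windows = defaultdict(list)
for e in events:
    windows[(e["ts"] // WINDOW, e["sensor"])].append(e["value"])

averages = {k: sum(v) / len(v) for k, v in sorted(windows.items())}
print(averages)  # {(0, 'a'): 12.0, (1, 'a'): 20.0, (1, 'b'): 7.0}
```

Real streaming engines add the hard parts this sketch omits: late-arriving events, watermarks, and incremental state that never holds the whole stream in memory.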
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As an experienced Software/Data Engineer with a passion for creating meaningful solutions, you will be joining a global team of innovators at a Siemens company. In this role, you will be responsible for developing data integration solutions using Java, Scala, and/or Python, with a focus on data and Business Intelligence (BI). Your primary responsibilities will include building data pipelines, data transformation, and data modeling to support various integration methods and information delivery techniques. To excel in this position, you should have a Bachelor's degree in an Engineering or Science discipline or equivalent experience, along with at least 5 years of software/data engineering experience. Additionally, you should have a minimum of 3 years of experience in a data and BI focused role. Proficiency in data integration development using languages such as Python, PySpark, and SparkSQL, as well as experience with relational databases and SQL optimization, are essential for this role. Experience with AWS-based data services technologies (e.g., Glue, RDS, Athena) and Snowflake CDW, along with familiarity with BI tools like Power BI, will be beneficial. Your willingness to experiment with new technologies and adapt to agile development practices will be key to your success in this role. Join us in creating a brighter future where smarter infrastructure protects the environment and connects us all. Our culture is built on collaboration, support, and a commitment to helping each other grow both personally and professionally. If you are looking to make a positive impact and contribute to a more sustainable world, we invite you to explore how far your passion can take you with us.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As a Senior Azure Cloud Data Engineer based in Bangalore, you will be instrumental in the processing and analysis of IoT data derived from our connected products. Your primary objective will be to provide valuable insights to both our internal teams and external clients. Your core responsibilities will include the development and maintenance of scalable and high-performance analytical data solutions leveraging Azure cloud technologies. With 3+ years of experience in Azure Analytics tools such as Data Factory, Synapse, and Event Hubs, you will possess strong proficiency in SQL, Python, and PySpark. Your expertise will extend to ETL/ELT processes, data streaming utilizing Kafka/Event Hubs, and handling unstructured data. A sound understanding of data modeling, data governance, and real-time processing will be crucial in this role. Apart from technical proficiencies, you will demonstrate soft skills such as a strong analytical and problem-solving mindset, exceptional verbal and written communication skills, and the ability to work effectively both independently and as part of a team. Your attention to detail, quality-focused approach, organizational abilities, and multitasking skills will be key to success in this role. Furthermore, your adaptability to a fast-paced and evolving environment, coupled with a self-motivated and proactive attitude, will be highly valued. If you are seeking a challenging opportunity to work in a dynamic environment that encourages innovation and collaboration, this position is ideal for you. Join our team and be part of a forward-thinking organization dedicated to leveraging cutting-edge technologies to drive impactful business outcomes.
Posted 1 month ago
1.0 - 5.0 years
0 Lacs
Karnataka
On-site
At PwC, our team in managed services specializes in providing outsourced solutions and supporting clients across various functions. We assist organizations in streamlining their operations, cutting costs, and enhancing efficiency by managing key processes and functions on their behalf. Our expertise in project management, technology, and process optimization enables us to deliver top-notch services to our clients. If you join our managed service management and strategy team at PwC, your focus will involve transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your responsibilities will include working on continuous improvement, optimizing managed services processes, tools, and services. As a Specialist in the Data Analytics & Insights Managed Service tower, you will work alongside a team of problem solvers to address complex business issues from strategy to execution using your skills in Data, Analytics, and Insights. Your role at this management level will require you to use feedback and reflection to enhance self-awareness, demonstrate critical thinking, and maintain high standards of quality in your work. You will also be responsible for reviewing ticket quality and deliverables, status reporting for projects, adherence to SLAs, incident management, change management, problem management, and more. To excel in this position, you should possess primary skills in ETL/ELT, SQL, Informatica, and Python, along with secondary skills in Azure/AWS/GCP, Talend, DataStage, etc. As a Data Engineer, you must have a minimum of 1 year of experience in Operate/Managed Services/Production Support. Your role will involve designing and implementing data pipelines, building ETL/ELT processes, monitoring data pipelines, ensuring data security and privacy, and optimizing schema and performance tuning. 
Additionally, you should have experience with ITIL processes, strong communication skills, problem-solving abilities, and analytical skills. Certifications in Cloud Technology and experience with visualization tools like Power BI, Tableau, Qlik, etc., are considered nice-to-have qualifications for this role. Our Managed Services in Data, Analytics & Insights focus on delivering integrated solutions that add value to our clients through technology and human-enabled experiences. By joining our team, you will be part of a group dedicated to empowering clients to optimize operations, accelerate outcomes, and drive transformational journeys. We prioritize a consultative approach to operations, leveraging industry insights and world-class talent to achieve sustained client outcomes. Our goal is to provide clients with flexible access to business and technology capabilities that align with the demands of the dynamic business environment.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
At PwC, our team in managed services specializes in providing outsourced solutions and supporting clients across various functions. We help organizations enhance their operations, reduce costs, and boost efficiency by managing key processes and functions on their behalf. Our expertise lies in project management, technology, and process optimization, allowing us to deliver high-quality services to our clients. In managed service management and strategy at PwC, the focus is on transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and services. As a Managed Services - Data Engineer Senior Associate at PwC, you will be part of a team of problem solvers dedicated to addressing complex business issues from strategy to execution using Data, Analytics & Insights Skills. Your responsibilities will include using feedback and reflection to enhance self-awareness and personal strengths, acting as a subject matter expert in your chosen domain, mentoring junior resources, and conducting knowledge sharing sessions. You will be required to demonstrate critical thinking, ensure quality of deliverables, adhere to SLAs, and participate in incident, change, and problem management. Additionally, you will be expected to review your work and that of others for quality, accuracy, and relevance, as well as demonstrate leadership capabilities by working directly with clients and leading engagements. The primary skills required for this role include ETL/ELT, SQL, SSIS, SSMS, Informatica, and Python, with secondary skills in Azure/AWS/GCP, Power BI, Advanced Excel, and Excel Macro. 
As a Data Ingestion Senior Associate, you should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines, designing and implementing ETL processes, monitoring and troubleshooting data pipelines, implementing data security measures, and creating visually impactful dashboards for data reporting. You should also have expertise in writing and analyzing complex SQL queries, be proficient in Excel, and possess strong communication, problem-solving, quantitative, and analytical abilities. In our Managed Services platform, we focus on leveraging technology and human expertise to deliver simple yet powerful solutions to our clients. Our team of skilled professionals, combined with advanced technology and processes, enables us to provide effective outcomes and add greater value to our clients' enterprises. We aim to empower our clients to focus on their business priorities by providing flexible access to world-class business and technology capabilities that align with today's dynamic business environment. If you are a candidate who thrives in a high-paced work environment, capable of handling critical Application Evolution Service offerings, engagement support, and strategic advisory work, then we are looking for you to join our team in the Data, Analytics & Insights Managed Service at PwC. Your role will involve working on a mix of help desk support, enhancement and optimization projects, as well as strategic roadmap initiatives, while also contributing to customer engagements from both a technical and relationship perspective.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Delhi
On-site
We are looking for a highly motivated and enthusiastic Senior Data Scientist with 5-8 years of experience to join our dynamic team. The ideal candidate should have a strong background in AI/ML analytics and a passion for utilizing data to drive business insights and innovation. Your main responsibilities will include developing and implementing machine learning models and algorithms, collaborating with project stakeholders to understand requirements and deliverables, analyzing and interpreting complex data sets using statistical and machine learning techniques, staying updated with the latest advancements in AI/ML technologies, and supporting various AI/ML initiatives by working with cross-functional teams. To qualify for this role, you should have a Bachelor's degree in Computer Science, Data Science, or a related field, along with a strong understanding of machine learning, deep learning, and Generative AI concepts. Preferred skills for this position include experience in machine learning techniques such as regression, classification, predictive modeling, clustering, and the deep learning stack using Python. Additionally, expertise in cloud infrastructure for AI/ML on AWS (SageMaker, QuickSight, Athena, Glue), building secure data ingestion pipelines for unstructured data, proficiency in Python, TypeScript, NodeJS, ReactJS, data visualization tools, deep learning frameworks, version control systems, and Generative AI/LLM based development is desired. Good to have skills include knowledge and experience in building knowledge graphs in production and an understanding of multi-agent systems and their applications in complex problem-solving scenarios. Pentair is an Equal Opportunity Employer, valuing cross-cultural insight and competence for ongoing success, with a belief that a diverse workforce enhances perspectives and ideas for continuous improvement.
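As a small illustration of the regression modeling this role calls for, here is ordinary least squares for a single feature, fitted in closed form with only the standard library; the data points are made up for the example:

```python
# Toy dataset: y is roughly 2*x with a little noise (values invented).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.1, 8.0]

n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n

# Closed-form OLS for y = slope*x + intercept:
# slope = cov(x, y) / var(x), intercept pins the line to the means.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

print(round(slope, 2), round(intercept, 2))  # 1.99 0.05
```

Libraries like scikit-learn generalize this to many features and add regularization, but the fitted coefficients for this one-dimensional case match the formula above.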
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have at least 3 years of experience in designing, developing, and administering Snowflake data warehouse solutions with a strong focus on scalability and performance. Your primary responsibilities will include writing and optimizing complex Snowflake SQL queries and scripts to ensure efficient data extraction, transformation, and loading (ETL/ELT). Additionally, you will be expected to develop and implement robust ETL/ELT pipelines using Snowflake and associated tools. Applying design patterns and best practices in data pipeline and system design will be crucial in this role. You will work extensively with cloud platforms, preferably Azure, to integrate Snowflake solutions. Tuning Snowflake warehouses for optimal query performance, including sizing, clustering, and partitioning strategies will also be part of your responsibilities. Collaboration with the DataOps Live platform to orchestrate, automate, and monitor data workflows and pipelines is essential. You will need to review and interpret design documents, including UML diagrams, to ensure alignment with technical solutions. Implementing data security measures such as masking policies, role-based access control, and compliance standards within Snowflake and Azure environments will be required. You should have experience utilizing version control systems like Git and participating in DevOps practices for continuous integration and deployment. Active engagement in Agile methodologies and effective collaboration with cross-functional teams will be expected. Clear and professional communication with clients and team members is necessary to ensure project alignment and success. About Virtusa: Virtusa values teamwork, quality of life, and professional and personal development. When you join Virtusa, you become part of a global team of 27,000 people who care about your growth.
Virtusa aims to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with the company. At Virtusa, great minds and great potential come together. The company values collaboration and a team environment, seeking to provide dynamic opportunities for great minds to nurture new ideas and foster excellence.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Gujarat
On-site
We are searching for a skilled SQL and Data Integration Developer to join our team. In this role, you will design, develop, and maintain scalable SQL database solutions, including data warehouses and complex ETL/ELT processes. Your work will involve integrating third-party API data, optimizing database performance, and collaborating with cross-functional teams to support critical business applications. The ideal candidate has a strong background in T-SQL development, enterprise data architecture, and cloud-based tools like Azure Data Factory, along with excellent communication skills and a proactive approach to problem-solving in a distributed team environment.

**Job Responsibilities:**

- Design, develop, and maintain SQL database schemas and scripts, including views, stored procedures, and SQL jobs.
- Develop logical and physical data models to ensure robust and scalable database design.
- Monitor and maintain existing data to ensure cleanliness, accuracy, consistency, and integrity.
- Write and optimize complex stored procedures, functions, and views using T-SQL for high-performance data operations.
- Build and maintain data integration workflows using ETL/ELT tools like Azure Data Factory to facilitate data movement between systems and environments.
- Integrate and import data from third-party APIs into SQL databases, ensuring data accuracy, security, and consistency.
- Collaborate closely with application developers to implement and optimize database structures that meet application requirements.
- Design and implement scalable data warehouse solutions that align with business goals and support enterprise analytics.
- Act as the primary liaison between the SQL development team and cross-functional teams, including marketing, accounting, graphic design, and customer support.
- Create comprehensive technical documentation, including design specifications, architecture documentation, and user instructions.
- Continuously evaluate existing software components and tools, recommending improvements to ensure efficiency and scalability.
- Participate in an agile development environment, coordinating with a distributed team across multiple time zones to deliver high-quality solutions on time and within budget.

**Job Requirements:**

- Work From Home (shift: 2 pm - 11 pm), based in India.
- 5+ years of hands-on experience in SQL development.
- 3+ years of experience working with ETL/ELT tools, preferably Azure Data Factory.
- Proficiency in writing and optimizing complex T-SQL queries, with expertise in performance tuning and query development.
- Proven track record in developing and maintaining enterprise data warehouse architectures.
- Experience in integrating data from external APIs and importing third-party data into relational databases.
- Familiarity with version control systems such as TFS, Git, or Azure DevOps.
- Exposure to Azure SQL, SQL Source Control, and Azure Logic Apps is a plus.
- Knowledge of Master Data Management (MDM) concepts and practices is advantageous.
- Solid understanding of database troubleshooting and implementation of industry best practices.
- Bachelor's degree (or higher) in Computer Science, Information Systems, or a related field.
- Excellent communication and collaboration skills, with a proven ability to work effectively with globally distributed teams.

**Benefits:**

- Group Mediclaim Policy
- Parental Insurance Coverage
- Accident Policy
- Retirement Benefits (Provident Fund)
- Gratuity
- Overtime Bonus, Paid Vacation & Holidays, Profit Sharing & Incentives
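The third-party API integration responsibilities above can be sketched minimally: parse a JSON payload (inlined here rather than fetched over HTTP) and upsert it into a SQL table so that re-imports stay idempotent. The endpoint shape, field names, and table are all illustrative, and sqlite3 stands in for SQL Server (where the equivalent would typically be a T-SQL MERGE):

```python
import json
import sqlite3

# A payload as it might arrive from a hypothetical inventory API; in
# production this would come from an authenticated HTTP request.
payload = json.loads("""
[{"sku": "A-100", "qty": 5},
 {"sku": "B-200", "qty": 3},
 {"sku": "A-100", "qty": 9}]
""")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inventory (sku TEXT PRIMARY KEY, qty INTEGER)")

for item in payload:
    # Upsert: last write wins, so re-running an import never duplicates rows.
    conn.execute(
        "INSERT INTO inventory (sku, qty) VALUES (?, ?) "
        "ON CONFLICT(sku) DO UPDATE SET qty = excluded.qty",
        (item["sku"], item["qty"]),
    )

rows = conn.execute("SELECT sku, qty FROM inventory ORDER BY sku").fetchall()
print(rows)  # [('A-100', 9), ('B-200', 3)]
```

Parameterized queries (the `?` placeholders) are the important habit here: they keep externally sourced API data from being interpreted as SQL.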
Posted 1 month ago