1.0 years
4 - 9 Lacs
hyderābād
On-site
DESCRIPTION
The candidate will be responsible for maintaining and refreshing WBRs and other analytical frameworks set up by senior analysts. They will also build simple reports, take up deep-dive requests, make changes to existing analytical frameworks, and provide ad hoc data support to Ops stakeholders. The person should have a good understanding of business requirements and the ability to quickly get to the root cause of a reporting/BI/data issue and draft solutions for resolution. The ideal candidate is high on attention to detail, has a bias for action, and has an interest in analytics/BI/automation. Key result areas include, but are not limited to:
- Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
- Ensure data accuracy by validating data for new and existing sources.
- Work closely with stakeholders (internal/external) to understand and automate/enhance existing processes.
- Be open to learning and developing skill sets in the latest technologies and analytical techniques.
- Understand how data/analytical frameworks and their work translate to business on the ground.
- Come up with innovative ideas for new work or to improve existing work.

BASIC QUALIFICATIONS
- 1+ years of data analytics or automation experience
- Bachelor's degree
- Knowledge of data pipelining and extraction using SQL
- Knowledge of SQL and Excel at a moderate or advanced level
- Knowledge of SQL/Python/R, scripting, MS Excel, table joins, and aggregate analytical functions
- Expertise with visualization tools such as QuickSight, Tableau, or Power BI

PREFERRED QUALIFICATIONS
- Experience with Linux and AWS services

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
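The qualifications above call out table joins and aggregate analytical functions. As a minimal sketch of that pattern (the schema and numbers below are invented for illustration, not Amazon's data model), a dimension-to-fact join with a per-group aggregate looks like:

```python
import sqlite3

# Toy orders/regions schema, invented for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE regions (region_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders  (order_id INTEGER PRIMARY KEY, region_id INTEGER, amount REAL);
    INSERT INTO regions VALUES (1, 'North'), (2, 'South');
    INSERT INTO orders  VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
""")

# Join the fact table to its dimension and aggregate per region --
# the kind of query behind a recurring business-review metric.
rows = con.execute("""
    SELECT r.name, COUNT(*) AS n_orders, SUM(o.amount) AS revenue
    FROM orders o
    JOIN regions r ON r.region_id = o.region_id
    GROUP BY r.name
    ORDER BY revenue DESC
""").fetchall()
```

Here `rows` ends up as one tuple per region with its order count and revenue, ordered by revenue.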
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 3 days ago
3.0 years
0 Lacs
hyderābād
On-site
Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com. This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.

Job Title: SE / Quality Engineer – Junior – Data Testing
Position: Software Engineer
Experience: 3+ Years
Category: Testing / Quality Assurance
Main Location: Hyderabad
Position ID: J0725-0099
Employment Type: Full Time

We are looking for an experienced Data Testing Engineer to join our team. The ideal candidate should be passionate about coding and developing scalable and high-performance applications. You will work closely with our front-end developers, designers, and other members of the team to deliver quality solutions that meet the needs of our clients.
Qualification: Bachelor's degree in Computer Science or a related field, or higher, with a minimum of 3 years of relevant experience.

Your future duties and responsibilities:
- Execute Big Data, ETL, and DWH testing with RDBMS and/or Hadoop reporting systems.
- Write and validate SQL queries for complex transformations (joins, functions, CTEs).
- Apply Python/PySpark data testing approaches and frameworks.
- Execute test solutions for table and file ingestion ETL processes.
- Provide support for real-time streaming application testing (a plus).
- Perform defect lifecycle management and track issues accurately.
- Develop and maintain test artifacts throughout the testing lifecycle.
- Collaborate with Agile teams using JIRA/XRAY or similar project/test management tools.

Must-Have Skills:
- 3+ years of experience in Big Data / ETL / DWH testing with RDBMS and/or Hadoop reporting.
- Strong SQL skills to validate complex transformations (joins, functions, CTEs).
- 1+ years of hands-on experience with Python/PySpark data testing frameworks.
- Experience with defect management and the testing lifecycle.
- Good exposure to Agile testing processes.
- Proficiency with JIRA/XRAY or similar Agile project/test management tools.

Good-to-Have Skills:
- Exposure to real-time streaming applications.
- Experience with automation in data testing.
- Knowledge of cloud-based data platforms (AWS, Azure, GCP).
- Strong problem-solving and analytical skills.
- Ability to work effectively in a cross-functional Agile team.

CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodations for people with disabilities in accordance with provincial legislation. Please let us know if you require a reasonable accommodation due to a disability during any aspect of the recruitment process and we will work with you to address your needs. Together, as owners, let’s turn meaningful insights into action.
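One core task the role describes is validating that an ETL load preserved the data. A minimal sketch of a reconciliation check, with made-up table and column names and SQLite standing in for the real RDBMS, might compare row counts and a column checksum between source and target:

```python
import sqlite3

# Invented "source" and "target" tables standing in for an ETL load.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.5), (3, 5.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.5), (3, 5.0);
""")

def reconcile(table):
    # COUNT(*) and SUM() act as cheap fingerprints of the loaded data.
    return con.execute(f"SELECT COUNT(*), SUM(amount) FROM {table}").fetchone()

src_stats, tgt_stats = reconcile("src"), reconcile("tgt")
match = src_stats == tgt_stats   # True when the load preserved the data
```

Real frameworks would compare per-column hashes and spot-check rows as well; this only shows the shape of the idea.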
Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
Posted 3 days ago
0 years
0 Lacs
andhra pradesh
On-site
Primary Skills:
- Experience in the risk or finance technology area
- Strong SQL knowledge, involving complex joins and analytical functions
- Good understanding of data flow, data models, and database applications
- Working knowledge of databases such as Oracle and Netezza
- Decompose existing Essbase cubes (understand cube structure and logic, document all metadata and business rules, prepare for migration or rebuild on modern platforms)

Secondary Skills:
- Conceptual knowledge of ETL and data warehousing; working knowledge is an added advantage
- Basic knowledge of Java is an added advantage

JD: Seeking professionals with the capability to perform thorough analysis and articulation of risk or finance tech model data requirements, and to identify and understand specific data quality issues to ensure effective delivery of data to users using standard tools. The role provides analysis of internal and external regulations (credit risk) and prepares functional documentation on that basis, and involves working with large amounts of data: facts, figures, and number crunching.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us.
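The posting pairs complex joins with "analytical functions", i.e. SQL window functions. A hedged sketch of the pattern, using an invented exposures table and SQLite in place of Oracle/Netezza:

```python
import sqlite3

# Hypothetical risk exposures table, invented for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE exposures (desk TEXT, counterparty TEXT, exposure REAL);
    INSERT INTO exposures VALUES
        ('rates',  'A', 120.0), ('rates',  'B', 300.0),
        ('credit', 'C', 80.0),  ('credit', 'D', 45.0);
""")

# RANK() OVER a partition is the classic "analytical function" pattern:
# rank counterparties by exposure within each desk.
rows = con.execute("""
    SELECT desk, counterparty, exposure,
           RANK() OVER (PARTITION BY desk ORDER BY exposure DESC) AS rnk
    FROM exposures
    ORDER BY desk, rnk
""").fetchall()
```

Unlike GROUP BY, the window function keeps every row while computing the per-desk ranking alongside it.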
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 3 days ago
12.0 years
0 Lacs
hyderabad, telangana, india
On-site
We are also seeking 3 experienced Report Developers with expertise in Oracle Procure-to-Pay (P2P) reporting.

Experience Level: 8–12 years

Key responsibilities and required skills:
· Creating and maintaining operational and analytical reports
· Supporting procurement, supplier management, invoicing, and payment processes
· Strong hands-on expertise in Oracle P2P modules and reporting tools (OTBI, BI Publisher, Smart View)
· Deep understanding of the procure-to-pay lifecycle: requisition → purchase order → receipt → invoice → payment
· Ability to build compliance and audit-ready reports (e.g., 3-way match reports, vendor compliance reports)
· Strong knowledge of supplier data, invoice matching, and AP workflows
· Familiarity with key integrations between Procurement, Accounts Payable (AP), and General Ledger (GL)
· Proficiency in SQL, joins, and data extraction for reporting
· Experience with KPIs and metrics such as days payable outstanding, procurement cycle time, and supplier aging
· Ability to troubleshoot reporting issues and reconcile discrepancies independently
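The 3-way match report mentioned above compares ordered, received, and invoiced quantities per purchase order. As a toy sketch of the join logic only (table and column names invented; real Oracle P2P reporting works against its own schema via OTBI/BI Publisher, not SQLite):

```python
import sqlite3

# Invented PO / receipt / invoice tables for a toy 3-way match.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE po      (po_id INTEGER, qty INTEGER);
    CREATE TABLE receipt (po_id INTEGER, qty INTEGER);
    CREATE TABLE invoice (po_id INTEGER, qty INTEGER);
    INSERT INTO po      VALUES (1, 10), (2, 5);
    INSERT INTO receipt VALUES (1, 10), (2, 4);
    INSERT INTO invoice VALUES (1, 10), (2, 5);
""")

# Flag POs where ordered, received, and invoiced quantities disagree --
# PO 2 was short-received (4 of 5), so it fails the match.
mismatches = con.execute("""
    SELECT p.po_id
    FROM po p
    JOIN receipt r ON r.po_id = p.po_id
    JOIN invoice i ON i.po_id = p.po_id
    WHERE p.qty != r.qty OR p.qty != i.qty
""").fetchall()
```

An audit-ready version would also match on price and line level; the two-table join per PO is the core idea.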
Posted 3 days ago
3.0 - 5.0 years
0 Lacs
gurugram, haryana, india
On-site
Job Description: Business Analyst
Location: DLF Office, India
Department: Strategy & Business Analysis

TBO is a global platform that aims to simplify all buying and selling travel needs of travel partners across the world. The proprietary technology platform aims to simplify the demands of the complex world of global travel by seamlessly connecting highly distributed travel buyers and travel suppliers at scale. The TBO journey began in 2006 with a simple goal – to address the evolving needs of travel buyers and suppliers. What started off as a single-product air ticketing company has today become the leading B2A (Business to Agents) travel portal across the Americas, UK & Europe, Africa, Middle East, India, and Asia Pacific. Today, TBO’s products span air, hotels, rail, holiday packages, car rentals, transfers, sightseeing, cruise, and cargo. Apart from these products, our proprietary platform relies heavily on AI/ML to offer unique listings and products, meeting specific requirements put forth by customers, thus increasing conversions. TBO’s approach has always been technology-first and we continue to invest in new innovations and offerings to make travel easy and simple. TBO’s travel APIs serve large travel ecosystems across the world while the modular architecture of the platform enables new travel products while expanding across new geographies.

Why TBO:
• You will influence and contribute to building the world’s largest technology-led travel distribution network for a $9 trillion global travel business market.
• We are the emerging leaders in technology-led end-to-end travel management in the B2B space.
• Physical presence in 47 countries with business in 110 countries.
• We are reputed for our long-lasting, trusted relationships. We stand by our ecosystem of suppliers and buyers to service the end customer.
• An open and informal start-up environment which cares.

What TBO offers to a Life Traveler in You:
• Enhance your leadership acumen.
Join the journey to create global scale and ‘World’s Best’.
• Challenge yourself to do something path-breaking. Be empowered. The only thing to stop you will be your imagination.
• Post-pandemic, the travel space is likely to see significant growth. Witness and shape this space. It will be one exciting journey.
• As the fastest-growing B2B platform, our priority is purpose-building scalable systems.
• Adopting industry-leading technologies to support best-in-class business capabilities for high-performing and scalable solutions.
• Fast response to the evolving regulatory environment, helping to meet the firm's regulatory commitments by addressing internal and external commitments.

About the Role: We are looking for a skilled Business Analyst with strong analytical skills and hands-on experience in SQL and MS Excel to join our team. The ideal candidate will bridge the gap between business needs and technical solutions, analyze data to identify trends, support decision-making, and work closely with stakeholders to improve business processes.

Key Responsibilities:
- Gather, analyze, and document business requirements by working closely with stakeholders across functions.
- Conduct data analysis using SQL queries to extract, manipulate, and validate data from relational databases.
- Develop and maintain complex MS Excel reports (including pivot tables, VLOOKUPs, macros, etc.) to track key business metrics and provide actionable insights.
- Work with cross-functional teams including product, operations, and IT to translate business needs into technical specifications.
- Perform gap analysis and suggest improvements in business processes.
- Support ad-hoc data analysis and reporting requests.
- Create dashboards and data visualizations to present findings and support decision-making.
- Ensure data accuracy and integrity by validating data sources and performing data cleansing as needed.
- Communicate findings and recommendations clearly to both technical and non-technical audiences.
Required Skills & Experience:
- 3 to 5 years of experience as a Business Analyst or Data Analyst.
- Strong proficiency in SQL (writing complex queries, joins, subqueries, data aggregation).
- Advanced proficiency in MS Excel (pivot tables, macros, formulas, data visualization).
- Experience in requirement gathering, process mapping, and gap analysis.
- Strong analytical and problem-solving skills.
- Good understanding of data visualization tools (e.g., Power BI, Tableau) is a plus.
- Ability to work independently and manage multiple tasks/priorities.
- Strong communication and stakeholder management skills.
- Basic understanding of databases, data structures, and data modeling.

Educational Qualification: Bachelor’s degree in Business Administration, Computer Science, Information Systems, Statistics, or a related field.
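Since this role leans on both SQL aggregation and Excel pivot tables, it is worth noting that a pivot table's rows/columns/values layout can be emulated in SQL with GROUP BY plus conditional aggregation. A sketch with invented booking data:

```python
import sqlite3

# Invented bookings data; rows = region, columns = product, values = SUM(amount).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE bookings (region TEXT, product TEXT, amount REAL);
    INSERT INTO bookings VALUES
        ('EU', 'air', 100.0), ('EU', 'hotel', 250.0),
        ('IN', 'air', 80.0),  ('EU', 'air', 50.0);
""")

# Each CASE expression carves one "pivot column" out of the product field.
rows = con.execute("""
    SELECT region,
           SUM(CASE WHEN product = 'air'   THEN amount ELSE 0 END) AS air,
           SUM(CASE WHEN product = 'hotel' THEN amount ELSE 0 END) AS hotel
    FROM bookings
    GROUP BY region
    ORDER BY region
""").fetchall()
```

The result has one row per region with a column per product, exactly the shape an Excel pivot would produce.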
Posted 3 days ago
1.0 years
0 Lacs
indore, madhya pradesh, india
On-site
Company Description: E-vitamin is an e-commerce and digital marketing service provider based in Indore. We offer a wide range of services including Social Media Marketing (SMM), Pay-Per-Click (PPC), A+ Content, Cataloguing, Web Design and Development, Digital Marketing, E-commerce Solutions, Content Management, and Advertising projects. Our mission is to provide comprehensive solutions for all your online business requirements.

Job Description: We are seeking an MIS Executive / Analyst with strong expertise in Advanced Excel, Power BI, and SQL. The ideal candidate should be detail-oriented, analytical, and capable of transforming raw data into meaningful business insights to support decision-making.

Key Responsibilities:
- Prepare, maintain, and analyze MIS reports on a regular basis.
- Develop and manage dashboards and data visualizations using Power BI.
- Work extensively with Advanced Excel (VLOOKUP, HLOOKUP, pivot tables, macros, etc.).
- Write, optimize, and execute SQL queries for data extraction and reporting.
- Analyze business performance trends and prepare actionable insights for management.
- Ensure accuracy, consistency, and integrity of data across all reports.
- Collaborate with cross-functional teams to understand reporting requirements.

Required Skills and Experience:
- Advanced Excel (VLOOKUP, HLOOKUP, pivot tables – mandatory).
- Hands-on experience in Power BI (report/dashboard creation).
- Strong working knowledge of SQL (data queries, joins, stored procedures).
- Experience in data analysis, data validation, and visualization.
- Strong analytical and problem-solving skills.
- Good communication and presentation skills.

Qualification: Graduate/Postgraduate in B.Com, BBA, B.Sc, BCA, MBA, MCA, or a related field. 1+ years of proven experience in MIS/Data Analysis/Business Intelligence.

Perks & Benefits: Opportunity to work with a growing digital consulting firm. Exposure to diverse projects and business verticals. Professional growth and learning opportunities.
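A useful bridge between the Excel and SQL skills this role asks for: a VLOOKUP against a reference sheet is, in SQL terms, a LEFT JOIN against a lookup table. A sketch with invented sales/catalog data:

```python
import sqlite3

# Invented sales and catalog (lookup) tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales   (sku TEXT, units INTEGER);
    CREATE TABLE catalog (sku TEXT, category TEXT);
    INSERT INTO sales   VALUES ('A1', 3), ('B2', 7), ('C3', 2);
    INSERT INTO catalog VALUES ('A1', 'Vitamins'), ('B2', 'Skincare');
""")

# LEFT JOIN keeps every sales row; an unmatched SKU gets NULL,
# the way VLOOKUP returns #N/A for a missing key.
rows = con.execute("""
    SELECT s.sku, s.units, c.category
    FROM sales s
    LEFT JOIN catalog c ON c.sku = s.sku
    ORDER BY s.sku
""").fetchall()
```

SKU 'C3' has no catalog entry, so its category comes back as None rather than dropping the row, which is what distinguishes LEFT JOIN from an inner join here.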
Posted 3 days ago
5.0 years
0 Lacs
chennai, tamil nadu, india
On-site
Job Summary: We are seeking a highly skilled and detail-oriented Senior SQL Data Analyst to join our data-driven team. This role will be responsible for leveraging advanced SQL skills to extract, analyze, and interpret complex datasets, delivering actionable insights to support business decisions. You will work closely with cross-functional teams to identify trends, solve problems, and drive data-informed strategies across the organization.

Key Responsibilities:
- Develop, write, and optimize advanced SQL queries to retrieve and analyze data from multiple sources.
- Design and maintain complex data models, dashboards, and reports.
- Collaborate with stakeholders to understand business needs and translate them into analytical requirements.
- Conduct deep-dive analysis to identify key business trends and opportunities for growth or improvement.
- Ensure data integrity and accuracy across systems and reporting tools.
- Automate recurring reports and develop scalable data pipelines.
- Present findings in a clear, compelling way to both technical and non-technical audiences.

Qualifications
Required:
- Bachelor's degree in Computer Science, Information Systems, Mathematics, Statistics, or a related field.
- 5+ years of experience in data analysis or a similar role with a strong focus on SQL.
- Expert proficiency in SQL (window functions, joins, CTEs, indexing, etc.).
- Strong understanding of data warehousing concepts and relational database systems (e.g., PostgreSQL, SQL Server, Snowflake, Redshift).
- Experience with BI tools like Tableau, Power BI, or Looker.
- Excellent analytical, problem-solving, and communication skills.

Preferred:
- Experience with scripting languages (Python, R) for data manipulation.
- Familiarity with cloud data platforms (AWS, Azure).
- Knowledge of ETL tools and best practices.
- Previous experience in a fast-paced, agile environment.
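Two of the SQL features this role names, CTEs and window functions, combine naturally in trend analysis. A minimal sketch (monthly figures invented, SQLite standing in for the warehouse) that computes month-over-month deltas with a CTE plus LAG:

```python
import sqlite3

# Invented monthly revenue figures.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE revenue (month TEXT, amount REAL);
    INSERT INTO revenue VALUES ('2024-01', 100.0), ('2024-02', 120.0), ('2024-03', 90.0);
""")

# The CTE attaches each month's previous value via LAG;
# the outer query derives the month-over-month delta.
rows = con.execute("""
    WITH with_prev AS (
        SELECT month, amount,
               LAG(amount) OVER (ORDER BY month) AS prev
        FROM revenue
    )
    SELECT month, amount - prev AS delta
    FROM with_prev
    WHERE prev IS NOT NULL
""").fetchall()
```

The first month has no predecessor, so LAG yields NULL there and the WHERE clause drops it from the deltas.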
Posted 3 days ago
1.0 - 3.0 years
0 Lacs
india
Remote
Experience Required: Fresher to 1-3 years
Location: Remote
Employment Type: Full-time

Key Responsibilities
· Develop, maintain, and optimize web applications using the Django framework.
· Build and maintain RESTful APIs for internal and external use.
· Work with SQL (e.g., PostgreSQL/MySQL) and NoSQL (e.g., MongoDB) databases at a basic level.
· Contribute to the design and implementation of microservices-based architectures.
· Collaborate with senior developers, QA engineers, and product managers to deliver high-quality features.
· Write clean, maintainable, and efficient code following best practices.
· Participate in code reviews, testing, and debugging to ensure application reliability.

Required Skills & Qualifications
· Solid foundation in Python and the Django framework.
· Understanding of REST API development and JSON data handling.
· Basic knowledge of SQL (queries, joins, schema design).
· Familiarity with NoSQL databases (MongoDB, Redis, or similar).
· Exposure to or understanding of microservices architecture.
· Familiarity with version control systems (Git).
· Problem-solving attitude with eagerness to learn and grow.

Nice-to-Have (Optional)
· Experience with Docker or containerized environments.
· Basic knowledge of cloud platforms (AWS, GCP, Azure).
· Familiarity with CI/CD pipelines.

What We Offer:
· Opportunity to work on high-impact, meaningful projects.
· A fast-paced, collaborative environment with a strong technical team.
· Learning and growth opportunities with emerging tools and technologies.
· Flexible work arrangements and competitive compensation.

To Apply: Send your updated resume, GitHub/portfolio links (if any), and a short note about your experience to careers@codesis.io with the subject line Application – Junior Django or apply directly through our LinkedIn job posting.
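The REST API and JSON-handling skills above boil down to request validation, status codes, and serialization. This framework-free sketch shows that core logic only; the endpoint names, fields, and in-memory "database" are invented, and a real Django view would wrap equivalent logic behind its ORM and URL routing:

```python
import json

# Invented in-memory store standing in for a database table.
TASKS = {1: {"id": 1, "title": "Write tests", "done": False}}

def get_task(task_id):
    """Return (status_code, JSON body), like a simplified GET /tasks/<id>."""
    task = TASKS.get(task_id)
    if task is None:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(task)

def create_task(body):
    """Validate and store a task from a JSON payload, like POST /tasks."""
    data = json.loads(body)
    if "title" not in data:
        return 400, json.dumps({"error": "title is required"})
    new_id = max(TASKS) + 1
    task = {"id": new_id, "title": data["title"], "done": False}
    TASKS[new_id] = task
    return 201, json.dumps(task)
```

The status-code conventions (200/201/400/404) and the validate-then-serialize flow carry over directly to Django REST views.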
Posted 3 days ago
0.0 - 5.0 years
0 Lacs
maharashtra
On-site
Job Information
Job Opening ID: OTSI_2284_JOB
Date Opened: 09/12/2025
Industry: IT Services
Job Type: Full time
Required Skills: .Net, .net core +3
City: NA
State/Province: Maharashtra
Country: India
Zip/Postal Code: 400071

About Us: OTSI is a leading global technology company offering solutions, consulting, and managed services for businesses worldwide since 1999. OTSI serves clients from its 15 offices across 6 countries around the globe with a “Follow-the-Sun” model. Headquartered in Overland Park, Kansas, we have a strong presence in North America, Central America, and Asia-Pacific with a Global Delivery Center based in India. These strategic locations offer our customers the competitive advantages of onshore, nearshore, and offshore engagement and delivery options, with 24/7 support. OTSI works with 100+ enterprise customers, many of which are Fortune-ranked. OTSI focuses on industry segments such as Banking, Financial Services & Insurance, Healthcare & Life Sciences, Energy & Utilities, Communications & Media Entertainment, Engineering & Telecom, Retail & Consumer Services, Hi-tech, Manufacturing, Engineering, Transport & Logistics, Government, Defense & PSUs.

Our focused technologies are:
- Data & Analytics (Traditional EDW, BI, Big Data, Data Engineering, Data Management, Data Modernization, Data Insights)
- Digital Transformation (Cloud Computing, Mobility, Microservices, RPA, DevOps)
- QA & Automation (Manual Testing, Non-functional Testing, Test Automation, Digital Testing)
- Enterprise Applications (SAP, Java Full Stack, Microsoft, Custom Development)
- Disruptive Technologies (Edge Computing/IoT, Blockchain, AR/VR, Biometrics)

Job Description: Object Technology Solutions, Inc (OTSI) has an immediate opening for a .Net Full-Stack Developer.

Job Location: Mumbai, Chembur

MAJOR RESPONSIBILITIES:
- Full-stack development with practical experience in both backend and frontend work.
- Hands-on experience in the .NET MVC framework and ASP.NET.
- Skilled in scripting languages such as JavaScript and jQuery.
- Experienced in writing complex SQL queries, including joins, stored procedures, triggers, functions, and cursors.
- Knowledgeable in HTML, CSS, and Bootstrap for responsive web design.
- Familiar with developing RESTful Web APIs and performing CRUD operations.
- Additional expertise in Power BI and SSRS report generation.

The resources placed at the respective work locations should be punctual and regular in attending the office. Development requirements would vary during different phases; hence, exact requirements would vary from time to time. The developers will understand the functional requirements and undertake application development as per specifications given by the project leader. The developers will carry out coding on the identified platform, carry out unit testing, and interact with team members to implement and roll out the solution. They will adhere to the standards laid down for development, inline documentation, testing, etc. Create and maintain proper technical documentation of all developments. Knowledge transfer to the in-house development team along with documentation. The source code developed by the developers will be the property of . Should be available on Sundays/holidays as per requirement on a need basis.

QUALIFICATIONS AND EXPERIENCE: B.E. / B.Tech. / M.Sc. in Computer Science, Computer Engineering, or Information Technology, or MCA, with minimum passing marks of 60%. 5 years of relevant experience in .Net.
Posted 3 days ago
3.0 years
0 Lacs
hyderabad, telangana, india
On-site
Position Description
Company Profile: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com. This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.

Job Title: SE / Quality Engineer – Junior – Data Testing
Position: Software Engineer
Experience: 3+ Years
Category: Testing / Quality Assurance
Main Location: Hyderabad
Position ID: J0725-0099
Employment Type: Full Time

We are looking for an experienced Data Testing Engineer to join our team. The ideal candidate should be passionate about coding and developing scalable and high-performance applications.
You will work closely with our front-end developers, designers, and other members of the team to deliver quality solutions that meet the needs of our clients.

Qualification: Bachelor's degree in Computer Science or a related field, or higher, with a minimum of 3 years of relevant experience.

Your future duties and responsibilities:
- Execute Big Data, ETL, and DWH testing with RDBMS and/or Hadoop reporting systems.
- Write and validate SQL queries for complex transformations (joins, functions, CTEs).
- Apply Python/PySpark data testing approaches and frameworks.
- Execute test solutions for table and file ingestion ETL processes.
- Provide support for real-time streaming application testing (a plus).
- Perform defect lifecycle management and track issues accurately.
- Develop and maintain test artifacts throughout the testing lifecycle.
- Collaborate with Agile teams using JIRA/XRAY or similar project/test management tools.

Must-Have Skills:
- 3+ years of experience in Big Data / ETL / DWH testing with RDBMS and/or Hadoop reporting.
- Strong SQL skills to validate complex transformations (joins, functions, CTEs).
- 1+ years of hands-on experience with Python/PySpark data testing frameworks.
- Experience with defect management and the testing lifecycle.
- Good exposure to Agile testing processes.
- Proficiency with JIRA/XRAY or similar Agile project/test management tools.

Good-to-Have Skills:
- Exposure to real-time streaming applications.
- Experience with automation in data testing.
- Knowledge of cloud-based data platforms (AWS, Azure, GCP).
- Strong problem-solving and analytical skills.
- Ability to work effectively in a cross-functional Agile team.

CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodations for people with disabilities in accordance with provincial legislation.
Please let us know if you require a reasonable accommodation due to a disability during any aspect of the recruitment process and we will work with you to address your needs. Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
Posted 3 days ago
8.0 years
0 Lacs
noida, uttar pradesh, india
On-site
Description
Data engineer with 8+ years of hands-on experience working on Big Data platforms. Experience building and optimizing big data pipelines and data sets, from data ingestion through processing to data visualization. Good experience in writing and optimizing Spark jobs, Spark SQL, etc. Should have worked on both batch and streaming data processing. Good experience in at least one programming language (Scala or Python); Python preferred. Experience in writing and optimizing complex Hive and SQL queries to process huge data volumes; good with UDFs, tables, joins, views, etc. Experience in using Kafka or another message broker. Configuring, monitoring, and scheduling jobs using Oozie and/or Airflow. Processing streaming data directly from Kafka using Spark jobs; experience with Spark Streaming is a must. Should be able to handle different file formats (ORC, Avro, and Parquet) as well as unstructured data. Should have experience with at least one NoSQL database or object store, such as Amazon S3. Should have worked with a data warehouse tool such as AWS Redshift, Snowflake, or BigQuery. Work experience on at least one cloud: AWS, GCP, or Azure.
Requirements
Data engineer with 8+ years of hands-on experience (Principal Engineer I) working on Big Data platforms, with the same core skill set as in the description above: Spark job and Spark SQL optimization; batch and streaming data processing; Scala or Python (Python preferred); complex Hive/SQL queries with UDFs, tables, joins, and views; Kafka or another message broker; job configuration, monitoring, and scheduling with Oozie and/or Airflow; Spark Streaming directly from Kafka; handling of ORC, Avro, Parquet, and unstructured data; at least one NoSQL database or object store such as Amazon S3; a data warehouse tool such as AWS Redshift, Snowflake, or BigQuery; and work experience on AWS, GCP, or Azure.
Good-to-have skills: experience with AWS services such as EMR, S3, Redshift, and EKS/ECS; GCP services such as Dataproc and Google Cloud Storage; very large big data clusters with millions of records; the ELK stack, especially Elasticsearch; and Iceberg, Hadoop MapReduce, Apache Flink, Kubernetes, etc.
Job responsibilities
Apply the skills listed above day to day: build and optimize big data pipelines from ingestion through visualization, write and tune Spark jobs and complex Hive/SQL queries, process both batch and streaming data (including Spark Streaming directly from Kafka), schedule and monitor jobs with Oozie and/or Airflow, and work across file formats, NoSQL/object storage, data warehouse tools, and at least one major cloud.
What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally. Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. 
Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
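The SQL skills the listing above names (tables, joins, views, UDFs, aggregates) can be illustrated with a minimal, runnable sketch; Python's built-in sqlite3 stands in for Hive/Spark SQL here, and all table names, column names, and the conversion rate are illustrative, not part of the posting.

```python
import sqlite3

# In-memory database standing in for a Hive/Spark SQL warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10, 120.0), (2, 10, 80.0), (3, 20, 40.0);
    INSERT INTO customers VALUES (10, 'APAC'), (20, 'EMEA');
    -- A view joining the two tables, as a reporting layer might.
    CREATE VIEW order_regions AS
        SELECT o.order_id, o.amount, c.region
        FROM orders o JOIN customers c ON o.customer_id = c.customer_id;
""")

# A scalar UDF, analogous to a Hive/Spark UDF (illustrative fixed rate).
conn.create_function("usd_to_inr", 1, lambda usd: round(usd * 83.0, 2))

# Join + aggregate: total converted amount per region.
rows = conn.execute("""
    SELECT region, usd_to_inr(SUM(amount)) AS total_inr
    FROM order_regions
    GROUP BY region
    ORDER BY region
""").fetchall()
print(rows)  # [('APAC', 16600.0), ('EMEA', 3320.0)]
```

The same join/UDF/aggregate pattern carries over to Spark SQL or Hive, where the UDF would be registered with the session instead of the connection.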
Posted 3 days ago
4.0 - 6.0 years
0 Lacs
noida, uttar pradesh, india
On-site
Description
Data engineer with 4 to 6 years of hands-on experience working on Big Data platforms. Experience building and optimizing big data pipelines and data sets, from data ingestion through processing to data visualization. Good experience in writing and optimizing Spark jobs, Spark SQL, etc. Should have worked on both batch and streaming data processing. Good experience in at least one programming language (Scala or Python); Python preferred. Experience in writing and optimizing complex Hive and SQL queries to process huge data volumes; good with UDFs, tables, joins, views, etc. Experience in using Kafka or another message broker. Configuring, monitoring, and scheduling jobs using Oozie and/or Airflow. Processing streaming data directly from Kafka using Spark jobs; experience with Spark Streaming is a must. Should be able to handle different file formats (ORC, Avro, and Parquet) as well as unstructured data. Should have experience with at least one NoSQL database or object store, such as Amazon S3. Should have worked with a data warehouse tool such as AWS Redshift, Snowflake, or BigQuery. Work experience on at least one cloud: AWS, GCP, or Azure.
Requirements
Data engineer with 4 to 6 years of hands-on experience working on Big Data platforms, with the same core skill set as in the description above: Spark job and Spark SQL optimization; batch and streaming data processing; Scala or Python (Python preferred); complex Hive/SQL queries with UDFs, tables, joins, and views; Kafka or another message broker; job configuration, monitoring, and scheduling with Oozie and/or Airflow; Spark Streaming directly from Kafka; handling of ORC, Avro, Parquet, and unstructured data; at least one NoSQL database or object store such as Amazon S3; a data warehouse tool such as AWS Redshift, Snowflake, or BigQuery; and work experience on AWS, GCP, or Azure.
Good-to-have skills: experience with AWS services such as EMR, S3, Redshift, and EKS/ECS; GCP services such as Dataproc and Google Cloud Storage; very large big data clusters with millions of records; the ELK stack, especially Elasticsearch; and Hadoop MapReduce, Apache Flink, Kubernetes, etc.
Job responsibilities
Apply the skills listed above day to day: build and optimize big data pipelines from ingestion through visualization, write and tune Spark jobs and complex Hive/SQL queries, process both batch and streaming data (including Spark Streaming directly from Kafka), schedule and monitor jobs with Oozie and/or Airflow, and work across file formats, NoSQL/object storage, data warehouse tools, and at least one major cloud.
What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally. Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. 
Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
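The batch vs. streaming distinction the listing above draws can be illustrated without a Spark cluster; this is a conceptual sketch in plain Python, where the incremental path loosely mirrors how a streaming engine such as Spark Structured Streaming maintains running state per event or micro-batch (all names are illustrative).

```python
from collections import defaultdict

events = [("sensor_a", 3), ("sensor_b", 5), ("sensor_a", 4)]

# Batch: all data is available up front; aggregate in one pass.
def batch_sum(records):
    totals = defaultdict(int)
    for key, value in records:
        totals[key] += value
    return dict(totals)

# Streaming-style: state is updated one event at a time, and a result
# snapshot is available after every update rather than only at the end.
class RunningSum:
    def __init__(self):
        self.state = defaultdict(int)

    def update(self, key, value):
        self.state[key] += value
        return dict(self.state)  # snapshot after this event

stream = RunningSum()
for key, value in events:
    snapshot = stream.update(key, value)

# Once the stream is drained, both approaches converge to the same totals.
print(batch_sum(events) == snapshot)  # True
```

In a real pipeline the event list would be a Kafka topic and the running state would live in the engine's checkpointed state store, but the shape of the computation is the same.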
Posted 3 days ago
5.0 years
0 Lacs
bengaluru, karnataka, india
On-site
At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the process of continuous improvement and optimising of the managed services process, tools and services. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. 
refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. Data – Associate (2–5 Years) Our Analytics & Insights Managed Services team brings a unique combination of industry expertise, technology, data management and managed‑services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while helping you build your skills in exciting new directions. As an Associate, you'll have a voice at the table, helping design, build, and operate the next generation of data and analytics solutions. Basic Qualifications Job Requirements and Preferences Minimum Degree Required: Bachelor’s Degree in Engineering, Statistics, Mathematics, Computer Science, Data Science, Economics, or a related quantitative field Minimum Years of Experience: 2–5 years of professional experience in analytics, data science, or business intelligence roles Preferred Qualifications Degree Preferred: Master’s Degree in Engineering, Statistics, Data Science, Business Analytics, Economics, or related discipline Preferred Fields of Study: Data Analytics/Science, Statistics, Management Information Systems, Economics, Computer Science Preferred Knowledge & Skills As an Associate, you’ll design, develop, and support BI reporting solutions using SSRS and Power BI, ensuring data-driven insights are accurate, timely, and aligned with business needs. 
You will work under the guidance of senior team members, collaborating with analysts, data engineers, and business stakeholders: SSRS Report Development – Develop, enhance, and maintain paginated reports and dashboards using SQL Server Reporting Services (SSRS) – Write efficient SQL queries, stored procedures, and datasets to power reports – Apply best practices in report design, parameterization, and subscription scheduling – Assist in troubleshooting performance issues related to report rendering and query execution Power BI Dashboard Development – Build interactive dashboards and self-service BI solutions in Power BI – Develop DAX measures, calculated columns, and data models to support analytics needs – Collaborate with business teams to translate KPIs into visually impactful dashboards – Assist in publishing, managing workspaces, and configuring row-level security SQL Querying & Data Preparation – Write and optimize SQL queries to extract, transform, and validate data for reporting – Perform joins, aggregations, window functions, and filtering to prepare datasets – Ensure data consistency and accuracy between SSRS and Power BI outputs Data Modeling & Integration – Support dimensional modeling (star/snowflake schema) for reporting solutions – Assist in integrating multiple data sources (SQL Server, Excel, flat files, cloud sources) into Power BI models – Collaborate with ETL teams to streamline reporting datasets Data Quality & Validation Support – Perform validations of source-to-report data, ensuring accuracy and consistency – Document business rules, metrics definitions, and report logic for traceability Cloud & Hybrid Platform Exposure – Gain experience with Power BI Service (cloud deployment, gateways, scheduled refresh) – Assist in configuring SSRS/Power BI solutions in hybrid cloud + on-prem environments Collaboration & Agile Delivery – Participate in agile ceremonies (daily scrums, sprint planning, reviews) – Provide timely updates, raise blockers, and 
work collaboratively with BI developers, analysts, and data engineers Documentation & Continuous Learning – Maintain technical documentation for reports, dashboards, and queries – Proactively upskill in advanced Power BI (composite models, performance tuning, Power Query) and Azure data services Soft Skills & Professional Attributes – Strong problem-solving and analytical mindset – Ability to communicate insights clearly to non-technical stakeholders – Team-oriented, detail-focused, and proactive in delivering high-quality reporting solutions
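The dataset-preparation skills listed above (joins, aggregations, window functions) can be sketched with Python's built-in sqlite3; window functions require SQLite 3.25+, and the sales table, columns, and values are illustrative rather than from the posting.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (month TEXT, region TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('2024-01', 'North', 100), ('2024-02', 'North', 150),
        ('2024-01', 'South', 80),  ('2024-02', 'South', 60);
""")

# Window function: running total per region, the kind of derived column
# often pre-computed in a dataset feeding an SSRS report or Power BI model.
rows = conn.execute("""
    SELECT month, region, revenue,
           SUM(revenue) OVER (
               PARTITION BY region ORDER BY month
           ) AS running_revenue
    FROM sales
    ORDER BY region, month
""").fetchall()
for row in rows:
    print(row)
```

The same PARTITION BY / ORDER BY pattern applies in SQL Server when preparing SSRS datasets; in Power BI the equivalent running total would typically be a DAX measure instead.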
Posted 3 days ago
4.0 years
0 Lacs
noida, uttar pradesh, india
On-site
Key Responsibilities Basic knowledge of Linux/Unix servers to ensure high availability and performance. Write and optimize joins, stored procedures, and triggers in relational databases. Work with monitoring and logging tools like Splunk, ELK, and AppDynamics for proactive issue resolution. Basic knowledge of application deployments and troubleshooting on servers like Tomcat, JBoss, and WebLogic. Automate recurring operational tasks using shell scripting and Python. Implement and manage CI/CD pipelines with tools such as Jenkins and GitLab. Troubleshoot SSL certificate, firewall, and proxy issues. Manage incidents, problem tickets, and change requests efficiently. Collaborate with cross-functional teams to understand business workflows and provide seamless support. Create and maintain detailed documentation for operational and troubleshooting processes. Perform API troubleshooting and provide basic support for Docker-based deployments. Effectively manage customer expectations while ensuring timely resolution of issues. Required Skills & Qualifications 2–4 years of experience in Application Support or System Administration. Basic knowledge of Linux/Unix administration. Solid database skills with experience in joins, procedures, and triggers. Familiarity with Splunk, ELK, AppDynamics, or similar monitoring tools. Basic knowledge of application servers like Tomcat, JBoss, and WebLogic. Basic scripting knowledge (shell, Python) for automation. Experience with CI/CD tools such as Jenkins and GitLab. Good understanding of network components (SSL, firewalls, proxies). Strong troubleshooting, incident management, and documentation skills. Basic understanding of Docker and containerized environments. Excellent communication skills and ability to work directly with customers. What We Offer Opportunity to work on cutting-edge technologies. A collaborative and growth-oriented environment. Exposure to end-to-end business workflows and critical production systems. 
Competitive salary and benefits package. Skills: troubleshooting, ELK, CI/CD, AppDynamics, Splunk, triggers
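As a minimal sketch of two of the database skills this posting names (relational joins and triggers): the ticket/audit schema below is invented for illustration, and the example uses SQLite via Python's standard library rather than a production RDBMS.

```python
import sqlite3

# Invented example schema: support tickets plus an audit trail table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE tickets (id INTEGER PRIMARY KEY, app TEXT, status TEXT);
CREATE TABLE audit_log (ticket_id INTEGER, old_status TEXT, new_status TEXT);

-- Trigger: record every status change, as an incident-management audit trail.
CREATE TRIGGER trg_ticket_status AFTER UPDATE OF status ON tickets
BEGIN
    INSERT INTO audit_log VALUES (OLD.id, OLD.status, NEW.status);
END;
""")
cur.execute("INSERT INTO tickets VALUES (1, 'Tomcat', 'open')")
cur.execute("UPDATE tickets SET status = 'resolved' WHERE id = 1")

# Join each ticket back to its audit history.
cur.execute("""
SELECT t.app, a.old_status, a.new_status
FROM tickets t JOIN audit_log a ON a.ticket_id = t.id
""")
print(cur.fetchall())  # [('Tomcat', 'open', 'resolved')]
```

The same trigger-plus-join pattern carries over to MySQL or PostgreSQL with minor syntax changes.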
Posted 3 days ago
5.0 years
0 Lacs
hyderābād
On-site
Job Summary - India, Mumbai MATRIX-IFS is a dynamic, fast-growing technical consultancy specializing in AML/fraud detection, trading surveillance consulting, and systems integration for Banks, Brokerage, and Financial Services companies. We are a team of talented, creative, and dedicated individuals passionate about delivering innovative solutions to the market. We source and foster the best talent, recognizing that every employee's contributions are integral to our company's future. Our success is built on a challenging work environment, competitive compensation and benefits, and rewarding career opportunities. We encourage a work environment of sharing, learning, and succeeding together. Position Description: We are seeking an experienced Actimize Developer (SAM/CDD) to join our team. Key Responsibilities: Implement Actimize SAM 8.x or 9.x solutions. Develop custom models using stored procedures, rule cloning, policy manager, and query rules. Install and configure Actimize Case Management System. Install and configure Actimize AIS/RCM components along with SAM 8.x or 9.x solutions. Understand and implement AML transaction monitoring solutions and rules/scenarios. Write complex SQL queries, joins, and stored procedures. Required Skills and Experience: 5+ years of experience in implementing Actimize SAM 8.x or 9.x solutions. Experience in developing custom models using stored procedures, rule cloning, policy manager, and query rules. Hands-on experience with installation and configuration of Actimize Case Management System. Hands-on experience with installation and configuration of Actimize AIS/RCM components along with SAM 8.x or 9.x solutions. Good understanding of AML transaction monitoring solutions and rules/scenarios. Proficiency in writing complex SQL queries, joins, and stored procedures. Excellent presentation and communication skills. Benefits: Competitive base salary. 
Comprehensive benefits package, including medical, dental, 401K, STD, HSA, PTO, and more. Join Us: Come and join a winning team! You'll be challenged, have fun, and be part of a highly respected organization. Matrix is a global, dynamic, fast-growing technical consultancy and leading technology services company with 13,000 employees worldwide. Since its foundation in 2001, Matrix has made numerous mergers and acquisitions and has executed some of the largest and most significant projects in its markets. The company specializes in implementing and developing leading technologies, software solutions, and products. It provides its customers with infrastructure and consulting services, IT outsourcing, offshore, training and assimilation, and serves as a representative for the world's leading software vendors. With vast experience in private and public sectors, ranging from Finance, Telecom, Health, Hi-Tech, Education, Defense, and Security, Matrix's customer base includes leading organizations in Israel and a steadily growing client base worldwide. We are comprised of talented, creative, and dedicated individuals passionate about delivering innovative solutions to the market. We source and foster the best talent and recognize that every employee's contributions are integral to our company's future. Matrix's success is based on a challenging work environment, competitive compensation and benefits, and rewarding career opportunities. We encourage a diverse work environment of sharing, learning, and succeeding together. Come and join the winning team! You'll be challenged and have fun in a highly respected organization. To Learn More, Visit: www.matrix-ifs.com EQUAL OPPORTUNITY EMPLOYER: Matrix is an Equal Opportunity Employer and Prohibits Discrimination and Harassment of Any Kind. Matrix is committed to the principle of equal employment opportunity for all employees, providing employees with a work environment free of discrimination and harassment. 
All employment decisions at Matrix are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion or belief, family or parental status, or any other status protected by the laws or regulations in our locations. Matrix will not tolerate discrimination or harassment based on any of these characteristics. Matrix encourages applicants of all ages.
Posted 4 days ago
8.0 years
2 - 8 Lacs
hyderābād
On-site
Job Description: Role: Senior Specialist Cybersecurity - Control Testing About the Company: Join AT&T and reimagine the communications and technologies that connect the world. Our Chief Security Office safeguards our assets through truthful transparency, enforced accountability, and cybersecurity mastery to stay ahead of threats. Bring your bold ideas and fearless risk-taking to redefine connectivity and transform how the world shares stories and experiences that matter. When you step into a career with AT&T, you won’t just imagine the future; you’ll create it. About the Job: The Control Testing & Reporting (CTR) team is part of the Chief Security Office (CSO) and is responsible for testing information technology and information security controls owned and operated by AT&T Technology Services (ATS), which includes CSO. This Senior Specialist joins the CTR team to work and collaborate with our control owners, control operators, and technology leadership to identify gaps in the design and operating effectiveness of AT&T’s controls. The professional in this role will independently test IT General Controls (ITGC), Cloud security, Critical application security, and other information technology and information security controls necessary for regulatory compliance (e.g., SOX, PCI DSS). Experience Level: 8+ years. Location: Bengaluru and Hyderabad Responsibilities Include: Working with CTR team leadership to understand the need for control testing, and supporting them with prioritization, planning, and annual test plan preparation activities. Independently executing complex engagements assigned from the annual testing plan or other discrete engagements (test script preparation, walkthroughs, issue identification, obtaining stakeholder agreement, documentation, and reporting to senior ATS stakeholders) within the expected timelines and quality parameters, while working closely with external auditors and other internal stakeholders where necessary, for better efficiency. 
Providing analysis of complex information technology and security issues and providing clear articulation of risk to AT&T assets (devices, networks, applications & data) and customers. Also supporting the Reporting team’s periodic articulation of risk to ATS’s objectives using test results and open issues. Mentoring and supporting junior team members with advice and training. Supporting the development of a Control Testing Methodology and other key components of the Technology Risk Management Framework (TRMF) and tooling that are related to or impact control testing. Required skills: Minimum 8 years’ experience in Technology Risk Management, Consulting, or Assurance, with at least 5 of those years in design or testing of controls in the areas of information technology and information security (SOX / ITGC / Critical application security / Cloud security). Strong understanding of regulatory requirements like SOX, PCI DSS, etc. Strong documentation and effective articulation skills. Desirable skills: Bachelor's degree in Computer Science, Mathematics, Information Systems, Engineering, or Cyber Security. Prior experience in the Telecom sector. ISACA, ISC2, or other relevant certifications. Additional information (if any): Need to be flexible to provide coverage in US morning hours. Weekly Hours: 40 Time Type: Regular Location: Bangalore, India It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
Posted 4 days ago
3.0 years
6 - 12 Lacs
hyderābād
On-site
We are seeking an experienced and passionate Data Analytics Trainer to join our team. The ideal candidate will have hands-on expertise in Power BI, SQL, Excel, and Python (basics), and a passion for teaching and mentoring students or professionals. You will be responsible for delivering high-quality training sessions, designing learning content, and helping learners build practical skills for real-world data analytics roles. Key Responsibilities: Deliver interactive and engaging classroom or online training sessions on: Power BI – dashboards, data modeling, DAX, visualization best practices. SQL – querying databases, joins, aggregations, subqueries. Excel – formulas, pivot tables, data cleaning, and analysis. Create and update training content, exercises, quizzes, and projects. Guide learners through hands-on assignments and real-time case studies. Provide feedback and mentorship to help learners improve their technical and analytical skills. Track and report learners' progress and performance. Stay updated with the latest tools, trends, and best practices in data analytics. Required Skills & Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Analytics, Statistics, or a related field. 3+ years of hands-on experience in data analysis and visualization. Proven training experience or a passion for teaching. Strong command of: Power BI (certification is a plus), SQL (any RDBMS like MySQL, SQL Server, or PostgreSQL), Microsoft Excel (advanced level). Excellent communication and presentation skills. Patience, empathy, and a learner-focused mindset. Job Types: Full-time, Permanent Pay: ₹50,000.00 - ₹100,000.00 per month Benefits: Health insurance Provident Fund Ability to commute/relocate: Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Required) Application Question(s): Where are you from? What is your CTC and ECTC? Work Location: In person
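As a small sketch of the SQL training topics this posting lists (aggregations and subqueries): the sales data below is invented for the example, and SQLite via Python's standard library stands in for whichever RDBMS a classroom uses.

```python
import sqlite3

# Invented teaching dataset: a tiny sales table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, amount REAL);
INSERT INTO sales VALUES ('North', 100), ('North', 250), ('South', 80);
""")

# Aggregate per region with GROUP BY, then keep only regions whose total
# exceeds the overall average, computed by a scalar subquery in HAVING.
rows = conn.execute("""
SELECT region, SUM(amount) AS total
FROM sales
GROUP BY region
HAVING SUM(amount) > (SELECT AVG(amount) FROM sales)
""").fetchall()
print(rows)  # [('North', 350.0)]
```

The same query runs unchanged on MySQL, SQL Server, or PostgreSQL, which makes it a convenient classroom exercise.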
Posted 4 days ago
4.0 - 6.0 years
5 - 25 Lacs
gurgaon
On-site
Job Description: Job Title: Senior Tableau Developer Location : Gurgaon Experience : 4–6 Years Salary : Negotiable Job Summary: We need a Senior Tableau Developer with a minimum of 4 years of experience to join our BI team. The ideal candidate will be responsible for designing, developing, and deploying business intelligence solutions using Tableau. Key Responsibilities: · Design and develop interactive and insightful Tableau dashboards and visualizations. · Optimize dashboards for performance and usability. · Work with SQL and data warehouses (Snowflake) to fetch and prepare clean data sets. · Gather and analyse business requirements, and translate them into functional and technical specifications. · Collaborate with cross-functional teams to understand business KPIs and reporting needs. · Apply a strong understanding of data visualization principles and best practices. · Conduct unit testing and resolve data or performance issues. Tech. Skills Required: · Proficient in Tableau Desktop (dashboard development, storyboards) · Strong command of SQL (joins, subqueries, CTEs, aggregation) · Experience with large data sets and complex queries · Experience working on any data warehouse (Snowflake, Redshift) · Excellent analytical and problem-solving skills. Mail your updated resume with current salary - Email: jobs@glansolutions.com Satish: 8802749743 Website: www.glansolutions.com Key Skills: Tableau Developer, Snowflake, SQL, Tableau Desktop, Retail domain, Data warehouse Job Type: Full-time Pay: ₹592,689.63 - ₹2,519,305.62 per year Benefits: Paid sick time Paid time off Provident Fund Application Question(s): Candidate's current CTC? Candidate's expected CTC? Candidate's notice period? Experience: Tableau Developer: 4 years (Required) Location: Gurugram, Haryana (Required) Work Location: In person
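As a sketch of one skill the posting names, CTEs (common table expressions), of the kind a Tableau custom-SQL data source might use: the orders table and the 300 threshold are invented for illustration, with SQLite from Python's standard library standing in for Snowflake or Redshift.

```python
import sqlite3

# Invented example data: customer orders to be pre-aggregated for a dashboard.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
INSERT INTO orders VALUES (1, 'A', 500), (2, 'A', 700), (3, 'B', 200);
""")

# A CTE names the aggregation step, so the outer query reads like a sentence:
# "from customer totals, keep those above 300".
rows = conn.execute("""
WITH customer_totals AS (
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
)
SELECT customer, total FROM customer_totals WHERE total > 300
""").fetchall()
print(rows)  # [('A', 1200.0)]
```

Pushing aggregation into the warehouse query like this, rather than into Tableau calculated fields, is a common way to keep dashboards responsive on large data sets.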
Posted 4 days ago
4.0 years
5 - 25 Lacs
gurgaon
On-site
Position : Tableau Developer Experience : 4+ years Location : Gurgaon (onsite) Salary : Negotiable We need a Senior Tableau Developer with a minimum of 4 years of experience to join our BI team. The ideal candidate will be responsible for designing, developing, and deploying business intelligence solutions using Tableau. Key Responsibilities: · Design and develop interactive and insightful Tableau dashboards and visualizations. · Optimize dashboards for performance and usability. · Work with SQL and data warehouses (Snowflake) to fetch and prepare clean data sets. · Gather and analyse business requirements, and translate them into functional and technical specifications. · Collaborate with cross-functional teams to understand business KPIs and reporting needs. · Apply a strong understanding of data visualization principles and best practices. · Conduct unit testing and resolve data or performance issues. Tech. Skills Required: · Proficient in Tableau Desktop (dashboard development, storyboards) · Strong command of SQL (joins, subqueries, CTEs, aggregation) · Experience with large data sets and complex queries · Experience working on any data warehouse (Snowflake, Redshift) · Excellent analytical and problem-solving skills. Please share your resume with the below details - current CTC - expected CTC - notice period - current location - total experience - relevant experience in Tableau - Email: etalenthire@gmail.com Satish: 8802749743 Job Type: Full-time Pay: ₹592,689.63 - ₹2,519,305.62 per year Ability to commute/relocate: Gurgaon, Haryana: Reliably commute or planning to relocate before starting work (Preferred) Application Question(s): Current CTC? Expected CTC? Notice period (in days; if serving, LWD)? Current location? Would you be comfortable with an onsite job (Gurgaon)? Experience: Business intelligence: 5 years (Preferred) Tableau Desktop: 5 years (Preferred) Work Location: In person
Posted 4 days ago
0 years
3 - 4 Lacs
haryāna
On-site
Organization: Udayan Care Location: Dundahera, Haryana Type: Full-Time / On-site Sector: Education & Skill Development – NGO About Udayan Care: Udayan Care is a nationally recognized NGO committed to empowering underprivileged youth through quality education, vocational training, and life skills. Our Skill Development Centres aim to equip young individuals with industry-relevant skills to enhance employability and self-reliance. Position Overview: We are seeking a dedicated and knowledgeable IT Trainer for our Dundahera Skill Development Centre. The ideal candidate will train students in Power BI, SQL, and basic data analytics, preparing them for entry-level roles in the industry. Key Responsibilities: Conduct hands-on training sessions on: Power BI – dashboards, DAX, data modeling, reporting. SQL – query writing, joins, subqueries, data extraction. Develop and update training content aligned with industry needs. Assess student performance through assignments, projects, and evaluations. Provide mentorship and individual guidance to students. Work with the Centre Head and Placement Team to ensure job-readiness. Promote a learning environment that encourages participation and confidence. Qualifications & Skills: Bachelor’s degree in Computer Science, IT, or a related field. Proficiency in Power BI and SQL with practical experience. Prior teaching/training experience preferred. Excellent communication and classroom management skills. A strong passion for teaching and social impact. Preferred: Experience in NGOs or skill development programs. Familiarity with MS Excel (Advanced), Python (basic), or other analytics tools. Job Location: Udayan Care Skill Development Centre, Dundahera, Haryana Compensation: As per organizational norms and experience. How to Apply: Send your updated resume to recruitment@udayancare.org with the subject line: "Application for IT Trainer – Power BI & SQL | Dundahera" Contact Person: Ms. 
Anam Khan, Udayan Care Recruitment Team Job Type: Full-time Pay: ₹30,000.00 - ₹35,000.00 per month Benefits: Provident Fund Work Location: In person
Posted 4 days ago
4.0 years
10 - 20 Lacs
india
On-site
Experience - 4-6 years Location - Bangalore, Noida Shift: 1pm – 10pm Role & Responsibilities: Support and maintenance of an existing application written predominantly using Sybase stored procedures. Perform database administration tasks: monitoring, tuning and performance optimization, and troubleshooting database-related problems. Stored procedure development, involving fixing problems and developing new functionality. Unit testing and debugging stored procedures. Support with migrating business logic in stored procedures to Java. Experience/Qualifications: 3+ years’ experience working as a Database Developer. Strong knowledge of database programming using stored procedures / procedural SQL, preferably on Transact-SQL (Sybase, MS SQL Server). Comfortable developing and analysing procedural SQL, using cursors, variables, loops, complex joins, subqueries, and temporary and intermediate tables. Experience in query optimisation, debugging, and performance tuning. Nice to have: Git, PowerShell, Java experience. Strong written and verbal communication. Job Type: Full-time Pay: ₹1,000,000.00 - ₹2,000,000.00 per year Application Question(s): How many years of total experience do you currently have? How many years of experience do you have as a database developer? How many years of experience do you have in Sybase stored procedures? What is your current annual CTC? What is your expected annual CTC? What is your notice period (in days / remaining days if serving the notice period)? Have you applied or attended an interview at Birlasoft? Are you comfortable with a 1pm – 10pm shift? What is your current and preferred location?
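A hedged sketch of the kind of migration work this role touches: replacing row-by-row, cursor-style stored-procedure logic with a single set-based statement, which is usually far easier to tune. The accounts schema and the 10% uplift rule are invented for illustration, and SQLite (via Python's standard library) stands in for Sybase; integer balances are used to keep the arithmetic exact.

```python
import sqlite3

# Invented example: account balances, with balances kept as integers.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER, tier TEXT);
INSERT INTO accounts VALUES (1, 100, 'gold'), (2, 100, 'basic');
""")

# A legacy procedure might open a cursor over gold accounts and update each
# row inside a loop. The set-based equivalent is one statement: add 10%
# (integer arithmetic, so 100 becomes 110) to every gold account at once.
conn.execute("UPDATE accounts SET balance = balance * 11 / 10 WHERE tier = 'gold'")

print(conn.execute("SELECT id, balance FROM accounts ORDER BY id").fetchall())
# [(1, 110), (2, 100)]
```

The same rewrite applies when porting such logic to Java: issue one set-based UPDATE through JDBC rather than re-creating the cursor loop in application code.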
Posted 4 days ago
0 years
1 - 1 Lacs
india
On-site
Position: Python Developer Intern Location: Indore - Onsite Duration: 3 months Stipend: Up to 10k About the Role: We are looking for a motivated and detail-oriented Python Developer Intern with strong SQL skills to join our team. In this role, you will collaborate with experienced developers and data engineers to build, maintain, and optimize applications and databases. You will gain hands-on experience in developing scalable software solutions, writing efficient SQL queries, and contributing to real-world projects. Key Responsibilities: Python Development: Write clean, efficient, and reusable Python code. Assist in developing and maintaining backend services, scripts, and APIs. Troubleshoot and debug issues in existing code. SQL Development: Write and optimize SQL queries for data retrieval and manipulation. Assist in database design, normalization, and performance tuning. Ensure data accuracy and integrity. Collaboration & Documentation: Work closely with the development and data teams to integrate Python applications with databases. Document code, processes, and project-related information. Testing & Validation: Assist in writing test cases and performing unit tests. Validate data outputs and application functionality. Required Skills & Qualifications: Programming Skills: Proficiency in Python (basic to intermediate level). Familiarity with libraries such as Pandas, NumPy, Flask, Django, etc., is a plus. SQL Skills: Strong knowledge of SQL queries, joins, and data manipulation. Experience with MySQL, PostgreSQL, SQL Server, or similar databases. Problem-Solving: Strong analytical and problem-solving abilities. Version Control: Familiarity with Git/GitHub for version control. Soft Skills: Eagerness to learn, adaptability, and a team-player attitude. Preferred Qualifications: Knowledge of data visualization libraries (e.g., Matplotlib, Seaborn) is a plus. Basic understanding of REST APIs and web frameworks. 
Familiarity with cloud platforms (AWS, Azure, or GCP) is a bonus. What You’ll Gain: Hands-on experience with real-world projects. Mentorship from industry professionals. Exposure to agile development practices. Potential for full-time employment based on performance. Job Type: Full-time Pay: ₹8,415.99 - ₹10,000.77 per month Benefits: Flexible schedule Paid sick time Work Location: In person
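A minimal sketch of the intern's day-to-day as described above: a small reusable Python function running a parameterized SQL join with aggregation, followed by a unit-test-style check. All names (customers, orders, top_customers) are invented for the example; SQLite from the standard library stands in for MySQL or PostgreSQL.

```python
import sqlite3

def top_customers(conn, min_total):
    """Return (customer, total) pairs whose order total meets min_total.

    The threshold is passed as a bound parameter (?) rather than being
    formatted into the string, which avoids SQL injection.
    """
    return conn.execute(
        """
        SELECT c.name, SUM(o.amount) AS total
        FROM customers c JOIN orders o ON o.customer_id = c.id
        GROUP BY c.name
        HAVING SUM(o.amount) >= ?
        ORDER BY total DESC
        """,
        (min_total,),
    ).fetchall()

# Invented fixture data for a quick self-check.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (customer_id INTEGER, amount INTEGER);
INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
INSERT INTO orders VALUES (1, 300), (1, 200), (2, 100);
""")
assert top_customers(conn, 400) == [('Asha', 500)]
print(top_customers(conn, 400))
```

In a real internship the assert would live in a pytest test module rather than inline, but the shape (fixture data, function call, expected rows) is the same.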
Posted 4 days ago