15.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Role Summary: Solutioning lead for Data Engineering, with AWS and Snowflake as the primary stack.

Role Responsibilities
- Architecture and solutioning on AWS and Snowflake platforms: data warehouse, lakehouse, data fabric and data mesh
- Sizing, estimation and implementation planning for solutions
- Solution prototyping, advisory, and orchestrating in-person/remote workshops
- Work with hyperscalers and platform vendors to understand and test platform roadmaps and develop joint solutions
- Own end-to-end solutions, working across various teams in Cognizant: Sales, Delivery and Global Solutioning
- Own key accounts as architecture advisor and establish deep client relationships
- Contribute to the practice by developing reusable assets and solutions

Job Requirements
- Bachelor's or Master's degree in computer science, engineering, information systems or a related field
- Minimum 15 years' experience as a Solution Architect designing and developing data architecture patterns
- Minimum 5 years' hands-on experience building AWS and Snowflake based solutions
- Minimum 3 years' experience as a Solution Architect in a pre-sales team, driving the sales process from a technical solution standpoint
- Excellent verbal and written communication skills, with the ability to present complex cloud data architecture concepts to technical and executive audiences (leveraging PPTs, demos and whiteboards)
- Deep expertise in designing on AWS and Snowflake
- Strong expertise in handling large and complex RFPs/RFIs and collaborating with multiple service lines and platform vendors in a fast-paced environment
- Strong relationship-building skills and the ability to provide technical advisory and guidance
- Technology architecture and implementation experience, with deep implementation experience in data solutions
- 15-20 years of experience in Data Engineering, including 5+ years of cloud data engineering experience
- Technology pre-sales experience: architecture, effort sizing, estimation and solution defense
- Data architecture patterns: data warehouse, data lake, data mesh, lakehouse, data as a product
- Develop or co-develop proofs of concept and prototypes with customer teams
- Excellent understanding of distributed computing fundamentals
- Experience working with one or more major cloud vendors
- Deep expertise in end-to-end pipeline (ETL) development following best practices, including orchestration and optimization of data pipelines
- Strong understanding of the full CI/CD lifecycle
- Large legacy migration experience (Hadoop, Teradata and the like) to cloud data platforms
- Expert-level proficiency in engineering and optimizing the main data ingestion patterns: batch, micro-batch, streaming and API
- Understanding of change data capture imperatives, with a point of view on tools and best practices
- Architect and solution Data Governance capability pillars supporting a modern data ecosystem
- Data services and various consumption archetypes, including semantic layers, BI tools and AI/ML
- Thought leadership in designing self-service data engineering platforms and solutions

Core Platform – AWS & Snowflake
- Ability to engage customers and offer differing points of view on their architecture using the AWS and Snowflake platforms
- Strong understanding of the Snowflake platform, including evolving services such as Snowpark
- Implementation expertise with AWS services (EMR, Redshift, Glue, Kinesis, Lambda, AWS Lake Formation) and Snowflake
- Security design and implementation on AWS and Snowflake
- Pipeline development in a multi-hop pipeline architecture
- Architecture and implementation experience with Spark and Snowflake performance tuning, including topics such as cluster sizing
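To make the multi-hop pipeline pattern named above concrete, here is a minimal, illustrative Snowpark for Python sketch of a single bronze-to-silver hop; the connection parameters, table names and columns are placeholders, not anything from the posting.

```python
# Minimal Snowpark sketch of one hop in a multi-hop (bronze -> silver) pipeline.
# Connection parameters and table/column names are illustrative placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, to_date

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# Read raw (bronze) orders, standardize types, drop obviously bad rows,
# and persist the curated (silver) table.
bronze = session.table("RAW_ORDERS")
silver = (
    bronze
    .filter(col("ORDER_ID").is_not_null())
    .with_column("ORDER_DATE", to_date(col("ORDER_TS")))
)
silver.write.mode("overwrite").save_as_table("CURATED_ORDERS")
```

In practice a hop like this would usually be parameterized per environment and scheduled through Snowflake tasks/streams or an external orchestrator.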
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Your Skills & Experience:
• Strong expertise in Data Engineering is highly recommended.
• Overall 4+ years of relevant experience in Big Data technologies.
• Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and the other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
• Strong experience in at least one of the programming languages Java, Scala or Python; Java preferred.
• Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
• Well-versed in, and working knowledge of, data platform related services on Azure/GCP.
• Bachelor's degree and 4+ years of work experience, or any combination of education, training and/or experience that demonstrates the ability to perform the duties of the position.
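As a small illustration of the real-time side of this stack, the sketch below uses PySpark Structured Streaming to read a Kafka topic and land it as Parquet. The broker, topic and paths are placeholders, and running it assumes the spark-sql-kafka connector package is available on the Spark classpath.

```python
# Minimal PySpark Structured Streaming sketch: consume events from Kafka and
# land them as Parquet. Broker, topic, and paths are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
    # Kafka delivers key/value as binary; cast to string for downstream use.
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/landing/orders")
    .option("checkpointLocation", "/data/checkpoints/orders")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```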
Posted 1 week ago
5.0 years
0 Lacs
India
Remote
Job Title: Senior Data Engineer (Remote)
Location: Remote
Employment Type: Full-Time
Experience Level: Senior (5+ years)

About the Role
We are seeking a highly skilled Senior Data Engineer to join our growing data team. As a key contributor, you will design and build robust, scalable data pipelines and systems that power analytics and decision-making across the organization. You will work closely with data scientists, analysts, and product teams to ensure data accuracy, availability, and performance.

Key Responsibilities
- Design, build, and maintain scalable and reliable ETL/ELT data pipelines using Python and SQL.
- Develop and manage data infrastructure and workflows on AWS (e.g., S3, Lambda, Glue, Redshift, EMR, Athena).
- Ensure high data quality and implement best practices for data governance and security.
- Automate data ingestion from diverse structured and unstructured sources.
- Optimize and monitor pipeline performance and resolve production issues.
- Collaborate with stakeholders to define data requirements and deliver actionable data products.
- Maintain and document architecture, data models, and pipelines.
- Mentor junior engineers and contribute to engineering best practices and team culture.

Required Qualifications
- 5+ years of experience in data engineering or a related field.
- Strong proficiency in Python for data processing and scripting.
- Advanced knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL).
- Hands-on experience with AWS data services: S3, Glue, Redshift, Lambda, Athena, etc.
- Experience with orchestration tools (e.g., Airflow, AWS Step Functions).
- Solid understanding of data warehousing, data modeling, and ETL best practices.
- Familiarity with version control (e.g., Git) and CI/CD pipelines.

Preferred Qualifications
- Experience with infrastructure as code tools (e.g., Terraform, CloudFormation).
- Exposure to real-time data processing frameworks (e.g., Kafka, Spark, Kinesis).
- Background in big data technologies and distributed computing.
- Knowledge of data privacy regulations (GDPR, CCPA) and compliance practices.
- Familiarity with dashboarding/BI tools (e.g., Tableau, Looker, QuickSight).
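For orchestration tools such as Airflow mentioned above, a minimal DAG sketch of the kind of daily ELT flow described in the responsibilities might look like the following. The task bodies, names and schedule are illustrative placeholders, assuming Airflow 2.x.

```python
# Minimal Airflow 2.x DAG sketch for a daily ELT job: pull a file from S3, load
# it into a warehouse staging table, then run a SQL transform. The helper
# functions and identifiers are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_s3(**context):
    # e.g. use boto3 to stage s3://<bucket>/raw/orders.csv locally
    ...


def load_to_warehouse(**context):
    # e.g. COPY the staged file into a Redshift/Snowflake staging table
    ...


def transform(**context):
    # e.g. run an idempotent INSERT ... SELECT into the reporting schema
    ...


with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_s3)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract >> load >> transform_task
```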
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience.

Data Modeler Job Description
We are looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems.

Key areas of expertise include:
- Analyze and translate business needs into long-term solution data models.
- Evaluate existing data systems and recommend improvements.
- Define rules to translate and transform data across data models.
- Work with the development team to create conceptual data models and data flows.
- Develop best practices for data coding to ensure consistency within the system.
- Review modifications of existing systems for cross-compatibility.
- Implement data strategies and develop physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to enhance data system efficiency.
- Evaluate implemented data systems for variances, discrepancies, and efficiency.
- Troubleshoot and optimize data systems to ensure optimal performance.
- Strong expertise in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner).
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Familiarity with ETL processes, data integration, and data governance frameworks.
- Strong analytical, problem-solving, and communication skills.

Qualifications
- Bachelor's degree in Engineering or a related field.
- 3 to 5 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills
- Experience in cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
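As a compact illustration of the dimensional modeling skills listed above, here is a hypothetical star schema (one fact table, two dimensions) expressed with SQLAlchemy. The tables, keys and columns are invented for the example and are not tied to any PwC system.

```python
# Illustrative sketch: a minimal star schema (fact table plus two dimensions)
# expressed with SQLAlchemy. All table and column names are hypothetical.
from sqlalchemy import Column, Date, ForeignKey, Integer, Numeric, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class DimCustomer(Base):
    __tablename__ = "dim_customer"
    customer_key = Column(Integer, primary_key=True)   # surrogate key
    customer_id = Column(String(50), nullable=False)   # natural/business key
    segment = Column(String(50))


class DimDate(Base):
    __tablename__ = "dim_date"
    date_key = Column(Integer, primary_key=True)        # e.g. 20240131
    calendar_date = Column(Date, nullable=False)


class FactSales(Base):
    __tablename__ = "fact_sales"
    sales_key = Column(Integer, primary_key=True)
    customer_key = Column(Integer, ForeignKey("dim_customer.customer_key"))
    date_key = Column(Integer, ForeignKey("dim_date.date_key"))
    quantity = Column(Integer, nullable=False)
    net_amount = Column(Numeric(12, 2), nullable=False)
```

The same structure could be generated by a modeling tool such as Erwin or PowerDesigner; the sketch just shows the fact-to-dimension relationships in code form.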
Posted 1 week ago
15.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. Those in artificial intelligence and machine learning at PwC will focus on developing and implementing advanced AI and ML solutions to drive innovation and enhance business processes. Your work will involve designing and optimising algorithms, models, and systems to enable intelligent decision-making and automation.

Years of Experience: Candidates with 15+ years of hands-on experience

Preferred Skills
- Experience in GCP and one more cloud platform (AWS/Azure), specifically data migration experience
- Use an analytical and data-driven approach to drive a deep understanding of a fast-changing business
- Familiarity with data technologies such as Snowflake, Databricks, Redshift, Synapse
- Leading large-scale data modernization and governance initiatives emphasizing the strategy, design and development of Platform-as-a-Service and Infrastructure-as-a-Service offerings that extend to private and public cloud deployment models
- Experience in designing, architecting, implementing, and managing data lakes/warehouses
- Experience with complex environments delivering application migration to cloud platforms
- Understanding of Agile, Scrum and Continuous Delivery methodologies
- Hands-on experience with Docker and Kubernetes or other container orchestration platforms
- Strong experience in data management with an understanding of analytics and reporting
- Understanding of emerging technologies and the latest data engineering providers
- Experience in implementing enterprise data concepts such as Master Data Management and Enterprise Data Warehouse; experience with MDM standards

Educational Background
BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA
Posted 1 week ago
15.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. Those in artificial intelligence and machine learning at PwC will focus on developing and implementing advanced AI and ML solutions to drive innovation and enhance business processes. Your work will involve designing and optimising algorithms, models, and systems to enable intelligent decision-making and automation.

Years of Experience: Candidates with 15+ years of hands-on experience

Preferred Skills
- Any cloud data and reporting migration experience
- Use an analytical and data-driven approach to drive a deep understanding of a fast-changing business
- Familiarity with data technologies such as Snowflake, Databricks, Redshift, Synapse
- Leading large-scale data modernization and governance initiatives emphasizing the strategy, design and development of Platform-as-a-Service and Infrastructure-as-a-Service offerings that extend to private and public cloud deployment models
- Experience in designing, architecting, implementing, and managing data lakes/warehouses
- Experience with complex environments delivering application migration to cloud platforms
- Understanding of Agile, Scrum and Continuous Delivery methodologies
- Hands-on experience with Docker and Kubernetes or other container orchestration platforms
- Strong experience in data management with an understanding of analytics and reporting
- Understanding of emerging technologies and the latest data engineering providers
- Experience in implementing enterprise data concepts such as Master Data Management and Enterprise Data Warehouse; experience with MDM standards

Educational Background
BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
Amazon IN Platform Development team is looking to hire a rock star Data/BI Engineer to build for pan-Amazon India businesses. Amazon India is at the core of hustle @ Amazon WW today, and the team is chartered with democratizing data access for the entire marketplace and adding productivity. That translates to owning the processing of every Amazon India transaction, for which the team is organized to have dedicated business owners and processes for each focus area. The BI Engineer will play a key role in contributing to the success of each focus area by partnering with respective business owners and leveraging data to identify areas of improvement and optimization. He/she will build deliverables like business process automation, payment behavior analysis, campaign analysis, fingertip metrics, failure prediction etc. that provide an edge to business decision making AND can scale with growth. The role sits in the sweet spot between the technology and business worlds AND provides opportunity for growth, high business impact and working with seasoned business leaders.

An ideal candidate will be someone with a sound technical background in the data domain – storage / processing / analytics – solid business acumen and a strong automation / solution oriented thought process. Will be a self-starter who can start with a business problem and work backwards to conceive and devise the best possible solution. Is a great communicator and at ease partnering with business owners and other internal / external teams. Can explore newer technology options, if need be, and has a high sense of ownership over every deliverable by the team. Is constantly obsessed with customer delight and business impact / end result and 'gets it done' in business time.

Key job responsibilities
- Design, implement and support a data infrastructure for the analytics needs of a large organization
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies
- Be enthusiastic about building deep domain knowledge of Amazon's business
- Must possess strong verbal and written communication skills, be self-driven, and deliver high quality results in a fast-paced environment
- Enjoy working closely with your peers in a group of very smart and talented engineers
- Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency

About The Team
India Data Engineering and Analytics (IDEA) team is the central data engineering team for Amazon India. Our vision is to simplify and accelerate data-driven decision making for Amazon India by providing cost effective, easy and timely access to high quality data. We achieve this by building UDAI (Unified Data & Analytics Infrastructure for Amazon India), which serves as a central data platform and provides data engineering infrastructure, ready-to-use datasets and self-service reporting capabilities. Our core responsibilities towards the India marketplace include: a) providing systems (infrastructure) and workflows that allow ingestion, storage, processing and querying of data; b) building ready-to-use datasets for easy and faster access to the data; c) automating standard business analysis / reporting / dashboarding; d) empowering business with self-service tools for deep dives and insight seeking.
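To illustrate the "extract, transform, and load data ... using SQL and AWS big data technologies" responsibility above, here is a hedged boto3 sketch that submits a COPY from S3 into Redshift through the Redshift Data API; the cluster, database, IAM role and S3 path are placeholders, not details from the posting.

```python
# Hedged sketch: load a partition of S3 data into Redshift with the Redshift
# Data API via boto3. Cluster, database, IAM role, and paths are placeholders.
import boto3

client = boto3.client("redshift-data", region_name="ap-south-1")

copy_sql = """
    COPY analytics.orders_staging
    FROM 's3://<bucket>/orders/dt=2024-01-31/'
    IAM_ROLE '<redshift-copy-role-arn>'
    FORMAT AS PARQUET;
"""

response = client.execute_statement(
    ClusterIdentifier="<cluster-id>",
    Database="analytics",
    DbUser="<db-user>",
    Sql=copy_sql,
)
# The call is asynchronous; the returned Id can be polled with describe_statement.
print("Submitted statement:", response["Id"])
```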
Basic Qualifications
- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL

Preferred Qualifications
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka
Job ID: A2984254
Posted 1 week ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description
Do you enjoy diving deep into data, building data models and developing business metrics to generate actionable insights? Are you looking for an opportunity to define an end-to-end analytics roadmap, work with cross-functional teams and leverage cutting-edge modern technologies and cloud solutions to develop analytics products? DSP Analytics team has an exciting opportunity for a Business Intelligence Engineer (BIE) to improve Amazon's Delivery Service Partner (DSP) program through impactful data solutions.

The goal of Amazon's DSP organization is to exceed the expectations of our customers by ensuring that their orders, no matter how large or small, are delivered as quickly, accurately, and cost effectively as possible. To meet this goal, Amazon is continually striving to innovate and provide a best in class delivery experience through the introduction of pioneering new products and services in the last mile delivery space. We are looking for an innovative, highly-motivated and experienced BIE who can think holistically about problems to understand how systems work together to identify and execute both tactical and strategic projects. You will work closely with engineering teams, product managers, program managers and org leaders to deliver end-to-end data solutions aimed at continuously enhancing overall DSP performance and delivery quality.

The business coverage is broad, and you will identify and prioritize what matters most for the business, quantify what is (or is not) working, invent and simplify the current process and develop self-serve data and reporting solutions. You should have excellent business and communication skills to be able to work with business owners to define the roadmap, develop milestones, define key business questions, and build datasets that answer those questions. The ideal candidate should have hands-on SQL and scripting language experience and excel in designing, implementing, and operating stable, scalable, low-cost solutions to flow data from production systems into the data warehouse and into end-user facing applications.

Key job responsibilities
- Lead the design, implementation, and delivery of BI solutions for the Worldwide DSP Performance solutions and Network Health.
- Manage and execute entire projects from start to finish, including stakeholder management, data gathering and manipulation, modeling, problem solving, and communication of insights and recommendations.
- Extract, transform, and load data from many data sources using SQL, scripting and other ETL tools.
- Design, build, and maintain automated reporting, dashboards, and ongoing analysis to enable data-driven decisions across our team and with partner teams.
- Report key insight trends using statistical rigor to simplify and inform the larger team of noteworthy trends that impact the business.
- Retrieve and analyze data using a broad set of Amazon's data technologies (e.g. Redshift, AWS S3, Amazon internal platforms/solutions) and resources, knowing how, when, and which to use.
- Earn the trust of your customers and stakeholders by continuing to constantly obsess over their business use cases and data needs, and helping them solve their problems by leveraging technology.
- Work closely with business stakeholders and the senior leadership team to review the roadmap, contributing to business strategy and how they can leverage analytics for success.

About The Team
We are the core Amazon DSP BI team with the vision to enable data, insights and science driven decision-making.
We have exceptionally talented and fun-loving team members. In our team, you will have the opportunity to dive deep into complex business and data problems, drive large scale technical solutions and raise the bar for operational excellence. We love to share ideas and learnings with each other. We are a relatively new team and do not carry legacy operational burden. We believe in promoting and using ideas to disrupt the status quo. Per the internal transfer guidelines, please reach out to the hiring manager for an informational through the "Request Informational" button on the job page.

Basic Qualifications
- 10+ years of professional or military experience
- 7+ years of SQL experience
- Experience programming to extract, transform and clean large (multi-TB) data sets
- Experience with the theory and practice of design of experiments and statistical analysis of results
- Experience with AWS technologies
- Experience in scripting for automation (e.g. Python) and advanced SQL skills
- Experience with the theory and practice of information retrieval, data science, machine learning and data mining

Preferred Qualifications
- Experience working directly with business stakeholders to translate between data and business needs
- Experience managing, analyzing and communicating results to senior leadership

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - Amazon Dev Center India - Hyderabad
Job ID: A2906908
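As a small, hypothetical example of the SQL-plus-scripting automation this role calls for, the sketch below runs a weekly delivery aggregate against a Redshift (PostgreSQL-compatible) endpoint and writes it to CSV. The endpoint, schema, table and metric names are invented for illustration only.

```python
# Hedged sketch of a small reporting automation: run a SQL aggregate against a
# Redshift (Postgres-compatible) endpoint and write the result to CSV.
# Connection details, schema, and metric names are illustrative placeholders.
import csv
import psycopg2

QUERY = """
    SELECT delivery_station, delivery_date, count(*) AS packages_delivered
    FROM dsp.deliveries
    WHERE delivery_date >= current_date - 7
    GROUP BY 1, 2
    ORDER BY 1, 2;
"""

conn = psycopg2.connect(
    host="<redshift-endpoint>", port=5439,
    dbname="analytics", user="<user>", password="<password>",
)
with conn, conn.cursor() as cur:
    cur.execute(QUERY)
    rows = cur.fetchall()

with open("weekly_delivery_summary.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["delivery_station", "delivery_date", "packages_delivered"])
    writer.writerows(rows)
```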
Posted 1 week ago
12.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
- Define and design the future-state data architecture for HR reporting, forecasting and analysis products.
- Partner with Technology, Data Stewards and various Product teams in an Agile work stream while meeting program goals and deadlines.
- Engage with line of business, operations, and project partners to gather process improvements.
- Lead the design and build of new models to efficiently deliver financial results to senior management.
- Evaluate data-related tools and technologies and recommend appropriate implementation patterns and standard methodologies to ensure our data ecosystem is always modern.
- Collaborate with Enterprise Data Architects in establishing and adhering to enterprise standards, while also performing POCs to ensure those standards are implemented.
- Provide technical expertise and mentorship to Data Engineers and Data Analysts in the data architecture.
- Develop and maintain processes, standards, policies, guidelines, and governance to ensure that a consistent framework and set of standards is applied across the company.
- Create and maintain conceptual / logical data models to identify key business entities and visual relationships.
- Work with business and IT teams to understand data requirements.
- Maintain a data dictionary consisting of table and column definitions.
- Review data models with both technical and business audiences.

What You'll Bring
Essential Education
- Minimum of a Bachelor's degree in Computer Science, Engineering or a similar field
- Additional certification in Data Management or cloud data platforms like Snowflake preferred

Essential Experience & Job Requirements
- 12+ years of IT experience with a major focus on data warehouse/database related projects
- Expertise in cloud databases like Snowflake, Redshift etc.
- Expertise in Data Warehousing Architecture, BI/Analytical systems, data cataloguing, MDM etc.
- Proficient in conceptual, logical, and physical data modelling
- Proficient in documenting all the architecture-related work performed
- Proficient in data storage, ETL/ELT and data analytics tools like AWS Glue, dbt/Talend, Fivetran, APIs, Tableau, Power BI, Alteryx etc.
- Experience in building data solutions to support Comp Benchmarking, Pay Transparency / Pay Equity and Total Rewards use cases preferred
- Experience with Cloud Big Data technologies such as AWS, Azure, GCP and Snowflake is a plus
- Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (Product Owners, Scrum Masters, Architects, and data SMEs) is a plus
- Excellent written, oral communication and presentation skills to present architecture, features, and solution recommendations are a must

Additional info
You're good at:
- Design, document and train the team on the overall processes and process flows for the data architecture.
- Resolve technical challenges in critical situations that require immediate resolution.
- Develop relationships with external stakeholders to maintain awareness of data and security issues and trends.
- Review work from other tech team members and provide feedback for growth.
- Implement data security policies that align with governance objectives and regulatory requirements.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer. Click here for more information on E-Verify.
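One of the responsibilities above is maintaining a data dictionary of table and column definitions; a minimal, illustrative way to bootstrap one with SQLAlchemy's inspector is sketched below. The Snowflake connection URL is a placeholder and assumes the snowflake-sqlalchemy dialect is installed; the approach works the same against any SQLAlchemy-supported warehouse.

```python
# Hedged sketch: generate a simple data dictionary (table/column/type/nullable)
# from a SQLAlchemy-supported warehouse connection. The URL is a placeholder.
from sqlalchemy import create_engine, inspect

engine = create_engine("snowflake://<user>:<password>@<account>/<db>/<schema>")
inspector = inspect(engine)

dictionary = []
for table_name in inspector.get_table_names():
    for column in inspector.get_columns(table_name):
        dictionary.append({
            "table": table_name,
            "column": column["name"],
            "type": str(column["type"]),
            "nullable": column["nullable"],
        })

# In practice this would be written to a governed location (wiki, catalog tool,
# or a dictionary table); printing keeps the sketch self-contained.
for entry in dictionary:
    print(entry)
```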
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Us
Yubi stands for ubiquitous. But Yubi will also stand for transparency, collaboration, and the power of possibility. From being a disruptor in India's debt market to marching towards global corporate markets, from one product to one holistic product suite with seven products, Yubi is the place to unleash potential. Freedom, not fear. Avenues, not roadblocks. Opportunity, not obstacles.

About Yubi
Yubi, formerly known as CredAvenue, is re-defining global debt markets by freeing the flow of finance between borrowers, lenders, and investors. We are the world's possibility platform for the discovery, investment, fulfillment, and collection of any debt solution. At Yubi, opportunities are plenty and we equip you with tools to seize them. In March 2022, we became India's fastest fintech and most impactful startup to join the unicorn club with a Series B fundraising round of $137 million. In 2020, we began our journey with a vision of transforming and deepening the global institutional debt market through technology. Our two-sided debt marketplace helps institutional and HNI investors find the widest network of corporate borrowers and debt products on one side, and helps corporates discover investors and access debt capital efficiently on the other side. Switching between platforms is easy, which means investors can lend, invest and trade bonds - all in one place. All of our platforms shake up the traditional debt ecosystem and offer new ways of digital finance.

Yubi Credit Marketplace - With the largest selection of lenders on one platform, our credit marketplace helps enterprises partner with lenders of their choice for any and all capital requirements.
Yubi Invest - Fixed income securities platform for wealth managers & financial advisors to channel client investments in fixed income.
Financial Services Platform - Designed for financial institutions to manage co-lending partnerships & asset-based securitization.
Spocto - Debt recovery & risk mitigation platform.
Corpository - Dedicated SaaS solutions platform powered by decision-grade data, analytics, pattern identification, early warning signals and predictions for lenders, investors and business enterprises.

So far, we have onboarded 17,000+ enterprises, 6,200+ investors & lenders and have facilitated debt volumes of over INR 1,40,000 crore. Backed by marquee investors like Insight Partners, B Capital Group, Dragoneer, Sequoia Capital, LightSpeed and Lightrock, we are the only-of-its-kind debt platform globally, revolutionizing the segment. At Yubi, people are at the core of the business and our most valuable assets. Yubi is constantly growing, with 1000+ like-minded individuals today who are changing the way people perceive debt. We are a fun bunch who are highly motivated and driven to create a purposeful impact. Come, join the club to be a part of our epic growth story.

About The Role
As a Senior DevOps Engineer, you will be part of a highly talented DevOps team that manages the entire infrastructure for Yubi. You will work with development teams to understand their requirements, optimize them to reduce costs, create scripts for creating and configuring infrastructure, and maintain and monitor the infrastructure. As a financial services firm, security is of utmost concern to us, and you will ensure that all data handled by the platform, key configurations, passwords etc. are secure from leaks.
You will ensure that the platform is scaled to meet our user needs and performs optimally at all times, so our users get a world-class experience using our software products. You will ensure that data, source code and configurations are adequately backed up and prevent loss of data. You will be well versed in tools to automate all such DevOps tasks.

Responsibilities
- Troubleshoot web and backend applications and issues; good understanding of multi-tier applications.
- Knowledge of AWS security, application security and security best practices.
- SCA analysis, analyzing security reports, SonarQube profiles and gates; able to draft solutions to improve security based on reporting.
- Lead, drive and implement highly scalable, highly available and complex solutions.
- Stay up to date with the latest DevOps tools and ecosystem.
- Excellent written and verbal communication.

Requirements
- Bachelor's/Master's degree in Computer Science or equivalent work experience
- 3-6 years of working experience as a DevOps engineer
- AWS Cloud expertise is a must and primary; Azure/GCP cloud knowledge is a plus
- Extensive knowledge of and experience with major AWS services
- Advanced AWS networking setup: routing, VPN, cross-account networking, use of proxies
- Experience with AWS multi-account infrastructure
- Infrastructure as code using CloudFormation or Terraform
- Containerization – Docker/Kubernetes/ECS/Fargate
- Configuration management tools such as Chef/Ansible/Salt
- CI/CD – Jenkins/CodePipeline/CodeDeploy
- Basic expertise in scripting languages such as shell, Python or Node.js
- Adept at Continuous Integration/Continuous Deployment
- Experience working with source code repos like GitLab, GitHub or Bitbucket
- Monitoring tools: CloudWatch agent, Prometheus, Grafana, New Relic, Dynatrace, Datadog, OpenAPM, etc.; ELK knowledge is a plus
- Knowledge of chat-ops
- Adept at using various operating systems: Windows, Mac and Linux
- Expertise in using command line tools, the AWS CLI, Git, or programming against AWS APIs
- Experience with both SQL (RDS Postgres/MySQL) and NoSQL databases (MongoDB), data warehousing (Redshift) and data lakes
- Knowledge and experience in instrumentation, metrics and monitoring concepts

Benefits
Yubi is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, or age.
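To give one concrete, hedged example of the "AWS security and security best practices" item above, the sketch below uses boto3 to flag S3 buckets that lack default encryption or a full public access block. It is illustrative only and not Yubi's actual tooling.

```python
# Hedged sketch: a small security hygiene check with boto3 -- flag S3 buckets
# without default encryption or a complete public access block.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]

    # Default encryption: the call raises a ClientError if none is configured.
    try:
        s3.get_bucket_encryption(Bucket=name)
        encrypted = True
    except ClientError:
        encrypted = False

    # Public access block: require all four settings to be enabled.
    try:
        block = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        blocked = all(block.values())
    except ClientError:
        blocked = False

    if not encrypted or not blocked:
        print(f"Review bucket {name}: encrypted={encrypted}, public_access_blocked={blocked}")
```

In a real setup this kind of check would more likely run as AWS Config rules or a scheduled Lambda feeding an alerting channel; the script just shows the underlying API calls.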
Posted 1 week ago
8.0 - 10.0 years
2 - 3 Lacs
Hyderābād
On-site
India - Hyderabad JOB ID: R-216615 ADDITIONAL LOCATIONS: India - Hyderabad WORK LOCATION TYPE: On Site DATE POSTED: Jun. 06, 2025 CATEGORY: Information Systems

Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Senior Manager Information Systems

What you will do
Let's do this. Let's change the world. In this vital role you will develop an insight-driven sensing capability with a focus on revolutionizing decision making. You will lead the technical delivery for this capability as part of a team of data engineers and software engineers. The team will rely on your leadership to own and refine the vision, drive feature prioritization and partner alignment, and bring experience leading solution delivery while building this ground-breaking new capability for Amgen. You will drive the software engineering side of the product release and will deliver on the outcomes.

Roles & Responsibilities:
- Lead delivery of the overall product and product features from concept to end of life; manage the product team comprising technical engineers, product owners and data scientists to ensure that business, quality, and functional goals are met with each product release
- Drive excellence and quality for the respective product releases, collaborating with partner teams; impact the quality, efficiency and effectiveness of your own team; have significant input into priorities
- Incorporate and prioritize feature requests into the product roadmap; be able to translate the roadmap into execution
- Design and implement usability, quality, and delivery of a product or feature
- Plan releases and upgrades with no impact to business
- Bring hands-on expertise in driving quality and best-in-class Agile engineering practices
- Encourage and motivate the product team to deliver innovative and exciting solutions with an appropriate sense of urgency
- Manage progress of work and address production issues during sprints
- Communicate with partners to make sure goals are clear and the vision is aligned with business objectives
- Direct management and staff development of team members

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree and 8 to 10 years of Information Systems experience OR Bachelor's degree and 10 to 14 years of Information Systems experience OR Diploma and 14 to 18 years of Information Systems experience
- Thorough understanding of modern web application development and delivery, Gen AI application development, data integration and enterprise data fabric concepts, methodologies, and technologies, e.g.
AWS technologies and Databricks
- Demonstrated experience in building strong teams with consistent practices
- Demonstrated experience in navigating matrix organizations and leading change
- Prior experience writing business case documents and securing funding for product team delivery; financial/spend management for small to medium product teams is a plus
- In-depth knowledge of Agile process and principles
- Define success metrics for developer productivity; on a monthly/quarterly basis, analyze how the product team is performing against established KPIs

Functional Skills – Leadership:
- Influences through Collaboration: Builds direct and behind-the-scenes support for ideas by working collaboratively with others.
- Strategic Thinking: Anticipates downstream consequences and tailors influencing strategies to achieve positive outcomes.
- Transparent Decision-Making: Clearly articulates the rationale behind decisions and their potential implications, continuously reflecting on successes and failures to enhance performance and decision-making.
- Adaptive Leadership: Recognizes the need for change and actively participates in technical strategy planning.

Preferred Qualifications:
- Strong influencing skills; able to influence stakeholders and balance priorities
- Prior experience in vendor management
- Prior hands-on experience leading full stack development using infrastructure cloud services (AWS preferred) and cloud-native tools and design patterns (containers, serverless, Docker, etc.)
- Experience with developing solutions on AWS technologies such as S3, EMR, Spark, Athena, Redshift and others
- Familiarity with cloud security (AWS / Azure / GCP)
- Conceptual understanding of DevOps tools (Ansible / Chef / Puppet / Docker / Jenkins)

Professional Certifications
- AWS Certified Solutions Architect (preferred)
- Certified DevOps Engineer (preferred)
- Certified Agile Leader or similar (preferred)

Soft Skills:
- Strong desire for continuous learning to pick up new tools/technologies
- High attention to detail, with critical thinking ability
- Should be an active contributor on technology communities/forums
- Proactively engages with cross-functional teams to resolve issues and design solutions using critical thinking, analysis skills and best practices
- Influences and energizes others toward the common vision and goal; maintains excitement for a process and drives toward new directions of meeting the goal even when odds and setbacks render one path impassable
- Established habit of proactive thinking and behavior, and the desire and ability to self-start/learn and apply new technologies
- Excellent organizational and time-management skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation; ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

Shift Information:
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
5.0 years
0 Lacs
Hyderābād
On-site
Zenoti provides an all-in-one, cloud-based software solution for the beauty and wellness industry. Our solution allows users to seamlessly manage every aspect of the business in a comprehensive mobile solution: online appointment bookings, POS, CRM, employee management, inventory management, built-in marketing programs and more. Zenoti helps clients streamline their systems and reduce costs, while simultaneously improving customer retention and spending. Our platform is engineered for reliability and scale and harnesses the power of enterprise-level technology for businesses of all sizes.

Zenoti powers more than 30,000 salons, spas, medspas and fitness studios in over 50 countries. This includes a vast portfolio of global brands, such as European Wax Center, Hand & Stone, Massage Heights, Rush Hair & Beauty, Sono Bello, Profile by Sanford, Hair Cuttery, CorePower Yoga and TONI&GUY. Our recent accomplishments include surpassing a $1 billion unicorn valuation, being named Next Tech Titan by GeekWire, raising an $80 million investment from TPG, and ranking as the 316th fastest-growing company in North America on Deloitte's 2020 Technology Fast 500™. We are also proud to be recognized as a Great Place to Work Certified™ company for 2021-2022, as this reaffirms our commitment to empowering people to feel good and find their greatness. To learn more about Zenoti visit: https://www.zenoti.com

Our products are built on Windows .NET and SQL Server and managed in AWS. Our web UX stack is built on jQuery, and some areas use AngularJS. Our middle tier is in C#, and we build our infrastructure on an extensive set of RESTful APIs. We build native iOS and Android apps, and are starting to experiment with Flutter and Dart. For select infrastructure components we use Python extensively, and we use Tableau for analytics dashboards. We use Redshift, Aurora, Redis ElastiCache, Lambda, and other AWS and Azure products to build and manage our complete service, moving towards serverless components. We deal with billions of API calls, millions of records in databases, and terabytes of data to be managed, with all services we build having to run 24x7 at 99.99% availability.

What will I be doing?
- Design, develop, test, release and maintain components of Zenoti
- Collaborate with a team of PM, DEV, and QA to release features
- Work in a team following agile development practices (SCRUM)
- Build usable software that is released at high quality, runs at scale and is adopted by customers
- Learn to scale your features to handle 2x ~ 4x growth every year and manage code that has to deal with millions of records and terabytes of data
- Release new features into production every month, and get real feedback from thousands of customers to refine your designs
- Be proud of what you work on, and obsess about the quality of the work you produce

What skills do I need?
- 5+ years of experience building mobile apps on iOS/Android
- 1+ years of experience in Flutter
- Strong experience in Swift/Java/Kotlin
- Experience in creating mobile app workflows, storyboards, and user flows
- Proven experience in writing readable code, creating extensive documentation for existing code and refactoring previously written code
- Experience working in an Agile/Scrum development process
- Experience with third-party libraries and APIs
- Strong and demonstrated ability to design modules for mobile applications
- Strong logical, analytical, and problem-solving skills
- Excellent communication skills
- Can work in a fast-paced, ever-changing startup environment

Benefits
- Attractive compensation
- Comprehensive medical coverage for yourself and your immediate family
- An environment where wellbeing is a high priority – access to regular yoga, meditation, breathwork, nutrition counseling and stress management, with family included in most benefit awareness sessions
- Opportunities to be part of a community and give back: social activities are part of our culture, and you can look forward to regular engagement, social work and community give-back initiatives

Zenoti provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
Posted 1 week ago
0 years
5 - 7 Lacs
Hyderābād
On-site
Zenoti provides an all-in-one, cloud-based software solution for the beauty and wellness industry. Our solution allows users to seamlessly manage every aspect of the business in a comprehensive mobile solution: online appointment bookings, POS, CRM, employee management, inventory management, built-in marketing programs and more. Zenoti helps clients streamline their systems and reduce costs, while simultaneously improving customer retention and spending. Our platform is engineered for reliability and scale and harnesses the power of enterprise-level technology for businesses of all sizes.

Zenoti powers more than 30,000 salons, spas, medspas and fitness studios in over 50 countries. This includes a vast portfolio of global brands, such as European Wax Center, Hand & Stone, Massage Heights, Rush Hair & Beauty, Sono Bello, Profile by Sanford, Hair Cuttery, CorePower Yoga and TONI&GUY. Our recent accomplishments include surpassing a $1 billion unicorn valuation, being named Next Tech Titan by GeekWire, raising an $80 million investment from TPG, and ranking as the 316th fastest-growing company in North America on Deloitte's 2020 Technology Fast 500™. We are also proud to be recognized as a Great Place to Work Certified™ company for 2021-2022, as this reaffirms our commitment to empowering people to feel good and find their greatness. To learn more about Zenoti visit: https://www.zenoti.com

Our products are built on Windows .NET and SQL Server and managed in AWS. Our web UX stack is built on jQuery, and we use AngularJS. Our middle tier is in C#, and we build our infrastructure on an extensive set of RESTful APIs. We build native iOS and Android apps using Flutter and Dart. Our platform infrastructure is built in .NET Core and deployed on RHEL Enterprise Linux using Docker and Kubernetes. We use Python extensively for data processing workloads and Tableau for analytics dashboards for select infrastructure components. We use Redshift, Aurora, Redis ElastiCache, Lambda, and other AWS and Azure products to build and manage our complete service, moving towards serverless components. We deal with billions of API calls, millions of records in databases, and terabytes of data to be managed, with all services we build having to run 24x7 at 99.99% availability.

What will I be doing?
- Design, develop, test, release and maintain components of Zenoti
- Collaborate with a team of PM, DEV, and QA to release features
- Work in a team following agile development practices (SCRUM)
- Build usable software that is released at high quality, runs at scale and is adopted by customers
- Learn to scale your features to handle 2x ~ 4x growth every year and manage code that has to deal with millions of records and terabytes of data
- Release new features into production every month, and get real feedback from thousands of customers to refine your designs

What skills do I need?
- Knowledge in designing and developing applications on the Microsoft stack
- Strong knowledge in building web applications
- Strong knowledge in HTML, JavaScript, CSS, jQuery, .NET/IIS with C#
- Proficiency in Microsoft SQL Server
- Knowledge in developing web applications using Angular/Flutter/Dart a plus
- Strong logical, analytical, and problem-solving skills
- Excellent communication skills
- Can work in a fast-paced, ever-changing startup environment

Why Zenoti?
Be part of an innovative company that is revolutionizing the wellness and beauty industry. Work with a dynamic and diverse team that values collaboration, creativity, and growth.
Opportunity to lead impactful projects and help shape the global success of Zenoti's platform. Attractive compensation. Medical coverage for yourself and your immediate family. Access to regular yoga, meditation, breathwork, and stress management sessions. We also include your family in benefit awareness initiatives. Regular social activities, and opportunities to give back through social work and community initiatives. Zenoti provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
Posted 1 week ago
2.0 - 4.0 years
6 - 9 Lacs
Hyderābād
On-site
Senior Specialist, IT Service Operations
Hyderabad, India; Ahmedabad, India; Gurgaon, India
Information Technology
315774

Job Description
About The Role: Grade Level (for internal use): 09

The Team: S&P Global's Brokerage, Research, Sales, and Trading group (BRS&T) provides market intelligence, data and technology solutions to all participants in the Global Markets. We're seeking a talented and highly motivated Engineer to help us provide 2nd line Application Support.

The Impact: This position is essential for the support of our business and for providing our clients with a best-in-class customer experience. This position will also offer employees an opportunity to help build a cloud native support program from the ground up.

What's in it for you:
- Solve interesting technical challenges in the areas of distributed high-performance computing for a highly available cloud environment.
- Close collaboration with Product and Technology leaders.
- Regular opportunities for promotion and advancement.

Responsibilities:
- Apply strong technical skills and good business knowledge together with investigative techniques to identify and resolve issues efficiently and in a timely manner.
- Work collaboratively with the development team as required for third line escalation.
- Implement and monitor system checks for early detection of potential problems, and raise the appropriate service outage ticket to initiate the incident management process when needed.
- Drive and engage in disaster recovery processes for all products.
- Coordinate with product and delivery teams to ensure the App Support team is ready for new releases and engaged in early design of new enhancements.
- Work on initiatives and the continuous improvement process around proactive application health monitoring, reporting, and technical support.

What We're Looking For:
- University graduate with a bachelor's degree in computer science or a computer engineering related field; a master's degree is a plus
- 2-4 years of work experience in an Application Support role
- Must have fundamental working knowledge of Oracle, SQL and RDBMS – including database query plan analysis and monitoring
- Knowledge of operating systems, especially Windows and Linux, is a must
- Good shell scripting experience is a must; ability to use Python scripting is an advantage
- Must have fundamental knowledge of networking basics and topology
- Strong working knowledge of AWS/CI-CD and some of its technologies such as Git, Elasticsearch, EC2, Redshift, etc. is an advantage
- Excellent listening, presentation and interpersonal skills
- Ability to communicate ideas in both technical and user-friendly language
- The ideal candidate is a self-starter capable of working independently as well as contributing to the team's requirements
- Be able to work flexible shift hours, including weekends, to meet work requirements and project deadlines

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world.
Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. 
If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 315774 Posted On: 2025-05-29 Location: Hyderabad, Telangana, India
Posted 1 week ago
0 years
3 - 9 Lacs
Hyderābād
On-site
Business Technology Analyst – Global Employer Services Technology Center Deloitte Tax Services India Private Limited (“Deloitte Tax in India”) commenced operations in June 2004. Since then, nearly all of the Deloitte Tax LLP (“Deloitte Tax”) U.S. service lines and regions have obtained support services through Deloitte Tax in India. We provide support through the tax transformation taking place in the marketplace. We offer a broad range of fully integrated tax services by combining technology and tax technical resources to uncover insights and smarter solutions for navigating an increasingly complex global environment. We provide opportunities to transform tax operations using contemporary technologies in the market. Individuals work to transform their current state of tax to the next generation of tax functions. Are you ready to take the next step in your career to find new methods and processes to assist clients in improving their tax operations using new technologies? If the answer is “Yes,” come join Global Employer Services Technology Center (GESTC) The Team Organizations today are faced with an increasingly complex global talent landscape. The workforce is more agile, diversified and on demand, leading organizations to re-evaluate their talent models and how they deploy teams globally. An ever-changing geo-political landscape and new tax digital strategies create opportunities for Deloitte to ensure we provide innovative solutions to keep our clients compliant. Global Employer Services (GES) is a market leading ~USD 1.3 billion business with a prestigious client portfolio delivering mobility, reward and compliance services enabled through technology solutions. We are offering a unique opportunity to join our GES Technology team of ~200 professionals worldwide. This high performing, successful team creates innovative new technology products to enable GES services where you will have the platform to drive, influence and contribute to the success of our business. Job purpose: The Data Analytics application Developer (SQL, SSIS) is responsible for partnering with Customers and the teams that achieve the goals of clients/customers. You will be working with cutting edge technology, database, and visualization via dashboards. The important skills for this position are SQL, Microsoft SSIS, data extraction, data modeling, data transformation, and DBA skills. The successful candidate will have a high level of attention to detail, the ability to execute and deliver project deliverables on budget and on time, and multi-task in a dynamic environment. This position requires significant customer contact, and you must possess excellent communication, consulting, critical thinking, quantitative analysis, and probing skills to effectively manage client expectations. Applicants should be able to function in a close team environment and communicate within the team. You will also be responsible for managing a team, assigning work, and reporting work status back to the Product team. 
Key job responsibilities: Developing and maintaining reporting and analytical tools, including dashboards Working with several large, complex SQL databases Experience working in SSRS and writing complex stored procedures (see the sketch after this posting) Knowledge of Bold reports will be advantageous Experience working on Redshift and Aurora will be beneficial Wrangling data from multiple sources to create integrated views that can be used to drive decision making Participating in the design and execution of qualitative or quantitative analyses to help clients with relevant insights Partnering with the technology teams to deliver a robust reporting platform Working with business owners to identify information needs and develop reports/dashboards Performing unit and system level testing on applications Experience managing a team Setting tasks for the team Reporting work status to the Product team Reviewing reports developed by other team members Education/Background: BTech/BSc in computer science or information technology Key skills desired 3 to 5 years of experience working on SSRS Strong knowledge of relational databases such as SQL Server, Oracle Good to have knowledge on any analytics tool (QlikView, QlikSense, Tableau) Knowledge of HTML, XML, JSON, Postman, REST API, MS Excel is a plus. Ability to develop large scale web/database applications Ability to simultaneously work on multiple projects effectively Ability to communicate clearly with business users and project managers Ability to innovate and provide functional applications with intuitive interfaces Ability to interact with individuals at all levels of the organization Ability to share knowledge and work effectively in a team Consistently meet client expectations and project deadlines Good interpersonal, organizational skills Strong commitment to client service excellence Work Location: Hyderabad Shift Timings: 11:00 AM to 8:00 PM || 2:00 PM to 11:00 PM #CA-GSD Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. 
From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 304050
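The Deloitte posting above centres on SQL Server, SSIS/SSRS and stored procedures rather than Python, so the following is only a loosely related, hedged sketch: a Python script that calls a parameterized stored procedure through pyodbc and hands the result set to pandas for a downstream report feed. The connection string, the procedure name (dbo.usp_AssignmentReport), and the column names are all hypothetical, not part of the role description.

```python
# Hedged sketch: pull a stored-procedure result set into pandas for a report feed.
import pandas as pd
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=reporting-db.example.com;"   # hypothetical server
    "DATABASE=GESReporting;"             # hypothetical database
    "Trusted_Connection=yes;"
)

def fetch_report_extract(start_date: str, end_date: str) -> pd.DataFrame:
    """Run a (hypothetical) stored procedure and return its result set as a DataFrame."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute(
            "EXEC dbo.usp_AssignmentReport @StartDate = ?, @EndDate = ?",
            start_date, end_date,
        )
        columns = [col[0] for col in cursor.description]
        rows = [tuple(r) for r in cursor.fetchall()]
    return pd.DataFrame.from_records(rows, columns=columns)

if __name__ == "__main__":
    df = fetch_report_extract("2025-01-01", "2025-03-31")
    # Light reshaping before the data feeds an SSRS/Bold Reports dashboard.
    print(df.groupby(df.columns[0]).size().sort_values(ascending=False).head(10))
```

In practice the heavy lifting in this stack usually stays inside SSIS packages and the stored procedures themselves; the Python layer here is just one possible glue step for ad-hoc analysis.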
Posted 1 week ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description We are looking for a motivated BIE to join the Amazon Global Cross Border Seller Programs team. The BIE will work side-by-side with the Product and Business team members as they support the team driving success and growth of Cross Border Seller Programs. This is an exciting opportunity to develop analytical insights and work directly with business and product leaders to shape Cross Border Strategy. Amazon Global is pioneering cross-border e-Commerce by becoming a one-stop shop for Amazon Customers Worldwide and providing a seamless experience to shop international selection. XB Seller Experience is a key driver for the growth of this business and we are looking for a dynamic, organized and self-starting BIE candidate to take on a critical role in developing cross border Seller reporting and analytics to deliver key insights to teams around the world for one of Amazon’s fastest growing businesses. Our team is a cross-section of product, program, and tech leaders who own the end-to-end strategy and execution of improving the XB seller experience to increase the global selection offering for worldwide customers. The optimal candidate will own the reporting and analytics for Seller and selection growth across 45+ countries, deep diving to develop insights that help drive our strategy. Our team works with product, tech, legal, ML, UX, and business teams across the world to deliver unique selection that delights our customers and Sellers around the world. As a Business Intelligence Engineer, you will work with our Business and Product Teams to create automated and scalable analytical tools that will ultimately be critical in driving the growth and expansion of the business. This includes building the infrastructure for regular reporting and diving deep to uncover insights that will impact business and product decisions. You will leverage data to provide key data-driven insights, enable new analyses, and innovate on behalf of our business customers. Basic Qualifications 3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling Bachelor's degree in BI, finance, engineering, statistics, computer science, mathematics, or an equivalent quantitative field Preferred Qualifications Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Master's degree in BI, finance, engineering, statistics, computer science, mathematics, or an equivalent quantitative field Experience developing and presenting recommendations of new metrics allowing better understanding of the performance of the business Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - Karnataka Job ID: A2965719
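As one way to read the "SQL to pull data ... Python to process data" qualification above, here is a hedged sketch: it queries a Redshift cluster over its PostgreSQL-compatible interface and pivots the result for a dashboard. The cluster endpoint, credentials, the table xb_seller_listings and its columns are invented for illustration, not an actual Amazon schema.

```python
# Hedged sketch: weekly seller/selection trend pull from Redshift, reshaped for a BI tool.
import pandas as pd
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

QUERY = """
    SELECT marketplace_id,
           DATE_TRUNC('week', snapshot_date) AS week,
           COUNT(DISTINCT seller_id)         AS active_sellers,
           SUM(live_asin_count)              AS live_selection
    FROM   xb_seller_listings
    WHERE  snapshot_date >= DATEADD(week, -12, CURRENT_DATE)
    GROUP  BY 1, 2
"""

def weekly_seller_trends() -> pd.DataFrame:
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder endpoint
        port=5439, dbname="analytics", user="bie_readonly", password="***",
    )
    try:
        df = pd.read_sql(QUERY, conn)
    finally:
        conn.close()
    # Pivot into a marketplace x week matrix that a QuickSight/Tableau sheet can consume.
    return df.pivot_table(index="week", columns="marketplace_id",
                          values="active_sellers", aggfunc="sum")

if __name__ == "__main__":
    print(weekly_seller_trends().tail())
```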
Posted 1 week ago
0 years
0 Lacs
Hyderābād
On-site
Business Technology Analyst – Global Employer Services Technology Center Deloitte Tax Services India Private Limited (“Deloitte Tax in India”) commenced operations in June 2004. Since then, nearly all of the Deloitte Tax LLP (“Deloitte Tax”) U.S. service lines and regions have obtained support services through Deloitte Tax in India. We provide support through the tax transformation taking place in the marketplace. We offer a broad range of fully integrated tax services by combining technology and tax technical resources to uncover insights and smarter solutions for navigating an increasingly complex global environment. We provide opportunities to transform tax operations using contemporary technologies in the market. Individuals work to transform their current state of tax to the next generation of tax functions. Are you ready to take the next step in your career to find new methods and processes to assist clients in improving their tax operations using new technologies? If the answer is “Yes,” come join Global Employer Services Technology Center (GESTC) The Team Organizations today are faced with an increasingly complex global talent landscape. The workforce is more agile, diversified and on demand, leading organizations to re-evaluate their talent models and how they deploy teams globally. An ever-changing geo-political landscape and new tax digital strategies create opportunities for Deloitte to ensure we provide innovative solutions to keep our clients compliant. Global Employer Services (GES) is a market leading ~USD 1.3 billion business with a prestigious client portfolio delivering mobility, reward and compliance services enabled through technology solutions. We are offering a unique opportunity to join our GES Technology team of ~200 professionals worldwide. This high performing, successful team creates innovative new technology products to enable GES services where you will have the platform to drive, influence and contribute to the success of our business. Job purpose: The Data Analytics application Developer (SQL, SSIS) is responsible for partnering with Customers and the teams that achieve the goals of clients/customers. You will be working with cutting edge technology, database and visualization via dashboards. The important skills for this position are SQL, Microsoft SSIS, data extraction, data modeling, data transformation, and DBA skills. The successful candidate will have a high level of attention to detail, the ability to execute and deliver project deliverables on budget and on time, and multi-task in a dynamic environment. This position requires significant customer contact and you must possess excellent communication, consulting, critical thinking, quantitative analysis and probing skills to effectively manage client expectations. Applicants should be able to function in a close team environment and communicate within the team. 
Key job responsibilities: Developing and maintaining reporting and analytical tools, including dashboards Working with several large, complex SQL databases Experience working in SSRS and writing complex stored procedures Knowledge of Bold reports will be advantageous Experience working on Redshift and Aurora will be beneficial Wrangling data from multiple sources to create integrated views that can be used to drive decision making Participating in the design and execution of qualitative or quantitative analyses to help clients with relevant insights Partnering with the technology teams to deliver a robust reporting platform Working with business owners to identify information needs and develop reports/dashboards Performing unit and system level testing on applications Education/Background: BTech/BSc in computer science or information technology Key skills desired 2 to 3 years of experience working on SSRS Strong knowledge of relational databases such as SQL Server, Oracle Good to have knowledge on any analytics tool (QlikView, QlikSense, Tableau) Knowledge of HTML, XML, JSON, Postman, REST API, MS Excel is a plus. Ability to develop large scale web/database applications Ability to simultaneously work on multiple projects effectively Ability to communicate clearly with business users and project managers Ability to innovate and provide functional applications with intuitive interfaces Ability to interact with individuals at all levels of the organization Ability to share knowledge and work effectively in a team Consistently meet client expectations and project deadlines Good interpersonal, organizational skills Strong commitment to client service excellence Work Location: Hyderabad Shift Timings: 11:00 AM to 8:00 PM || 2:00 PM to 11:00 PM #CA-GSD #CA-HPN Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 304048
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Us Yubi stands for ubiquitous. But Yubi will also stand for transparency, collaboration, and the power of possibility. From being a disruptor in India’s debt market to marching towards global corporate markets from one product to one holistic product suite with seven products Yubi is the place to unleash potential. Freedom, not fear. Avenues, not roadblocks. Opportunity, not obstacles. About Yubi Yubi, formerly known as CredAvenue, is re-defining global debt markets by freeing the flow of finance between borrowers, lenders, and investors. We are the world's possibility platform for the discovery, investment, fulfillment, and collection of any debt solution. At Yubi, opportunities are plenty and we equip you with tools to seize it. In March 2022, we became India's fastest fintech and most impactful startup to join the unicorn club with a Series B fundraising round of $137 million. In 2020, we began our journey with a vision of transforming and deepening the global institutional debt market through technology. Our two-sided debt marketplace helps institutional and HNI investors find the widest network of corporate borrowers and debt products on one side and helps corporates to discover investors and access debt capital efficiently on the other side. Switching between platforms is easy, which means investors can lend, invest and trade bonds - all in one place. All of our platforms shake up the traditional debt ecosystem and offer new ways of digital finance. Yubi Credit Marketplace - With the largest selection of lenders on one platform, our credit marketplace helps enterprises partner with lenders of their choice for any and all capital requirements. Yubi Invest - Fixed income securities platform for wealth managers & financial advisors to channel client investments in fixed income Financial Services Platform - Designed for financial institutions to manage co-lending partnerships & asset based securitization Spocto - Debt recovery & risk mitigation platform Corpository - Dedicated SaaS solutions platform powered by Decision-grade data, Analytics, Pattern Identifications, Early Warning Signals and Predictions to Lenders, Investors and Business Enterprises So far, we have on-boarded over 17000+ enterprises, 6200+ investors & lenders and have facilitated debt volumes of over INR 1,40,000 crore. Backed by marquee investors like Insight Partners, B Capital Group, Dragoneer, Sequoia Capital, LightSpeed and Lightrock, we are the only-of-its-kind debt platform globally, revolutionizing the segment. At Yubi, People are at the core of the business and our most valuable assets. Yubi is constantly growing, with 1000+ like-minded individuals today, who are changing the way people perceive debt. We are a fun bunch who are highly motivated and driven to create a purposeful impact. Come, join the club to be a part of our epic growth story. About The Role As a Senior DevOps Engineer, you will be part of a highly talented DevOps team who manages the entire infrastructure for Yubi. You will work with development teams to understand their requirements, optimize them to reduce costs, create scripts for creating and configuring them, maintain and monitor the infrastructure. As a financial services firm, security is of utmost concern to our firm and you will ensure that all data handled by the entire platform, key configurations, passwords etc. are secure from leaks. 
You will ensure that the platform is scaled to meet our user needs, performs optimally at all times, and that our users get a world-class experience using our software products. You will ensure that data, source code and configurations are adequately backed up to prevent loss of data. You will be well versed in tools to automate all such DevOps tasks. Responsibilities Troubleshoot web and backend applications and issues. Good understanding of multi-tier applications. Knowledge of AWS security, application security, and security best practices. SCA analysis, analyzing security reports, SonarQube profiles and gates. Able to draft solutions to improve security based on reporting. Lead, drive and implement highly scalable, highly available and complex solutions. Up to date with the latest DevOps tools and ecosystem. Excellent written and verbal communication. Requirements Bachelor’s/Master’s degree in Computer Science or equivalent work experience 3-6 years of working experience as a DevOps engineer AWS Cloud expertise is a must and the primary requirement; Azure/GCP cloud knowledge is a plus. Extensive knowledge and experience with major AWS services. Advanced AWS networking setup, routing, VPN, cross-account networking, use of proxies. Experience with AWS multi-account infrastructure. Infrastructure as code using CloudFormation or Terraform. Containerization – Docker/Kubernetes/ECS/Fargate. Configuration management tools such as Chef/Ansible/Salt. CI/CD - Jenkins/CodePipeline/CodeDeploy. Basic expertise in scripting languages such as shell, Python or Node.js. Adept at Continuous Integration/Continuous Deployment Experience working with source code repos like GitLab, GitHub or Bitbucket. Monitoring tools: CloudWatch agent, Prometheus, Grafana, New Relic, Dynatrace, Datadog, openapm, etc. ELK knowledge is a plus. Knowledge of chat-ops. Adept at using various operating systems: Windows, Mac and Linux Expertise in using command line tools such as the AWS CLI and Git, and in programming against AWS APIs. Experience with both SQL (RDS Postgres/MySQL) and NoSQL databases (MongoDB), data warehousing (Redshift), and data lakes. Knowledge and experience in instrumentation, metrics and monitoring concepts. Benefits YUBI is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, or age.
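Because the Yubi role stresses AWS security and automation, here is a minimal, hedged boto3 sketch of one such check: flagging S3 buckets whose public access block is missing or incomplete. It assumes AWS credentials are available in the environment; the bucket list simply comes from whichever account it runs against, and a real audit would cover far more controls (encryption, IAM, logging, and so on).

```python
# Hedged audit sketch: list S3 buckets without a fully enabled public access block.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def buckets_missing_public_access_block() -> list[str]:
    """Return bucket names that have no (or an incomplete) public access block."""
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            cfg = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
            if not all(cfg.values()):          # any of the four flags left off
                flagged.append(name)
        except ClientError as err:
            if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
                flagged.append(name)           # nothing configured at all
            else:
                raise
    return flagged

if __name__ == "__main__":
    for name in buckets_missing_public_access_block():
        print(f"[WARN] {name}: public access block not fully enabled")
```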
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description Amazon Corporate Security's (ACS) Business Assurance Center (BAC) is seeking a Business Intelligence Engineer who will help drive data-driven decision making through the development of metrics, dashboards, and analytical solutions. This role will be instrumental in understanding customer behavior, designing new models to connect disparate data sources, analyzing trends, and delivering actionable insights for business decision-making. Key job responsibilities Develop data visualizations & dashboard designs that meet a wide range of customer needs and raise the bar for analytical excellence Work cross-functionally with stakeholders to design and implement ETL pipelines Provide technical mentorship and guidance to engineers and analysts, including code reviews, architectural guidance, and career development support Dive deep into varied data sources to understand relationships between metrics, surface insights, and implement new data models Bring 3+ years of experience working cross-functionally with tech and non-tech teams Use statistical models with large, multidimensional datasets to uncover trends, patterns, and opportunities Identify and recommend opportunities to automate processes Perform ad hoc analysis to quickly solve time-sensitive operational issues Clearly communicate discrepancies and findings, including root cause analysis and resolution steps, to a broad user base A day in the life In a typical day, a BAC Business Intelligence Engineer might work to establish new metrics for security operations, analyze trends in security data, or develop dashboards for tracking key performance indicators. You'll prototype solutions for technical review by peers, implement ETL pipelines, and work independently to root cause data anomalies (a small illustrative sketch follows this posting). You'll be expected to understand business implications and recommend courses of action through crisp documentation for senior leaders. About The Team Our team is dedicated to supporting new members. We are committed to building an environment that celebrates knowledge sharing and mentorship. We care about your career growth as a passionate learner who is motivated to take on challenges. We value finding work-life harmony while respecting our 24/7/365 remit, and work collectively to ensure team members can cultivate a balance that supports a productive and well-balanced life. Our work to ensure the safety and security of Amazon employees is serious; our team values humor and embodies mutual respect to foster creativity and innovation to meet our mission. Basic Qualifications 3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience in Statistical Analysis packages such as R, SAS and Matlab Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling Preferred Qualifications Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI HYD 13 SEZ Job ID: A2944212
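A tiny, hedged illustration of the "root cause data anomalies" work mentioned above: a rolling z-score screen over a synthetic daily metric. Real BAC metrics, windows, and thresholds would differ; everything here is made up to show the shape of the analysis, not the team's actual method.

```python
# Hedged sketch: flag days whose value deviates strongly from the preceding window.
import numpy as np
import pandas as pd

def flag_anomalies(daily: pd.Series, window: int = 28, threshold: float = 3.0) -> pd.DataFrame:
    """daily: a Series indexed by date (e.g. events per day)."""
    prior = daily.shift(1).rolling(window, min_periods=window)  # compare to the preceding window
    zscore = (daily - prior.mean()) / prior.std()
    out = pd.DataFrame({"value": daily, "zscore": zscore})
    out["anomaly"] = out["zscore"].abs() >= threshold
    return out

if __name__ == "__main__":
    idx = pd.date_range("2025-01-01", periods=120, freq="D")
    rng = np.random.default_rng(7)
    series = pd.Series(100 + rng.normal(0, 5, len(idx)), index=idx)
    series.iloc[90] += 140                      # synthetic spike to demonstrate the flag
    flagged = flag_anomalies(series)
    print(flagged[flagged["anomaly"]])          # root-cause work would start from these rows
```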
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description Come be a part of a rapidly expanding $35 billion global business. At Amazon Business, a fast-growing startup passionate about building solutions, we set out every day to innovate and disrupt the status quo. We stand at the intersection of tech & retail in the B2B space developing innovative purchasing and procurement solutions to help businesses and organizations thrive. At Amazon Business, we strive to be the most recognized and preferred strategic partner for smart business buying. Bring your insight, imagination and a healthy disregard for the impossible. Join us in building and celebrating the value of Amazon Business to buyers and sellers of all sizes and industries. Unlock your career potential. As a BIE, you'll uncover business insights using product data that will help us make strategic decisions both for short-term priorities and long-term investments. You'll be working with the tech leaders to identify gaps in our data practices and help address them. You will partner with PMs, Solution Architects, TPMs and other stakeholders to run data analysis for some of the challenging customer problems. Key job responsibilities Develop and lead design and execution of reporting solutions for product data Work with product managers, data engineers, data scientists, and software engineers to define reporting needs Build self-service tools and dashboards that not only give aggregated metrics but also provide ways to pull granular data for defect verification, customer communication, deep-dives, etc. Perform deep-dives and data analysis. Provide recommendations on new metrics or business decisions. Mentor teammates, provide inputs in team and org level strategic planning, lead cross org discussions Internal Job Description To apply for this role, candidates must initiate an informational meeting with the hiring manager by clicking the "Request Informational" button at the top of the job listing. For more on informational discussions, visit: https://ivy-help-center.talent.a2z.com/article/2gLzMFp5I3TMfchHDv03w2?ref=share-button To understand the internal transfer process and frequently asked questions, refer to: https://ivy-help-center.talent.a2z.com/article/5TdnwN6zJXjHvREhjtTdAJ?ref=share-button Visit our Internal Candidate Resource for fast answers to frequently asked questions (FAQs) from pre-application to pre-offer stages of the internal transfer process within My HR. Note, this resource is currently accessible for WWAS employees only. Internal Candidate Resource Center: https://atoz.amazon.work/myhr/category/my_employment/type/internal_transfers Basic Qualifications 3+ years of analyzing and interpreting data with Redshift, Oracle, NoSQL etc. experience Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience in Statistical Analysis packages such as R, SAS and Matlab Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling Preferred Qualifications Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers.
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI HYD 13 SEZ - H84 Job ID: A2945610
Posted 1 week ago
5.0 years
4 - 9 Lacs
Bengaluru
On-site
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com. Job Description Role Purpose The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at the client end, and to ensure they meet 100% quality assurance parameters. Responsibilities: Design and implement the data modeling, data ingestion and data processing for various datasets Design, develop and maintain an ETL framework for various new data sources Develop data ingestion using AWS Glue/EMR and data pipelines using PySpark, Python and Databricks (a short illustrative PySpark sketch follows this posting) Build orchestration workflows using Airflow & Databricks Job workflows Develop and execute ad hoc data ingestion to support business analytics Proactively interact with vendors for any questions and report the status accordingly Explore and evaluate tools/services to support business requirements Ability to learn to create a data-driven culture and impactful data strategies Aptitude towards learning new technologies and solving complex problems Qualifications: Minimum of a bachelor’s degree, preferably in Computer Science, Information Systems, or Information Technology. Minimum 5 years of experience on cloud platforms such as AWS, Azure, GCP. Minimum 5 years of experience in Amazon Web Services like VPC, S3, EC2, Redshift, RDS, EMR, Athena, IAM, Glue, DMS, Data Pipeline & API, Lambda, etc. Minimum of 5 years of experience in ETL and data engineering using Python, AWS Glue, AWS EMR/PySpark and Airflow for orchestration. Minimum 2 years of experience in Databricks including Unity Catalog, data engineering, job workflow orchestration and dashboard generation based on business requirements. Minimum 5 years of experience in SQL, Python, and source control such as Bitbucket, CICD for code deployment. Experience in PostgreSQL, SQL Server, MySQL & Oracle databases. Experience in MPP platforms such as AWS Redshift, AWS EMR, Databricks SQL warehouse & compute clusters. Experience in distributed programming with Python, Unix scripting, MPP, RDBMS databases for data integration Experience building distributed high-performance systems using Spark/PySpark, AWS Glue and developing applications for loading/streaming data into Databricks SQL warehouse & Redshift. Experience in Agile methodology Proven skills to write technical specifications for data extraction and good quality code. Experience with big data processing techniques using Sqoop, Spark, Hive is an additional plus Experience in data visualization tools including PowerBI, Tableau. Nice to have: experience in UI development using the Python Flask framework and Angular Mandatory Skills: Python for Insights. Experience: 5-8 Years. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry.
It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
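The Wipro posting above describes Glue/EMR/Databricks pipelines built with PySpark and orchestrated by Airflow. The sketch below shows the general shape of such a batch ingestion job in plain PySpark; the S3 paths, column names and casts are invented, and a real Glue or Databricks job would add its own catalog integration, job bookmarks and scheduling on top of this.

```python
# Hedged PySpark sketch: read a raw landing zone, apply basic typing/deduplication,
# and write a partitioned curated layer that Athena/Redshift Spectrum could query.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-ingest").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("s3://example-raw-zone/orders/2025-05-29/"))      # hypothetical landing path

curated = (raw
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .dropDuplicates(["order_id"])
           .filter(F.col("amount") > 0)
           .withColumn("ingest_date", F.current_date()))

(curated.write
 .mode("overwrite")
 .partitionBy("ingest_date")
 .parquet("s3://example-curated-zone/orders/"))               # hypothetical curated path

spark.stop()
```

In an Airflow-orchestrated setup, a job of this shape would typically be wrapped as a scheduled task (for example a Spark submit or a Glue job trigger) rather than run by hand.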
Posted 1 week ago
2.0 years
5 - 9 Lacs
Bengaluru
On-site
Mission: At Databricks, we are on a mission to empower our customers to solve the world's toughest data problems by utilising the Data Intelligence platform. As a Scale Solution Engineer, you will play a critical role in advising Customers in their onboarding journey. You will directly work with customers to help them onboard and deploy Databricks in their Production environment. The impact you will have: You will ensure new customers have an excellent experience by providing them with technical assistance early in their journey. You will become an expert on the Databricks Platform and guide customers in making the best technical decisions to achieve their goals. You will work on multiple tactical customers to track and report their progress. What we look for: 2+ years of industry experience Early-career technical professional ideally in data-driven or cloud-based roles. Knowledge of at least one of the public cloud platforms AWS, Azure, or GCP is required. Knowledge of a programming language - Python, Scala, or SQL Knowledge of end-to-end data analytics workflow Hands-on professional or academic experience in one or more of the following: Data Engineering technologies (e.g., ETL, DBT, Spark, Airflow) Data Warehousing technologies (e.g., SQL, Stored Procedures, Redshift, Snowflake) Excellent time management & presentation skills Bonus - Knowledge of Data Science and Machine Learning (e.g., build and deploy ML Models)
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
On-site
Responsibilities Design, develop, and maintain scalable data pipelines and ETL processes Optimize data flow and collection for cross-functional teams Build infrastructure required for optimal extraction, transformation, and loading of data Ensure data quality, reliability, and integrity across all data systems Collaborate with data scientists and analysts to help implement models and algorithms Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc. Create and maintain comprehensive technical documentation Evaluate and integrate new data management technologies and tools Requirements 3-5 years of professional experience in data engineering roles Bachelor's degree in Computer Science, Engineering, or related field; Master's degree preferred Job Description Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TIDB, MySQL, Oracle, Teradata) Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink) Proficiency in at least one programming language such as Python, Java, or Scala Experience with data modeling, data warehousing, and building ETL pipelines Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi) Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred Hands-on experience with building streaming pipelines with Flink, Kafka, or Kinesis (Flink preferred) Understanding of data governance and data security principles Experience with version control systems (e.g., Git) and CI/CD practices Preferred Skills Experience with containerization and orchestration tools (Docker, Kubernetes) Basic knowledge of machine learning workflows and MLOps Experience with NoSQL databases (MongoDB, Cassandra, etc.) Familiarity with data visualization tools (Tableau, Power BI, etc.) Experience with real-time data processing Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.) Experience with infrastructure-as-code tools (Terraform, CloudFormation) Personal Qualities Strong problem-solving skills and attention to detail Excellent communication skills, both written and verbal Ability to work independently and as part of a team Proactive approach to identifying and solving problems
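Since this posting asks for hands-on streaming experience with Kafka or Kinesis (Flink preferred), here is a deliberately small, hedged validate-and-route consumer using kafka-python. The topic names, required fields, and broker address are assumptions, and a Flink job would express the same step as a stream operator rather than a Python loop.

```python
# Hedged sketch: consume raw events, keep valid ones, quarantine the rest.
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "orders.raw",                                   # hypothetical input topic
    bootstrap_servers=["localhost:9092"],
    group_id="orders-curation",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],
    value_serializer=lambda obj: json.dumps(obj).encode("utf-8"),
)

REQUIRED = {"order_id", "customer_id", "amount"}    # assumed event schema

for message in consumer:
    event = message.value
    valid = (isinstance(event, dict)
             and REQUIRED.issubset(event)
             and isinstance(event.get("amount"), (int, float))
             and event["amount"] > 0)
    if valid:
        producer.send("orders.curated", event)      # clean events move downstream
    else:
        producer.send("orders.deadletter", event)   # quarantined for inspection
```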
Posted 1 week ago
5.0 years
3 - 5 Lacs
Bengaluru
On-site
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com. Python+AWS Backend Developer Role Role: A skilled Backend Developer with strong expertise in Python and experience working with AWS Cloud environments, responsible for building scalable backend systems and APIs using modern cloud-native approaches. Familiarity with AI/ML services such as Bedrock is a plus. Key Responsibilities: Design, implement and maintain scalable backend solutions using Python and AWS services Develop and maintain RESTful APIs and microservices to support applications and integrations Work with serverless (AWS Lambda) and/or containerized architectures Use DynamoDB, RDS, S3, Redshift, API Gateway and CloudWatch as part of application development and deployment Ensure backend systems are robust, secure and meet performance expectations Write clean, maintainable code with appropriate testing and documentation Collaborate closely with frontend, DevOps and data engineering teams Support CI/CD pipelines and infrastructure as code using tools like CloudFormation Required Qualifications 5+ years of experience in backend development using Python Strong knowledge of core AWS services such as Lambda, DynamoDB, S3, IAM, Redshift, CloudWatch, API Gateway and notification services Experience building and consuming RESTful APIs Familiarity with microservices and comfortable working with Git and CI/CD tools Solid understanding of cloud security, performance, optimization and monitoring Nice to Have Skills Exposure to AWS Bedrock or other generative AI/LLM services Knowledge of FastAPI, asyncio or similar Python frameworks Familiarity with observability and monitoring tools Understanding of Infrastructure as Code tools Mandatory Skills: Python Application Programming. Experience: 5-8 Years. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
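A minimal, hedged sketch of the API Gateway → Lambda → DynamoDB pattern this posting describes. The orders table, its key schema, and the request fields are placeholders; a production handler would add input validation, idempotency, and structured error handling.

```python
# Hedged sketch: Lambda handler for POST /orders behind an API Gateway proxy integration.
import json
import uuid
import boto3

dynamodb = boto3.resource("dynamodb")
TABLE = dynamodb.Table("orders")          # hypothetical table with partition key "order_id"

def lambda_handler(event, context):
    body = json.loads(event.get("body") or "{}")
    if "customer_id" not in body or "amount" not in body:
        return {"statusCode": 400,
                "body": json.dumps({"error": "customer_id and amount are required"})}

    item = {
        "order_id": str(uuid.uuid4()),
        "customer_id": body["customer_id"],
        "amount": str(body["amount"]),    # boto3 rejects floats; send numbers as Decimal/str
        "status": "CREATED",
    }
    TABLE.put_item(Item=item)
    return {"statusCode": 201, "body": json.dumps(item)}
```

The same handler shape works whether the function is wired up through API Gateway proxy integration, a Lambda function URL, or an ALB target; only the event parsing differs slightly.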
Posted 1 week ago
0 years
4 - 7 Lacs
Chennai
On-site
Job Description: Lead AWS Data Engineer Job Location: Hyderabad / Bangalore / Chennai / Noida / Gurgaon / Pune / Indore / Mumbai / Kolkata We are seeking a skilled Lead AWS Data Engineer with strong programming and SQL skills to join our team. The ideal candidate will have hands-on experience with AWS Data Analytics services and a basic understanding of general AWS services. Additionally, prior experience with Oracle and Postgres databases and secondary skills in Python and Azure DevOps will be an advantage. Key Responsibilities: Design, develop, and optimize data pipelines using AWS Data Analytics services such as RDS, DMS, Glue, Lambda, Redshift, and Athena. Implement data migration and transformation processes using AWS DMS and Glue. Work with SQL (Oracle & Postgres) to query, manipulate, and analyse large datasets. Develop and maintain ETL/ELT workflows for data ingestion and transformation. Utilize AWS services like S3, IAM, CloudWatch, and VPC to ensure secure and efficient data operations. Write clean and efficient Python scripts for automation and data processing. Collaborate with DevOps teams using Azure DevOps for CI/CD pipelines and infrastructure management. Monitor and troubleshoot data workflows to ensure high availability and performance. Preferred Qualifications: AWS certifications in Data Analytics, Solutions Architect, or DevOps. Experience with data warehousing concepts and data lake implementations. Hands-on experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation. Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers typically through online services, such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of their illegitimate recruiting process. DXC does not make offers of employment via social media networks and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor does it ask a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
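Given the S3/Glue/Athena stack in the DXC description, here is a hedged sketch of running an ad-hoc Athena query from Python with boto3 and reading the results back. The database, table, region, and results bucket are invented for illustration.

```python
# Hedged sketch: submit an Athena query, poll until it finishes, and return the rows.
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

def run_athena_query(sql: str, database: str, output: str) -> list[list[str]]:
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]

    while True:                                   # simple polling; Step Functions could replace this
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")

    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    return [[col.get("VarCharValue", "") for col in row["Data"]] for row in rows]

if __name__ == "__main__":
    print(run_athena_query(
        "SELECT order_date, COUNT(*) FROM orders GROUP BY 1 ORDER BY 1 DESC LIMIT 7",
        database="curated",                       # hypothetical Glue catalog database
        output="s3://example-athena-results/",    # hypothetical results bucket
    ))
```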
Posted 1 week ago
The job market for Redshift professionals in India is growing rapidly as more companies adopt cloud data warehousing solutions. Amazon Redshift, a powerful data warehouse service provided by Amazon Web Services, is in high demand due to its scalability, performance, and cost-effectiveness. Job seekers with expertise in Redshift can find a wide range of opportunities in various industries across the country.
The average salary range for Redshift professionals in India varies based on experience and location. Entry-level positions can expect a salary in the range of INR 6-10 lakhs per annum, while experienced professionals can earn upwards of INR 20 lakhs per annum.
In the field of Redshift, a typical career path may include roles such as Junior Developer, Data Engineer, Senior Data Engineer, Tech Lead, and Data Architect.
Apart from expertise in Redshift, proficiency in the following skills can be beneficial: SQL, ETL tools, data modeling, cloud computing (AWS), and Python/R programming.
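To make the SQL and data-modeling pairing above concrete, here is a small, hedged example of a Redshift fact-table DDL with a distribution key and sort key, executed through the PostgreSQL-compatible psycopg2 driver. The cluster endpoint, schema, and columns are placeholders, not a recommendation for any particular workload.

```python
# Hedged sketch: create a Redshift fact table with DISTKEY/SORTKEY chosen for joins and date scans.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS analytics.fact_orders (
    order_id     BIGINT        NOT NULL,
    customer_id  BIGINT        NOT NULL,
    order_date   DATE          NOT NULL,
    amount       DECIMAL(12,2)
)
DISTSTYLE KEY
DISTKEY (customer_id)      -- co-locate rows that are joined on customer_id
SORTKEY (order_date);      -- range-restricted scans on date predicates
"""

with psycopg2.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",  # placeholder endpoint
    port=5439, dbname="analytics", user="etl_user", password="***",
) as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)
```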
As the demand for Redshift professionals continues to rise in India, job seekers should focus on honing their skills and knowledge in this area to stay competitive in the job market. By preparing thoroughly and showcasing their expertise, candidates can secure rewarding opportunities in this fast-growing field. Good luck with your job search!