0 years
0 Lacs
Pune, Maharashtra, India
On-site
Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers who take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!

Job Description
Analyze business problems to be solved with automated systems. Provide technical expertise in identifying systems that are cost-effective and meet user requirements. Configure system settings and options; plan and build unit, integration, and acceptance testing; and create specifications for systems to meet our requirements. Design details of automated systems and provide consultation to users on them. You will report to the IT Engineering Manager and work in a hybrid capacity from our Hinjewadi-Pune, India office.
Your Responsibilities
- Analyze existing SAP data and extract insights to support smart decisions, working with developers on the e-commerce project.
- Prepare new products for SAP by establishing linkages to taxonomy, the classification system, images, documentation, and drawings.
- Publish new products to the online catalogue.
- Monitor SAP data quality and completeness, and maintain SAP data.
- Carry out the SAP translation process and build SAP enrichment/improvement projects.
- Monitor and support data integrations.

The Essentials - You Will Have
- Bachelor's degree in computer science, management information systems, engineering, or a related field.
- Experience with data setup.
- Experience working with external data sources: establishing processes to load data and 24x7 site maintenance, e.g. tax, product availability, pricing.
- Migration: developing tools to migrate transactional data from old to new systems.
- Export/reporting: established processes to extract and transfer data to other systems and data layers, e.g. ROKFusion (with respect to Rockwell) and other similar systems and tools.

The Preferred - You Might Also Have
- Working knowledge of a broad range of industrial automation products.
- Familiarity with ERP material master data concepts, including configuration, and maintaining data in SAP systems.
- Ability to adapt to new technologies and changing requirements.
- Ability to work with multiple partners and influence project decisions.
- Temperament to assist colleagues through change, support change management processes, and adapt to competing demands.

IPC - Information Processing Capability (Factors of Complexity)
- Ability to work on issues of moderate scope where analysis of situations or data requires a review of relevant factors.
- Exercise judgement within defined practices to determine appropriate action.
- Apply process improvements to facilitate improved outcomes.
- Implement processes across the business/function to achieve assigned goals.
- Distil information from different data sources, tell the "story" behind it, and recommend next steps.
- Accepts role requirements.

What We Offer
Our benefits package includes:
- Comprehensive mindfulness programs with a premium membership to Calm.
- Volunteer paid time off, available after 6 months of employment for eligible employees.
- Company volunteer and donation matching program - your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation.
- Employee Assistance Program.
- Personalized wellbeing programs through our OnTrack program.
- On-demand digital course library for professional development, and other local benefits!

At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles.
Under Rockwell Automation’s hybrid policy, employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
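The data-quality monitoring described in this role can be sketched loosely in Python. The field names (`material_id`, `taxonomy`, `classification`, `image_url`) are illustrative assumptions, not Rockwell's actual SAP schema:

```python
# Minimal sketch of a product-master completeness check.
# Field names are illustrative, not an actual SAP schema.
REQUIRED_FIELDS = ["material_id", "taxonomy", "classification", "image_url"]

def completeness_report(products):
    """Return per-field completeness ratios across product records."""
    missing = {field: 0 for field in REQUIRED_FIELDS}
    for product in products:
        for field in REQUIRED_FIELDS:
            if not product.get(field):  # absent, None, or empty string
                missing[field] += 1
    total = len(products)
    # Completeness ratio per field: 1.0 means no gaps.
    return {f: (total - n) / total for f, n in missing.items()} if total else {}

products = [
    {"material_id": "M-100", "taxonomy": "drives", "classification": "AC", "image_url": "a.png"},
    {"material_id": "M-101", "taxonomy": "", "classification": "DC", "image_url": None},
]
report = completeness_report(products)
print(report["material_id"], report["taxonomy"])  # 1.0 0.5
```

A report like this could feed the "monitor SAP data quality and completeness" duty by flagging fields whose ratio drops below an agreed threshold.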
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview
Job Title: Database Engineer
Location: Pune, India
Corporate Title: AVP

Role Description
We are looking for a talented and experienced software developer with strong technical expertise in SQL Server and .NET technologies. The ideal candidate will have a deep understanding of software development principles and demonstrate excellent problem-solving abilities. This role requires both technical proficiency and strong communication skills to collaborate effectively within a dynamic environment.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your Key Responsibilities
- Maintain and develop our risk platform (Risk Navigator), ensuring its efficiency and reliability, with a focus on database (SQL Server) programming and optimization.
- Write clean, maintainable, and efficient code following industry best practices.
- Adhere to software development standards, ensuring modular, reusable, and well-documented solutions.
- Implement rigorous testing strategies, including unit tests, integration tests, and performance optimizations.
- Collaborate closely with the engineering team and stakeholders to ensure seamless integration of new features and solutions.
- Contribute to (and later work on) building a strategy to migrate the data into Google Cloud.

Your Skills And Experience
- Several years of experience in programming and data management.
- Strong proficiency in SQL Server.
- Deep understanding of clean code principles and design patterns.
- Exceptional team player with outstanding collaboration skills.
- Familiarity with SDLC tools, including Git and Jira.
- Fluency in written and spoken English.
- Optional but highly beneficial: experience with SSIS and SSRS; experience with data management; experience in financial business, asset management, and/or risk management.

How We’ll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
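The unit-testing expectation in this role can be illustrated with Python's built-in unittest module. This is a generic sketch: `weighted_exposure` is a hypothetical stand-in for a risk calculation, not part of the actual Risk Navigator platform:

```python
# Generic unit-test sketch; weighted_exposure is a hypothetical
# stand-in for a risk-platform calculation, not an actual API.
import unittest

def weighted_exposure(positions, weights):
    """Sum position values scaled by their risk weights."""
    if len(positions) != len(weights):
        raise ValueError("positions and weights must align")
    return sum(p * w for p, w in zip(positions, weights))

class TestWeightedExposure(unittest.TestCase):
    def test_basic_sum(self):
        self.assertEqual(weighted_exposure([100, 200], [0.5, 0.25]), 100.0)

    def test_mismatched_lengths(self):
        with self.assertRaises(ValueError):
            weighted_exposure([1], [0.5, 0.5])

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestWeightedExposure)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.testsRun, result.wasSuccessful())  # 2 True
```

The same structure carries over to integration tests, which would exercise the function against a real SQL Server fixture instead of in-memory values.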
Posted 2 weeks ago
0 years
3 - 6 Lacs
Gurgaon
On-site
Role: Cloud Database Engineer IV
Skills: MS SQL + Cloud + any secondary DB
Shift: Should be comfortable with rotational 24x7 shifts
Experience: At least 12 yrs of relevant experience

Job Profile Summary
The Cloud Database Engineer performs database engineering and administration activities including design, planning, configuration, monitoring, automation, self-serviceability, alerting, space management, and database backup and recovery. Plans computerized databases, including base definition, structure, documentation, long-range requirements, operational guidelines, and protection, with a capacity to lead and advise on migration and modernization, and to discover and execute workload migrations to the cloud (AWS/Azure/GCP).

Key Responsibilities
- Create, maintain, and use Standard Operating Procedures (SOPs) for migration execution, and ensure the long-term technical viability and optimization of production deployments and administration.
- Engage, consult, and deliver based on interactive customer communications, streamlining project deliverables and scope of work.
- Capacity planning: forecast future database growth based on usage trends and plan hardware and storage requirements accordingly to ensure scalability and optimal performance.
- Plan, create, manage, and deploy effective high availability and disaster recovery strategies/runbooks.
- Patch management and upgrades: plan and execute database software upgrades, patches, and service packs.
- Troubleshooting and issue resolution: investigate and resolve complex database-related issues, including data corruption, performance problems, and connectivity challenges.
- Automation and scripting: contribute to automation scripts and tools to streamline repetitive tasks, improve efficiency, and reduce the risk of human error.
- Monitoring and alerting: set up monitoring and alerting systems to proactively identify and address potential database issues before they become critical.
- Performance analysis and reporting: generate performance reports and analysis for stakeholders and management to provide insights into the database environment's health and performance.
- Documentation: maintain up-to-date documentation of database configurations, procedures, and troubleshooting steps.
- Ticket handling: work to resolve incidents, changes, and service requests under the agreed client SLA.
- Problem management: responsible for resolving problem tickets and creating detailed RCA reports.
- Participate in 24x7 production support for database operations.
- Hands-on with cloud migration tools such as AWS DMS, SMS, Application Migration Service, Migration Hub, Azure Migrate, Data Migration Service, SQL Server DMA, Azure ASR, and AWS DRS.
- Migration from/to SQL Server and other RDBMS platforms for PaaS models like AWS Aurora, AWS RDS, Azure Database, Azure MI, and GCP Cloud SQL.
- Understand cloud basics and perform duties like security management, storage management, backup vaults, key vaults, and server/DB monitoring.
- Cost optimization: compute and workload analysis, license enhancements and features.

Knowledge
- Proficient in SQL Server architecture, installation and configuration, performance tuning, high availability and disaster recovery (HADR), monitoring, and troubleshooting.
- Database migrations and upgrades: experience in planning and executing database migrations and upgrades, including version compatibility, testing, and minimizing downtime.
- Ability to deploy, manage, and troubleshoot HADR configurations in one of the following tech buckets: SQL Server (Always On, FCI, log shipping, replication); MySQL or PostgreSQL (master-slave replication, InnoDB Cluster Set).
- Homogeneous and heterogeneous migrations between various tech buckets (SQL Server and PostgreSQL or MySQL).
- SQL Server in the cloud: knowledge of deploying and managing SQL Server in cloud platforms such as Azure SQL Database and Amazon RDS.
- SQL Server best practices: familiarity with industry best practices for SQL Server administration, including configuration settings, maintenance tasks, and disaster recovery strategies.
- Ability to communicate technical information and ideas so others will understand.
- Ability to apply varying leadership skills and traits to create solutions and results in unexpected situations.

About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year, according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology
Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
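The monitoring-and-alerting duty in this role reduces to comparing collected metrics against thresholds. A minimal Python sketch of that loop follows; the metric names and threshold values are illustrative assumptions, not a specific monitoring product's API:

```python
# Hedged sketch of a threshold-based database monitoring check.
# Metric names and thresholds are illustrative assumptions.
def evaluate_alerts(metrics, thresholds):
    """Compare collected metrics against thresholds; return alert messages."""
    alerts = []
    for name, value in metrics.items():
        limit = thresholds.get(name)
        if limit is not None and value >= limit:
            alerts.append(f"ALERT: {name}={value} breached threshold {limit}")
    return alerts

# Example: space management and connection pressure on one instance.
metrics = {"data_file_used_pct": 91, "active_connections": 140, "cpu_pct": 55}
thresholds = {"data_file_used_pct": 90, "active_connections": 200, "cpu_pct": 85}
for line in evaluate_alerts(metrics, thresholds):
    print(line)  # ALERT: data_file_used_pct=91 breached threshold 90
```

In practice the metrics dict would be populated from the database's own views (e.g. DMVs on SQL Server) or a cloud monitoring agent, and the alerts routed to a paging or ticketing system.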
Posted 2 weeks ago
3.0 - 5.0 years
3 - 6 Lacs
India
Remote
Job Title: Shopify Developer (3–5 Years Experience)
Location: [On-site]
Employment Type: Full-Time
Experience Required: 3 to 5 Years
Industry: E-commerce / IT Services
Address: SHILP AARON, 705, Sindhu Bhavan Marg, Opposite Armieda Complex, Bodakdev, Ahmedabad, Gujarat 380059

About the Company:
Enstacked Technologies is a leading digital solutions company specializing in e-commerce development, UI/UX design, and digital marketing. We partner with global clients to create high-performing online stores and scalable Shopify solutions.

Job Summary:
We are looking for a talented Shopify Developer with 3 to 5 years of professional experience in developing and customizing Shopify stores. The ideal candidate should have a solid understanding of Shopify's ecosystem, including custom theme development, third-party integrations, and performance optimization.

Key Responsibilities:
- Design, develop, and maintain responsive Shopify themes from scratch or by customizing existing ones.
- Implement mobile-first design and ensure cross-browser compatibility.
- Use Shopify's Liquid templating language to build dynamic storefront experiences.
- Integrate third-party APIs, Shopify apps, and custom apps as needed.
- Customize and optimize checkout processes, cart functionality, and product displays.
- Collaborate with designers and project managers to translate design mockups into working Shopify code.
- Conduct site audits and implement performance improvements and SEO enhancements.
- Manage version control using Git and follow best practices for code quality and deployment.
- Migrate sites to Shopify from other platforms (e.g., WooCommerce, Magento, BigCommerce).
- Stay up to date with the latest Shopify features, tools, and trends.

Required Skills & Qualifications:
- 3 to 5 years of hands-on experience with Shopify development.
- Proficiency in Liquid, HTML5, CSS3, JavaScript, and jQuery.
- Experience with Shopify CLI, Shopify APIs, and app integration.
- Familiarity with design tools such as Figma, Adobe XD, or Sketch.
- Solid understanding of responsive design, web accessibility, and SEO fundamentals.
- Basic knowledge of Git, version control, and deployment processes.
- Understanding of third-party integrations like payment gateways, shipping, and CRM tools.
- Experience with Shopify Plus, metafields, custom schema, and section-based architecture is a plus.

Preferred Skills (Good to Have):
- Experience developing custom Shopify apps using Node.js or PHP.
- Knowledge of headless Shopify or the Hydrogen framework.
- Exposure to tools like Klaviyo, Google Tag Manager, and Google Analytics.
- Shopify certification is an added advantage.

Work Environment & Benefits:
- Competitive salary based on experience.
- Flexible working hours; remote work available in certain situations.
- Supportive and collaborative work culture.
- Professional development and upskilling opportunities.

Job Type: Full-time
Pay: ₹30,000.00 - ₹50,000.00 per month
Application Question(s):
- What is your current CTC per year?
- What is your expected CTC per year?
- What is your notice period in days?
Experience: Shopify Developer: 3 years (Required)
Work Location: In person
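Shopify themes build dynamic storefronts by piping product data through Liquid filters such as `money`. As a rough Python analogy (this is not Liquid itself; the cents-based price and "$" formatting are illustrative assumptions):

```python
# Rough Python analogy of Liquid-style template rendering with a
# "money" filter. Liquid itself is Shopify's templating language;
# the cents-based price and "$" formatting are assumptions here.
def money(cents, symbol="$"):
    """Format an integer amount in cents as a currency string."""
    return f"{symbol}{cents / 100:.2f}"

def render(template, context, filters):
    """Substitute "{{ name | filter }}" and "{{ name }}" placeholders."""
    for key, value in context.items():
        for fname, fn in filters.items():
            pattern = f"{{{{ {key} | {fname} }}}}"
            if pattern in template:
                template = template.replace(pattern, fn(value))
        template = template.replace(f"{{{{ {key} }}}}", str(value))
    return template

template = "{{ title }} - {{ price | money }}"
product = {"title": "Canvas Tote", "price": 1999}
print(render(template, product, {"money": money}))  # Canvas Tote - $19.99
```

Real Liquid adds control flow, loops over collections, and a large filter library on top of this substitution idea.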
Posted 2 weeks ago
5.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About us
With our de-licious fresh meat and seafood, we are Licious — India’s leading D2C food-tech brand. Founded in 2015 by Abhay Hanjura and Vivek Gupta, we are headquartered in Bengaluru, delivering high-quality, fresh meat and seafood to over 32 lakh customers across 20 cities. Ready to add your unique flavor to Licious? Read on!

Role Overview
As a Senior Software Engineer – Backend at Licious, you will design, build, and scale high-performance backend systems, migrate applications to microservices, and drive best practices in architecture and coding. If this excites you, we want you on our team!

Responsibilities
- Scope and create technical documents for reference and reporting.
- Define High-Level (HLD) and Low-Level Designs (LLD) to guide development.
- Build, own, and optimize high-scale modules from scratch, while reimagining existing ones for scalability.
- Migrate applications to a microservices architecture, ensuring reusable code and libraries.
- Review code, enforce best practices, and mentor SDE-I & SDE-II engineers through continuous collaboration.
- Translate business requirements into scalable technical solutions.
- Implement DevOps best practices, contribute to CI/CD pipelines, and drive agile development.

What makes you a magic ingredient to our recipe?
- 5-8 years of experience under your belt, preferably in a consumer product company.
- Proficiency in Java programming with a strong understanding of MySQL/NoSQL.
- DevOps exposure with knowledge of CI/CD, Kubernetes basics, and the Horizontal Pod Autoscaler.
- Experience in scalable system design, distributed systems, and design patterns.
- Hands-on experience with message brokers, caching, and writing unit test cases.
- Strong communication skills to articulate business and technical ideas effectively.
- BE/BTech or an equivalent degree.

Location - Onsite, Bengaluru
If you're excited to be part of Licious, apply through this Form
Posted 2 weeks ago
10.0 years
0 Lacs
Mulshi, Maharashtra, India
On-site
Area(s) of responsibility: Oracle Cloud Technical Lead

Responsibilities
- A minimum of 10 years’ experience in an Oracle Cloud technical development role.
- Sound knowledge of Oracle SaaS cloud data migrations and inbound integrations using File Based Data Import (FBDI), FBDI automation using Oracle SOA Suite and Java, inbound SOAP web services, inbound REST APIs, the ADFdi Spreadsheet Data Loader, and file import using the UCM web service.
- Hands-on experience with Oracle Cloud reporting tools: BI Publisher (BIP), BIP bursting, securing BIP reports, Oracle Transactional Business Intelligence (OTBI), OTBI analyses, dashboards, drill-down reports in OTBI, using OTBI analyses in BIP, and securing OTBI reports.
- Working knowledge of ESS job submission and scheduling: creating custom ESS jobs, parameterizing ESS jobs, LOVs with lookups and value sets, and securing ESS jobs.
- Exposure to extensions and customizations: sandboxes, creating infolets, customizing standard Fusion UI/pages, integrating external applications, Application Composer, and migrating customizations.
- OIC integration to import bulk data using FBDI is a plus.
- Design, develop, and support integrations in OIC to Oracle ERP Cloud, including extracting Oracle ERP Cloud data using BI Publisher reports, analyses, and OTBI reports.
- Provide hands-on technical and development support for implemented Oracle ERP Cloud modules.
- Fusion Cloud security experience: Security Console, managing users and roles, role provisioning, and data access.
- Knowledge of Oracle interface tables in the financial and procurement modules.
- Hands-on experience with XSLT.
Posted 2 weeks ago
5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
About Us
We are an innovative tech team seeking talented full-stack developers (2–5 years’ experience) eager to upskill and migrate into Salesforce technology. You'll leverage your strong background in Java, .NET, Python, PHP, Node, Angular, Zoho, low-code platforms, etc., and rapidly train to become a valuable Salesforce professional.

Key Responsibilities
- Work on both front-end and back-end architectures using full-stack tools (e.g., Java/.NET/Python/Node/Angular/PHP).
- Collaborate closely with experienced Salesforce mentors to learn development on the Salesforce platform (Apex, Lightning, Visualforce).
- Participate in design, scripting, and deployment for Salesforce-based solutions under senior guidance.
- Assist in configuring flows, custom objects, integrations, and automations that align with business needs.
- Support data migration and integration projects between legacy systems and Salesforce.
- Learn and apply Salesforce development best practices, security standards, and design patterns.
- Engage in Agile ceremonies: daily stand-ups, sprint planning, code reviews, retrospectives.
- Prepare documentation, user guides, and release notes for Salesforce components.

Required Qualifications
- 2–5 years of professional experience as a full-stack developer.
- Proficiency in at least one server-side language (Java, .NET, PHP, Python, Node) with web frameworks.
- Strong front-end skills: HTML, CSS, JavaScript (Angular, React, Vue, etc.).
- Experience with databases: SQL or NoSQL.
- Familiarity with REST/SOAP APIs, version control (Git), and basic DevOps practices.
- Excellent problem-solving, analytical thinking, and communication skills.
- Eagerness to learn Salesforce: develop using Apex, Lightning Web Components, Visualforce, etc.
Posted 2 weeks ago
15.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Position: Snowflake Lead
Experience Required: 15+ years
Location: Bengaluru, India
Client: NYC & London-based financial services firm

Key Responsibilities:
- Develop and optimize Snowflake objects (databases, schemas, tables, views, procedures).
- Work with Snowflake features: SnowSQL, Snowpipe, Streams, Tasks, Time Travel, etc.
- Migrate data from Azure Cloud to Snowflake with performance optimization.
- Integrate ETL tools (Coalesce and Fivetran) for MS SQL data extraction and transformation.
- Write complex SQL queries, procedures, and functions using T-SQL.
- Implement masking, network policies, and role-based access in Snowflake.
- Collaborate with cross-functional teams in an agile environment.

Requirements:
- 15+ years in IT, with 5+ years in Snowflake development.
- Strong experience with Snowpipe, Azure integration, and ETL processes.
- Hands-on with SQL Server, complex queries, and performance tuning.
- MBA in Finance is a must.
- Stable career track with no significant education or work gaps.
- Exposure to hedge funds, private debt, or private equity domains.
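Snowflake enforces the masking mentioned above server-side, via masking policies attached to columns and evaluated against the session's role. As a conceptual Python sketch of the same role-based idea (the role names and masking rule here are assumptions, not Snowflake syntax):

```python
# Conceptual sketch of role-based column masking, mirroring what a
# Snowflake masking policy does server-side. Role names and the
# masking rule are illustrative assumptions.
def mask_email(value, role):
    """Reveal full value to privileged roles; mask it for everyone else."""
    if role in {"SECURITYADMIN", "PII_READER"}:
        return value
    local, _, domain = value.partition("@")
    return "***@" + domain if domain else "***"

print(mask_email("analyst@fund.com", "PII_READER"))  # analyst@fund.com
print(mask_email("analyst@fund.com", "PUBLIC"))      # ***@fund.com
```

In Snowflake itself this logic would live in a `CREATE MASKING POLICY ... CASE WHEN CURRENT_ROLE() IN (...) THEN val ELSE ... END` statement, so the masking applies uniformly to every query path.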
Posted 2 weeks ago
0.0 - 3.0 years
0 - 0 Lacs
Thaltej, Ahmedabad, Gujarat
Remote
Job Title: Shopify Developer (3–5 Years Experience)
Location: [On-site]
Employment Type: Full-Time
Experience Required: 3 to 5 Years
Industry: E-commerce / IT Services
Address: SHILP AARON, 705, Sindhu Bhavan Marg, Opposite Armieda Complex, Bodakdev, Ahmedabad, Gujarat 380059

About the Company:
Enstacked Technologies is a leading digital solutions company specializing in e-commerce development, UI/UX design, and digital marketing. We partner with global clients to create high-performing online stores and scalable Shopify solutions.

Job Summary:
We are looking for a talented Shopify Developer with 3 to 5 years of professional experience in developing and customizing Shopify stores. The ideal candidate should have a solid understanding of Shopify's ecosystem, including custom theme development, third-party integrations, and performance optimization.

Key Responsibilities:
- Design, develop, and maintain responsive Shopify themes from scratch or by customizing existing ones.
- Implement mobile-first design and ensure cross-browser compatibility.
- Use Shopify's Liquid templating language to build dynamic storefront experiences.
- Integrate third-party APIs, Shopify apps, and custom apps as needed.
- Customize and optimize checkout processes, cart functionality, and product displays.
- Collaborate with designers and project managers to translate design mockups into working Shopify code.
- Conduct site audits and implement performance improvements and SEO enhancements.
- Manage version control using Git and follow best practices for code quality and deployment.
- Migrate sites to Shopify from other platforms (e.g., WooCommerce, Magento, BigCommerce).
- Stay up to date with the latest Shopify features, tools, and trends.

Required Skills & Qualifications:
- 3 to 5 years of hands-on experience with Shopify development.
- Proficiency in Liquid, HTML5, CSS3, JavaScript, and jQuery.
- Experience with Shopify CLI, Shopify APIs, and app integration.
- Familiarity with design tools such as Figma, Adobe XD, or Sketch.
- Solid understanding of responsive design, web accessibility, and SEO fundamentals.
- Basic knowledge of Git, version control, and deployment processes.
- Understanding of third-party integrations like payment gateways, shipping, and CRM tools.
- Experience with Shopify Plus, metafields, custom schema, and section-based architecture is a plus.

Preferred Skills (Good to Have):
- Experience developing custom Shopify apps using Node.js or PHP.
- Knowledge of headless Shopify or the Hydrogen framework.
- Exposure to tools like Klaviyo, Google Tag Manager, and Google Analytics.
- Shopify certification is an added advantage.

Work Environment & Benefits:
- Competitive salary based on experience.
- Flexible working hours; remote work available in certain situations.
- Supportive and collaborative work culture.
- Professional development and upskilling opportunities.

Job Type: Full-time
Pay: ₹30,000.00 - ₹50,000.00 per month
Application Question(s):
- What is your current CTC per year?
- What is your expected CTC per year?
- What is your notice period in days?
Experience: Shopify Developer: 3 years (Required)
Work Location: In person
Posted 2 weeks ago
7.5 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Agile Project Management
Good-to-have skills: Apache Spark
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in ensuring that data is accessible, reliable, and ready for analysis, contributing to informed decision-making across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering practices.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: proficiency in Agile Project Management.
- Good-to-have skills: experience with Apache Spark, Google Cloud SQL, Python (programming language).
- Strong understanding of data pipeline architecture and design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with data quality frameworks and best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Agile Project Management.
- This position is based in Chennai (mandatory).
- 15 years of full-time education is required.
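The ETL duties described in this role can be sketched minimally with Python's standard library. The CSV layout and the `orders` table are illustrative assumptions, not a real client schema:

```python
# Minimal extract-transform-load sketch using only the standard library.
# Source CSV layout and target table name are illustrative assumptions.
import csv
import io
import sqlite3

def etl(csv_text, conn):
    """Extract rows from CSV text, normalize them, and load into SQLite."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT, amount REAL)")
    rows = csv.DictReader(io.StringIO(csv_text))           # extract
    cleaned = [(r["id"].strip(), float(r["amount"]))       # transform
               for r in rows if r["amount"]]               # drop blank amounts
    conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)  # load
    return len(cleaned)

conn = sqlite3.connect(":memory:")
source = "id,amount\n A1 ,10.5\nA2,\nA3,4.5\n"
loaded = etl(source, conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(loaded, total)  # 2 15.0
```

Production pipelines swap each stage for heavier tooling (Spark for transform, a warehouse for load) while keeping this same three-stage shape, plus the data-quality checks the role description calls out.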
Posted 2 weeks ago
3.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Must-have skills: Google BigQuery

Job Requirements
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (no flex)
1. Proven track record of delivering data integration and data warehousing solutions.
2. Strong SQL and hands-on experience (no flex); experience with data integration and migration projects.
3. Proficient in the BigQuery SQL language (no flex).
4. Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer, and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications.
5. Experience in shell scripting, Python (no flex), Oracle, and SQL.

Technical Experience:
1. Expert in Python (no flex). Strong hands-on skills and strong knowledge of SQL (no flex) and Python programming using Pandas and NumPy; deep understanding of data structures such as dictionaries, arrays, lists, and trees; experience with pytest and code-coverage skills preferred.
2. Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (no flex).
3. Proficiency with tools to automate AzDO CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence.
4. Open mindset and the ability to quickly adopt new technologies.
5. Performance tuning of BigQuery SQL scripts.
6. GCP certification preferred.
7. Experience working in an agile environment.

Professional Attributes:
1. Good communication skills.
2. Ability to collaborate with different teams and suggest solutions.
3. Ability to work independently with little supervision, or as part of a team.
4. Good analytical and problem-solving skills.
5. Good team-handling skills.

Educational Qualification: 15 years of full-time education
Additional Information: Candidate should be ready for Shift B and work as an individual contributor.
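The Pub/Sub and Kafka streaming items above share one underlying pattern: producers publish to named topics, and every subscriber of a topic receives each message. A minimal in-memory Python sketch of that pattern (the topic name and synchronous delivery are simplifying assumptions; real services deliver asynchronously with acknowledgements):

```python
# Minimal in-memory publish/subscribe sketch illustrating the pattern
# behind services like Cloud Pub/Sub or Kafka. Topic names and the
# synchronous delivery model here are simplifying assumptions.
from collections import defaultdict

class PubSub:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver message to every subscriber of the topic."""
        for callback in self._subscribers[topic]:
            callback(message)
        return len(self._subscribers[topic])

bus = PubSub()
received = []
bus.subscribe("orders", received.append)
bus.subscribe("orders", lambda m: received.append(m.upper()))
delivered = bus.publish("orders", "order-42 created")
print(delivered, received)  # 2 ['order-42 created', 'ORDER-42 CREATED']
```

Managed services add the parts this sketch omits: durable queues per subscription, at-least-once delivery, and backpressure, which is why pipelines pair them with idempotent consumers.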
Posted 2 weeks ago
10.0 - 12.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
About the role: We are seeking an experienced and innovative Director-IT Infra to lead our IT Infrastructure and IT Security teams. The ideal candidate will drive the management and strategic oversight of on-premises datacenter assets, end-user systems, and cloud SaaS/PaaS/IaaS services, with a focus on Microsoft and open-source technologies, while leading initiatives to transition workloads from the current on-premises setup to a hybrid cloud ecosystem.

Leadership and Management: Work closely with the CTO to define a strategic direction for the organization's IT ecosystem and align it with business objectives, digital transformation initiatives, and “Right-Fit” technology. Formulate, strategize, and implement IT and InfoSec policies, aligning them with industry standards, best practices/guidelines, and organizational goals. Manage vendor/service-provider relationships and run periodic cost optimization through vendor/tool consolidation and timely AMC negotiations/renewals. Develop and implement change-management processes to ensure smooth transition and adoption of new technologies. Communicate changes to all stakeholders and provide necessary support during the change. Lead the IT Infra and IT InfoSec teams, fostering an innovation-driven, collaborative, ever-learning, high-performance team environment.

IT Infrastructure Management: Lead initiatives to migrate on-premises workloads to Microsoft Azure and integrate open-source tools like Docker and Kubernetes. Develop and execute strategies for the migration of data and applications to cloud-based infrastructure. Manage on-premises servers running Microsoft Windows Server and Ubuntu Linux on virtualization platforms like VMware ESXi and Linux KVM. Ensure the reliability, availability, performance, security, and high uptime of all IT assets, including hardware such as Dell servers, HPE servers, SAN data storage, WAN/LAN devices, and EPABX systems.
Develop and implement maintenance schedules using tools like Microsoft System Center. Set up, manage, and monitor the organization's datacenter operations. Oversee network architecture, connectivity uptime, and network performance using Cisco routers, switches, and other communication devices. Set up IT infrastructure monitoring tools to identify and resolve infrastructure problems before they can adversely affect critical business processes. Provide the management team with insight into the status of physical, virtual, and cloud systems, and help ensure availability and performance.

Security and Compliance: Implement and manage security measures, including next-generation firewalls, IDS/IPS, VPNs, next-generation endpoint security, DLP, IRM/EDRM, web proxies, etc. Conduct regular security assessments at the server and network levels using tools like Nessus and Nmap to assess security implementation and mitigate vulnerabilities. Ensure compliance with security policies and procedures using SIEM solutions like Splunk, and ensure zero data theft and data leakage. Monitor and respond to security incidents with solutions like Microsoft Defender for Cloud and open-source tools such as Wazuh and OSSEC. Ensure compliance with industry regulations and standards, maintaining certifications such as ISO 9001, ISO 27001, and PCI DSS. Implement disaster recovery and business continuity plans based on best practices and industry standards, using solutions such as Commvault, Borg, and Veeam.

Innovation and Improvement: Identify opportunities for technological improvement and innovation with a focus on Microsoft/open-source solutions, and build blueprints to transition away from older technology, leading to reduced TCO and an enhanced systems experience. Promote the adoption of emerging technologies and open-source tools to enhance business/IT operations. Set up key IT processes and capture data touchpoints to evaluate IT team performance and OKRs.
Build a culture of continuous improvement and service excellence. Provide leadership to drive infrastructure and network security maturity improvements across the organization, in line with the changing threat landscape and regulatory and compliance requirements. Rewire the current processes, practices, and disciplines for IT Service Management using ITIL principles, aligning IT services with the needs of the business.

Experience: At least 10-12 years of relevant experience in IT infrastructure management and information security. Must have proven experience leading and managing complex hybrid IT teams. Must have proven experience leading initiatives to transition workloads from on-premises to a hybrid cloud ecosystem. Must have proven experience implementing and managing IT security, business continuity plans, disaster recovery frameworks, and security audits. Must have technical proficiency and hands-on experience with Microsoft technologies (e.g., Windows Server, Azure services, Microsoft 365, SharePoint) and open-source technologies (e.g., Ubuntu Linux, KVM, Docker, Kubernetes). Experience managing datacenter operations, network systems, and virtualization environments. Experience with IT process optimization and implementing change-management processes. Relevant industry certifications such as CISSP, CISM, Azure Solutions Architect Expert, Red Hat Certified, or Cisco Certified Network Professional will be an added advantage. Experience working in a large publishing company, a management consulting company, or a Tier 1 startup will be an added advantage.
Posted 2 weeks ago
5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Job Requirements
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX)
1. Proven track record of delivering data integration and data warehousing solutions
2. Strong hands-on SQL skills (No FLEX)
3. Experience with data integration and migration projects
4. Proficiency in the BigQuery SQL language (No FLEX)
5. Understanding of cloud-native services (bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes); experience with cloud solutions, mainly data platform services; GCP certifications
6. Experience in shell scripting, Python (No FLEX), Oracle, and SQL

Technical Experience
1. Expert in Python (No FLEX): strong hands-on knowledge of SQL (No FLEX) and of Python programming with Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage preferred
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (No FLEX)
3. Proficiency with tools that automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence
4. Open mindset and the ability to quickly adapt to new technologies
5. Performance tuning of BigQuery SQL scripts
6. GCP certification preferred
7. Experience working in an agile environment

Professional Attributes
1. Good communication skills
2. Ability to collaborate with different teams and suggest solutions
3. Ability to work independently with little supervision or as part of a team
4. Good analytical and problem-solving skills
5. Good team-handling skills

Educational Qualification: 15 years of full-time education
Additional Information: The candidate should be ready for Shift B and to work as an individual contributor.
Posted 2 weeks ago
14.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Overview

Working at Atlassian: Atlassians can choose where they work – whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, a part of being a distributed-first company.

Responsibilities

About the Team: Cloud Migrations
The main responsibility is to help customers successfully migrate to Cloud.
This role is based out of India (remote) and will look after the mid-market segment and Enterprise customers (1000+ users) from the EMEA region, supporting higher volume.
Work hours are between 12pm/1pm and 9pm/10pm, dependent on IST changes (EMEA shift).
Migrating all Atlassian products to cloud.
The following article helps in understanding Atlassian’s investment in the cloud space: Delivering the best cloud experience for all teams - Work Life by Atlassian

Skills and Attributes (must have):
Customer success background, focusing on customer centricity
Project management
Good written and verbal communication and presentation skills
Stakeholder management, mainly with technical audiences
Customer issue/escalation handling
Technical aspect: migration experience – cloud to cloud or server to cloud
Between 9 and 13/14 years of experience
Good technical acumen

Good-to-Have Skills
SMB/Enterprise customer-handling background
Atlassian product knowledge
Experience directly supporting customers via call and/or email

Qualifications

On the first day, we'll expect you to have: 8 to 12 years of experience in a strategic customer-facing role within customer support, customer success, a migrations-specific department, or another meaningful function. Project management experience, a natural propensity for public speaking, and experience leading C-level conversations.
Ask compelling, leading questions, uncover common customer themes, and lead technical and project planning; build presentations, write content, and present to large audiences. Experience navigating a SaaS working environment with DevOps or IT teams and completing large projects. Familiarity with translating customers' technical requirements into meaningful asks for the product and engineering teams. Broad experience working with Enterprise-level customers, comfort navigating a large organisation, and confirmed past involvement in building relationships internally. Empathy for customer anxiety and experience helping customers deal with change management within their organisation; strategic account management skills, and the ability to improve processes and tools to reduce inefficiencies and scale the programme. Experience addressing potential customer challenges before they become full-blown issues, and the ability to partner with other teams to resolve them and communicate needed information back to the customer.

Benefits & Perks

Atlassian offers a wide range of perks and benefits designed to support you and your family and to help you engage with your local community. Our offerings include health and wellbeing resources, paid volunteer days, and so much more. To learn more, visit go.atlassian.com/perksandbenefits.

About Atlassian

At Atlassian, we're motivated by a common goal: to unleash the potential of every team. Our software products help teams all over the planet, and our solutions are designed for all types of work. Team collaboration through our tools makes what may be impossible alone, possible together. We believe that the unique contributions of all Atlassians create our success. To ensure that our products and culture continue to incorporate everyone's perspectives and experience, we never discriminate based on race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status.
All your information will be kept confidential according to EEO guidelines. To provide you with the best experience, we can support you with accommodations or adjustments at any stage of the recruitment process; simply inform our Recruitment team during your conversation with them. To learn more about our culture and hiring process, visit go.atlassian.com/crh.
Posted 2 weeks ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. Those in cloud operations at PwC will focus on managing and optimising cloud infrastructure and services to enable seamless operations and high availability for clients. You will be responsible for monitoring, troubleshooting, and implementing industry-leading practices for cloud-based systems. You are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt, take ownership, and consistently deliver quality work that drives value for our clients and success as a team. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.
Azure Operation Analyst (Associate/Senior Associate)

Demonstrates thorough abilities and/or a proven record of success as a team leader:
Managing and supporting the Dev-to-Production cloud PaaS and platform to establish quality, performance, and availability of hosted services;
Providing guidance and support for cloud technology practitioners (the application development team);
Providing implementation and run-and-maintain services;
Working on high-volume, mission-critical systems;
Providing on-call support for production cloud environments;
Working hands-on with customers to develop, migrate, and debug service issues;
Providing updated server/process documentation and, as appropriate, creating documentation where none may exist;
Focusing on rapid identification and resolution of customer issues;
Answering questions and performing initial triage on problem reports;
Providing first/second-level cloud environment support;
Working very closely with application users to troubleshoot and resolve cloud-hosted application or system issues;
Informing Technical Support Management about any escalations or difficult situations that require their involvement;
Providing cloud customers with an industry-leading customer experience when engaging Technical Support;
Assisting in Tier 2 and 3 triage, troubleshooting, remediation, and escalation of tickets tied to the product support function;
Training and supporting junior team members in resolving product support tickets;
Proactively identifying ways to optimize the product support function;
Coordinating to establish and manage clear escalation guidelines for supported system components;
Running database queries to look up and resolve issues;
Demonstrating proven communication and collaboration skills to coordinate with developers and the application team to negotiate and schedule patching windows;
Demonstrating experience in managing monthly Windows or Linux environment patching.
Must-Have Qualifications
Hands-on experience with Azure Web Apps, App Insights, App Service Plans, App Gateway, API Management, Azure Monitor, KQL queries, and other troubleshooting skills for all Azure PaaS and IaaS services.
Proven verbal and written communication skills, which will be key in driving customer communication during critical events.
Demonstrated proficiency in at least one of the following technology domains: networking principles, system administration, DevOps, configuration management, and continuous integration technologies (Chef, Puppet, Docker, Jenkins).
Proven understanding of the ITIL framework.

Good-to-Have Qualifications
Interest in information security and a desire to learn techniques and technologies such as application security, cryptography, threat modeling, and penetration testing.
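The Azure Monitor/KQL troubleshooting work mentioned above typically reduces to aggregations such as a per-service failure rate (in KQL, a `summarize` over request logs). A minimal sketch of the same computation in pure Python, with hypothetical log records and an assumed 5% alert threshold:

```python
from collections import defaultdict

# Sketch of the kind of aggregation an operations analyst would run in
# Azure Monitor / KQL, recreated over hypothetical request records.

def failure_rates(requests):
    """Per-service failure rate from records like {"service": s, "status": 500}."""
    totals, failures = defaultdict(int), defaultdict(int)
    for r in requests:
        totals[r["service"]] += 1
        if r["status"] >= 500:  # count 5xx responses as failures
            failures[r["service"]] += 1
    return {s: failures[s] / totals[s] for s in totals}

def unhealthy(requests, threshold=0.05):
    """Services whose failure rate exceeds the alert threshold."""
    return sorted(s for s, rate in failure_rates(requests).items() if rate > threshold)

logs = [
    {"service": "api-gateway", "status": 200},
    {"service": "api-gateway", "status": 502},
    {"service": "billing", "status": 200},
]
```

In production the same logic would be a scheduled KQL alert rule rather than ad-hoc Python, but the triage question ("which services are failing above threshold?") is identical.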
Posted 2 weeks ago
3.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Job Requirements
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX)
1. Proven track record of delivering data integration and data warehousing solutions
2. Strong hands-on SQL skills (No FLEX)
3. Experience with data integration and migration projects
4. Proficiency in the BigQuery SQL language (No FLEX)
5. Understanding of cloud-native services (bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes); experience with cloud solutions, mainly data platform services; GCP certifications
6. Experience in shell scripting, Python (No FLEX), Oracle, and SQL

Technical Experience
1. Expert in Python (No FLEX): strong hands-on knowledge of SQL (No FLEX) and of Python programming with Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage preferred
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (No FLEX)
3. Proficiency with tools that automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence
4. Open mindset and the ability to quickly adapt to new technologies
5. Performance tuning of BigQuery SQL scripts
6. GCP certification preferred
7. Experience working in an agile environment

Professional Attributes
1. Good communication skills
2. Ability to collaborate with different teams and suggest solutions
3. Ability to work independently with little supervision or as part of a team
4. Good analytical and problem-solving skills
5. Good team-handling skills

Educational Qualification: 15 years of full-time education
Additional Information: The candidate should be ready for Shift B and to work as an individual contributor.
Posted 2 weeks ago
3.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Google Cloud Data Services, Microsoft SQL Server
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Job Requirements
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX)
1. Proven track record of delivering data integration and data warehousing solutions
2. Strong hands-on SQL skills (No FLEX)
3. Experience with data integration and migration projects
4. Proficiency in the BigQuery SQL language (No FLEX)
5. Understanding of cloud-native services (bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes); experience with cloud solutions, mainly data platform services; GCP certifications
6. Experience in shell scripting, Python (No FLEX), Oracle, and SQL

Technical Experience
1. Expert in Python (No FLEX): strong hands-on knowledge of SQL (No FLEX) and of Python programming with Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage preferred
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (No FLEX)
3. Proficiency with tools that automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence
4. Open mindset and the ability to quickly adapt to new technologies
5. Performance tuning of BigQuery SQL scripts
6. GCP certification preferred
7. Experience working in an agile environment

Professional Attributes
1. Good communication skills
2. Ability to collaborate with different teams and suggest solutions
3. Ability to work independently with little supervision or as part of a team
4. Good analytical and problem-solving skills
5. Good team-handling skills

Educational Qualification: 15 years of full-time education
Additional Information: The candidate should be ready for Shift B and to work as an individual contributor.
Posted 2 weeks ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Google Cloud Data Services, Microsoft SQL Server
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Job Requirements
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX)
1. Proven track record of delivering data integration and data warehousing solutions
2. Strong hands-on SQL skills (No FLEX)
3. Experience with data integration and migration projects
4. Proficiency in the BigQuery SQL language (No FLEX)
5. Understanding of cloud-native services (bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes); experience with cloud solutions, mainly data platform services; GCP certifications
6. Experience in shell scripting, Python (No FLEX), Oracle, and SQL

Technical Experience
1. Expert in Python (No FLEX): strong hands-on knowledge of SQL (No FLEX) and of Python programming with Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage preferred
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (No FLEX)
3. Proficiency with tools that automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence
4. Open mindset and the ability to quickly adapt to new technologies
5. Performance tuning of BigQuery SQL scripts
6. GCP certification preferred
7. Experience working in an agile environment

Professional Attributes
1. Good communication skills
2. Ability to collaborate with different teams and suggest solutions
3. Ability to work independently with little supervision or as part of a team
4. Good analytical and problem-solving skills
5. Good team-handling skills

Educational Qualification: 15 years of full-time education
Additional Information: The candidate should be ready for Shift B and to work as an individual contributor.
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive change in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire MS SQL professionals in the following areas:

Experience: 5-7 years

Job Summary: We are looking for a skilled SQL Server/Snowflake Developer to join our data and analytics team. The ideal candidate will have strong experience in developing and maintaining data solutions using SQL Server and Snowflake. You will play a key role in building scalable data pipelines, designing data models, and delivering business intelligence solutions.

Key Responsibilities
Develop and optimize complex SQL queries, stored procedures, and ETL processes in SQL Server.
Design and implement data pipelines and models in Snowflake.
Build and maintain SSIS packages for ETL workflows.
Migrate and integrate data between on-premises SQL Server and the Snowflake cloud platform.
Collaborate with business analysts and stakeholders to understand reporting needs.
Ensure data quality, performance tuning, and error handling across all solutions.
Maintain technical documentation and support data governance initiatives.

Required Skills & Qualifications
5-7 years of experience with SQL Server (T-SQL).
2+ years of hands-on experience with Snowflake.
Strong understanding of ETL/ELT processes and data warehousing principles.
Experience with data modeling, performance tuning, and data integration.
Familiarity with the Azure cloud platform is a plus.
Good communication and problem-solving skills.
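The ETL and migration work described above frequently involves idempotent upserts, so that re-running a load does not duplicate rows. A minimal sketch of the pattern using Python's standard-library sqlite3 as a stand-in (the table and column names are hypothetical); in T-SQL or Snowflake the same step would typically be a MERGE statement:

```python
import sqlite3

# Idempotent upsert into a dimension table: re-running the load
# updates existing rows instead of duplicating them.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        segment     TEXT
    )
""")

def upsert_customers(conn, rows):
    # ON CONFLICT (SQLite 3.24+) emulates a MERGE for this sketch.
    conn.executemany(
        """
        INSERT INTO dim_customer (customer_id, name, segment)
        VALUES (?, ?, ?)
        ON CONFLICT(customer_id) DO UPDATE SET
            name = excluded.name,
            segment = excluded.segment
        """,
        rows,
    )
    conn.commit()

upsert_customers(conn, [(1, "Acme", "SMB"), (2, "Globex", "Enterprise")])
upsert_customers(conn, [(2, "Globex Corp", "Enterprise")])  # updates, no duplicate
```

Making each load step idempotent like this is what allows a failed nightly pipeline to be safely re-run from the top.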
Preferred / Good-to-Have Skills
Experience with Azure Data Factory (ADF) for orchestrating data workflows.
Experience with Power BI or other visualization tools.
Exposure to CI/CD pipelines and DevOps practices in data environments.

Required Technical/Functional Competencies
Domain/Industry Knowledge: Basic knowledge of the customer's business processes and the relevant technology platform or product. Able to prepare process maps, workflows, business cases, and simple business models in line with customer requirements with assistance from an SME, and to apply industry standards/practices in implementation with guidance from experienced team members.
Requirement Gathering and Analysis: Working knowledge of requirement management and requirement analysis processes, tools, and methodologies. Able to analyse the impact of a requested change/enhancement/defect fix and identify dependencies or interrelationships among requirements and transition requirements for the engagement.
Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs on design and architecture adhering to industry standards/practices in implementation; analyze various frameworks/tools, review code, and provide feedback on improvement opportunities.
Architecture Tools and Frameworks: Working knowledge of industry architecture tools and frameworks. Able to identify the pros and cons of available tools and frameworks in the market, use them as per customer requirements, and explore new tools/frameworks for implementation.
Architecture Concepts and Principles: Working knowledge of architectural elements, the SDLC, and methodologies. Able to provide architectural design/documentation at an application or function-capability level, implement architectural patterns in solutions and engagements, and communicate architecture direction to the business.
Analytics Solution Design: Knowledge of statistical and machine learning techniques such as classification, linear regression modelling, clustering, and decision trees. Able to identify the cause of errors and their potential solutions.
Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial and open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies
Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
Collaboration: Shares information within the team, participates in team activities, asks questions to understand other points of view.
Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.
Resolves Conflict: Displays sensitivity in interactions and strives to understand others’ views and concerns.

Certifications: Mandatory

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.
Our Hyperlearning workplace is grounded upon four principles:
Flexible work arrangements, free spirit, and emotional positivity
Agile self-determination, trust, transparency, and open collaboration
All support needed for the realization of business goals
Stable employment with a great atmosphere and an ethical corporate culture
Posted 2 weeks ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.
Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data.
Use analytical tools and techniques to develop business insights and improve decision-making.
Must-have Skills: Google BigQuery (SSI / Non-SSI). Good-to-have Skills: SSI: No Technology Specialization; Non-SSI: (none specified)
Job Requirements:
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka Streaming, Looker, SQL (No FLEX)
1: Proven track record of delivering data integration and data warehousing solutions
2: Strong SQL and hands-on experience (No FLEX); experience with data integration and migration projects
3: Proficient in the BigQuery SQL language (No FLEX)
4: Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer, and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications
5: Experience in shell scripting, Python (No FLEX), Oracle, and SQL
Technical Experience:
1: Expert in Python (No FLEX). Strong hands-on skills and strong knowledge of SQL (No FLEX) and Python programming using Pandas and NumPy; deep understanding of data structures such as dictionaries, arrays, lists, and trees; experience with pytest and code-coverage skills preferred
2: Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (No FLEX)
3: Proficiency with tools to automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence
4: Open mindset and the ability to quickly adopt new technologies
5: Performance tuning of BigQuery SQL scripts
6: GCP certification preferred
7: Experience working in an agile environment
Professional Attributes:
1: Good communication skills
2: Ability to collaborate with different teams and suggest solutions
3: Ability to work independently with little supervision, or as part of a team
4: Good analytical problem-solving skills
5: Good team-handling skills
Educational Qualification: 15 years of full-time education
Additional Information: The candidate should be ready for Shift B and work as an individual contributor
Posted 2 weeks ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Role: Cloud Database Engineer IV
Skills: MS SQL + Cloud + any secondary DB
Shift: Should be comfortable with rotational 24x7 shifts
Experience: At least 12 years of relevant experience
Job Profile Summary
The Cloud Database Engineer performs database engineering and administration activities including design, planning, configuration, monitoring, automation, self-serviceability, alerting, space management, and database backup and recovery. Plans computerized databases, including base definition, structure, documentation, long-range requirements, operational guidelines, and protection, with the capacity to lead and advise on migration and modernization, and to discover and execute workload migrations to the cloud (AWS/Azure/GCP).
Key Responsibilities
Create, maintain, and use Standard Operating Procedures (SOPs) for migration execution, and ensure the long-term technical viability and optimization of production deployments and administration
Engage, consult, and deliver based on interactive customer communications, streamlining project deliverables and scope of work
Capacity Planning: Forecast future database growth based on usage trends and plan hardware and storage requirements accordingly to ensure scalability and optimal performance
Plan, create, manage, and deploy effective high availability and disaster recovery strategies/runbooks
Patch Management and Upgrades: Plan and execute database software upgrades, patches, and service packs
Troubleshooting and Issue Resolution: Investigate and resolve complex database-related issues, including data corruption, performance problems, and connectivity challenges
Automation and Scripting: Contribute to automation scripts and tools to streamline repetitive tasks, improve efficiency, and reduce the risk of human error
Monitoring and Alerting: Set up monitoring and alerting systems to proactively identify and address potential database issues before they become critical
Performance Analysis and Reporting: Generate performance reports and analysis for stakeholders and management to provide insights into the health and performance of the database environment
Documentation: Maintain up-to-date documentation of database configurations, procedures, and troubleshooting steps
Ticket Handling: Work to resolve incidents, changes, and service requests under the agreed client SLA
Problem Management: Responsible for resolving problem tickets by creating detailed RCA reports
Participate in 24x7 production support for database operations
Hands-on use of cloud migration tools such as AWS DMS, SMS, Application Migration Service, Migration Hub, Azure Migrate, Data Migration Service, SQL Server DMA, Azure ASR, and AWS DRS
Migration from SQL Server to/from other RDBMS platforms for PaaS models such as AWS Aurora, AWS RDS, Azure Database, Azure MI, and GCP Cloud SQL
Understanding of cloud basics and duties such as security management, storage management, backup vaults, key vaults, and server/DB monitoring
Cost Optimization: Compute and workload analysis, license enhancements and features
Knowledge and Skills
Proficient in SQL Server architecture, installation and configuration, performance tuning, high availability and disaster recovery (HADR), and monitoring and troubleshooting
Database Migrations and Upgrades: Experience in planning and executing database migrations and upgrades, including version compatibility, testing, and minimizing downtime
Ability to deploy, manage, and troubleshoot HADR configurations in one of the following tech buckets: SQL Server (Always On, FCI, log shipping, replication); MySQL or PostgreSQL (master-slave replication, InnoDB ClusterSet)
Homogeneous and heterogeneous migrations to/from various tech buckets (SQL Server and PostgreSQL or MySQL)
SQL Server in the Cloud: Knowledge of deploying and managing SQL Server on cloud platforms such as Azure SQL Database and Amazon RDS
SQL Server Best Practices: Familiarity with industry best practices for SQL Server administration, including configuration settings, maintenance tasks, and disaster recovery strategies
Ability to communicate technical information and ideas so others will understand
Ability to apply varying leadership skills and traits to create solutions and results in unexpected situations
About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world's leading technologies, across applications, data, and security, to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work year after year by Fortune, Forbes, and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers, and deliver the future.
More on Rackspace Technology
Though we're all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.
Posted 2 weeks ago
7.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company: IT Services Organization
Key Skills: Java, OpenShift, Kafka, Kubernetes
Roles & Responsibilities: Migrate applications from local server environments to GCP-based cloud architecture. Collaborate with compliance teams to ensure designs meet regulatory requirements. Monitor and tune the performance of cloud environments. Share knowledge of alternative cloud environments and services. Identify and propose new work opportunities. Document and recommend best practices while articulating process improvements. Develop automation solutions, including data-transfer automation using technologies such as SFTP. Manage day-to-day processes, documentation, KPIs, and reporting.
Experience Requirement: 7-10 years of hands-on experience in Java development, including backend services and APIs. Strong experience with OpenShift, Kubernetes, and Kafka in cloud-native environments. Proven success in cloud migration projects, particularly moving applications to GCP. In-depth understanding of microservices architecture and container orchestration tools. Experience collaborating with cross-functional teams and ensuring regulatory compliance in design. Demonstrated ability to build automation tools and optimize cloud-based performance.
Education: B.Tech.
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job description
Job Name: Senior Data Engineer, DBT & Snowflake
Years of Experience: 5
Job Description: We are looking for a skilled and experienced DBT/Snowflake developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!
Primary Skills: DBT, Snowflake
Secondary Skills: ADF, Databricks, Python, Airflow, Fivetran, Glue
Role Description: This data engineering role requires creating and managing the technological infrastructure of a data platform; being in charge of / involved in architecting, building, and managing data flows/pipelines; and constructing data storage (NoSQL, SQL), tools to work with big data (Hadoop, Kafka), and integration tools to connect sources and other databases.
Role Responsibility:
Translate functional specifications and change requests into technical specifications
Translate business requirement documents, functional specifications, and technical specifications into related coding
Develop efficient code with unit testing and code documentation
Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving
Set up the development environment and configure the development tools
Communicate with all project stakeholders on project status
Manage, monitor, and ensure the security and privacy of data to satisfy business needs
Contribute to the automation of modules wherever required
Be proficient in written, verbal, and presentation communication (English)
Coordinate with the UAT team
Role Requirement:
Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
Knowledgeable in Shell/PowerShell scripting
Knowledgeable in relational databases, non-relational databases, data streams, and file stores
Knowledgeable in performance tuning and optimization
Experience in data profiling and data validation
Experience in requirements gathering, documentation processes, and performing unit testing
Understanding and implementing QA and various testing processes in the project
Knowledge of any BI tool is an added advantage
Sound aptitude, outstanding logical reasoning, and analytical skills
Willingness to learn and take initiative
Ability to adapt to a fast-paced Agile environment
Additional Requirement:
• Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Snowflake, ensuring the effective transformation and loading of data from diverse sources into the data warehouse or data lake
• Implement and manage data models in DBT, guaranteeing accurate data transformation and alignment with business needs
• Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting
• Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance
• Establish DBT best practices to improve performance, scalability, and reliability
• Expertise in SQL and a strong understanding of data warehouse concepts and modern data architectures
• Familiarity with cloud-based platforms (e.g., AWS, Azure, GCP)
• Migrate legacy transformation code into modular DBT data models
Posted 2 weeks ago
8.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job description:
Role Purpose
The purpose of the role is to create exceptional and detailed architectural application designs, provide thought leadership, and enable delivery teams to deliver exceptional client engagement and satisfaction.
Do
1. Develop architectural applications for new deals and major change requests in existing deals
a. Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable
b. Manage application assets and direct development efforts within the enterprise to improve solution delivery and agility
c. Guide how to construct and assemble application components and services to support solution architecture and application development
d. Maintain the frameworks and artefacts used in the implementation of an application, with reference to the systematic architecture of the overall application portfolio
e. Take responsibility for application architecture paradigms such as service-oriented architecture (SOA) and, more specifically, microservices, ensuring the business achieves agility and scalability for a faster time to market
f. Provide solutions for RFPs received from clients and ensure overall design assurance; develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives; analyse the technology environment, enterprise specifics, and client requirements to set a collaborative design framework/architecture; create complete RFPs depending on the client's need for particular standards and technology stacks; provide technical leadership in the design, development, and implementation of custom solutions through thoughtful use of modern technology; define and understand current-state solutions and identify improvements, options, and tradeoffs to define target-state solutions; clearly articulate and sell architectural targets, recommendations, and reusable patterns, and accordingly propose investment roadmaps; evaluate and recommend solutions that integrate with the overall technology ecosystem; track industry and application trends and relate these to planning current and future IT needs
g. Provide technical and strategic inputs during the project planning phase in the form of technical architectural designs and recommendations
h. Mine accounts to find opportunities with existing clients
i. Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the enterprise architecture
j. Identify implementation risks and potential impacts
k. Create new revenue streams within applications as APIs that can be leveraged by clients
l. Bring knowledge of automation into applications by embracing Agile and DevOps principles to reduce manual work
2. Understand application requirements and design a standardized application
a. Create intellectual property in the form of services, patterns, models, and organizational approaches
b. Design patterns, best practices, and reusable applications that can be used for future reference
c. Ensure system capabilities are consumed by system components, and set criteria for evaluating technical and business value in terms of Tolerate, Invest, Migrate, and Eliminate
d. Provide a platform so that standardized tools and uniform design techniques are maintained to reduce maintenance costs
e. Coordinate input on risks, costs, and opportunities for concepts
f. Develop customized applications for customers aligned with their needs
g. Perform thorough design and code reviews on a regular basis, keeping security measures in mind
h. Understand design and production procedures and standards to create prototypes and finished products
i. Work closely with systems analysts, software developers, data managers, and other team members to ensure successful production of application software
j. Offer viable solutions for various systems and architectures to different types of businesses
k. Seamlessly integrate new and existing systems to eliminate potential problems, maintain data structure, and bring value in terms of development
l. Transform all applications into digital form, and implement and evolve mesh app and service architectures that support new technologies such as IoT, blockchain, machine learning, automation, and bots
m. Cloud Transformation (Migration): Understand non-functional requirements; produce artefacts such as deployment architecture and interface catalogues; identify internal and external dependencies, vendor and internal IT management; support build and testing teams
n. Cloud Transformation (Modernization): Understand and define the target architecture in the integration space; assess the project pipeline/demand and align it to the target architecture; provide technical support to the delivery team in the form of POCs and technical guidance
o. Keep up to date with the latest technologies in the market
Mandatory Skills: Fullstack Java Enterprise
Experience: 8-10 Years
Reinvent your world. We are building a modern Wipro.
We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 2 weeks ago
0.0 - 8.0 years
0 Lacs
Panchkula, Haryana
On-site
Description
Job Description
We’re looking for an experienced Senior/Lead Marketing Automation Specialist with deep expertise in Marketo to lead strategic automation initiatives, set up or migrate instances, and drive campaign performance at scale. The ideal candidate should be well versed in lead lifecycle strategy, campaign optimization, and marketing data governance. If you’re someone who thrives in a fast-paced environment and can advise both clients and internal stakeholders on best practices, we’d love to hear from you.
Key Skills
5–8 years of hands-on experience with Marketo. Strong experience in Marketo instance setup or migration. Proven ability to define and execute lead lifecycle strategies: scoring, routing, nurturing. Proficient in creating scalable campaign frameworks and reusable global templates. In-depth knowledge of segmentation, personalization, and engagement optimization. Experience in managing email deliverability, A/B testing, and performance analytics. Familiarity with data governance, privacy compliance, and deliverability standards. Strong documentation and process-implementation skills. Ability to lead client calls, discovery sessions, and training workshops. Insight into AI trends and integration opportunities within marketing automation.
Roles and Responsibilities
Lead the setup or migration of Marketo instances from other platforms. Design and implement scalable lead lifecycle frameworks (scoring, routing, nurturing). Build reusable campaign templates and structures for enterprise-wide use. Manage end-to-end strategy and execution of global Marketo campaigns. Monitor and improve campaign performance using A/B testing and analytics. Enhance audience segmentation, personalization, and engagement strategies. Maintain optimal Marketo instance health: folder structure, asset hygiene, field management. Ensure data compliance, deliverability best practices, and privacy standards.
Create and maintain SOPs, documentation, and naming conventions for internal teams. Conduct platform audits, develop automation roadmaps, and suggest enhancements. Guide AI feature adoption within Marketo and integrated tools. Act as a trusted consultant for internal and client stakeholders. Drive enablement sessions, training, and ongoing support to ensure platform success. Contacts Email: careers@grazitti.com Address: HSIIDC Technology Park, Plot No – 19, Sector 22, 134104, Panchkula, Haryana, India
Posted 2 weeks ago