Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

The Role
We are hiring a Senior Data Engineer to help design and scale the infrastructure behind our analytics, performance marketing, and experimentation platforms. This role is ideal for someone who thrives on solving complex data problems, enjoys owning systems end-to-end, and wants to work closely with stakeholders across product, marketing, and analytics. You'll build reliable, scalable pipelines and models that support decision-making and automation at every level of the business.

What you'll do
- Build, maintain, and optimize data pipelines using Spark, Kafka, Airflow, and Python
- Orchestrate workflows across GCP (GCS, BigQuery, Composer) and AWS-based systems
- Model data using dbt, with an emphasis on quality, reuse, and documentation
- Ingest, clean, and normalize data from third-party sources such as Google Ads, Meta, Taboola, Outbrain, and Google Analytics
- Write high-performance SQL and support analytics and reporting teams in self-serve data access
- Monitor and improve data quality, lineage, and governance across critical workflows
- Collaborate with engineers, analysts, and business partners across the US, UK, and India

What You Bring
- 4+ years of data engineering experience, ideally in a global, distributed team
- Strong Python development skills and experience
- Expert in SQL for data transformation, analysis, and debugging
- Deep knowledge of Airflow and orchestration best practices
- Proficient in dbt (data modeling, testing, release workflows)
- Experience with GCP (BigQuery, GCS, Composer); AWS familiarity is a plus
- Strong grasp of data governance, observability, and privacy standards
- Excellent written and verbal communication skills

Nice to have
- Experience working with digital marketing and performance data, including Google Ads, Meta (Facebook), TikTok, Taboola, Outbrain, and Google Analytics (GA4)
- Familiarity with BI tools like Tableau or Looker
- Exposure to attribution models, media mix modeling, or A/B testing infrastructure
- Collaboration experience with data scientists or machine learning workflows

Perks
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leave
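To make the orchestration work described above concrete, here is a minimal sketch of an Airflow DAG that submits a daily PySpark ingestion job. The DAG id, script path, and schedule are illustrative assumptions, not details taken from the posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily pipeline: submit a PySpark job that ingests
# ad-platform exports and loads them into a warehouse table.
default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="ads_ingestion_daily",        # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # Airflow 2.4+ keyword; earlier versions use schedule_interval
    default_args=default_args,
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="spark_ingest",
        # spark-submit path and script are assumptions for illustration
        bash_command="spark-submit /opt/jobs/ingest_ads.py --ds {{ ds }}",
    )
```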
Posted 2 days ago
3.0 years
0 Lacs
New Delhi, Delhi, India
On-site
URGENT HIRING: Social Media & Digital Marketing Specialist
Immediate Start Preferred | Full-Time, Mon-Sat | On-Site in our studio in New Delhi

We're a creative-led studio working across film, photography, social campaigns, and design, and we're looking for someone who thrives on a creative, strategic, and disciplined approach. This role blends creativity with performance: you should be equally confident creating engaging organic content and managing paid marketing across Meta and Google platforms.

What You'll Do:
- Build: plan, lead, and execute organic social media content strategy across Meta, LinkedIn, Pinterest, and YouTube
- Run, manage, and optimize paid campaigns on Meta and Google
- Make awesome content: collaborate across creative, design, and production teams to bring ideas to life and maintain visual and brand consistency
- Participate in diverse creative projects in the studio
- Grow the community and online presence via socials, email, and in person
- Bring a sharp eye for design, words, trends, and tone

Who we're looking for:
- 2-3 years of relevant experience
- Comfortable with multiple social platforms and email marketing tools
- Proven ability to manage both organic and paid social media
- Strong ownership mindset: you're organised, detail-oriented, and self-driven
- Energised to thrive in a fast-moving, multi-disciplinary environment

Why you'll enjoy working here:
- Be part of a passionate, tight-knit team that values creativity, intent, and quality
- Opportunity to work and grow across multiple projects; we work across diverse industries
- Lights, Camera & Action: you'll be part of our photoshoots and video shoots, helping make sure we get the best for advertising/social media (big or small shoots, you'll be in the mix!)
- Work closely, as if in a creative lab, with spark and experimentation

Location: New Delhi (on-site, close to Malviya Nagar Metro)
Compensation: ₹3.6L-₹4.2L annually, based on experience and skill.

To Apply:
→ Fill in this short form: https://forms.gle/iMGF9NFLrBguwLeN7
→ Send your resume, portfolio, and a brief note about why you'd be a great fit to pointblankproductionshiring@gmail.com
Posted 2 days ago
3.0 - 8.0 years
9 - 15 Lacs
Gurugram, Bengaluru
Hybrid
NOTICE PERIOD: Immediate to 15 days (serving). PF mandatory!

MANDATORY SKILLS:
- Snowflake
- Cloud (AWS/GCP)
- Scala
- Python
- Spark

Thanks & Regards,
Karthik Kumar, IT Recruiter
SP Software (P) Limited (An ISO, ISMS & CMMI Level-3 certified company)
An SP Group Enterprise.
Connect on: linkedin.com/in/b-karthik-kumar-116990179
Posted 3 days ago
5.0 - 10.0 years
0 - 2 Lacs
Pune, Chennai
Work from Office
Mandatory Skills:
1. Spark
2. SQL
3. Python

JD:
Must Have:
• Relevant experience of 5-8 years as a Data Engineer
• Preferred experience in related technologies as follows:
• SQL: 2-4 years of experience
• Spark: 1-2 years of experience
• NoSQL databases: 1-2 years of experience
• Database architecture: 2-3 years of experience
• Cloud architecture: 1-2 years of experience
• Experience in a programming language like Python
• Good understanding of ETL (Extract, Transform, Load) concepts
• Good analytical and problem-solving skills
• Inclination for learning and self-motivation
• Knowledge of ticketing tools like JIRA/ServiceNow (SNOW)
• Good communication skills to interact with customers on issues and requirements

Good to Have:
• Knowledge of/experience in Scala.
Posted 3 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Req ID: 327890

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Python Developer - Digital Engineering Sr. Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Python Data Engineer
- Exposure to retrieval-augmented generation (RAG) systems and vector databases
- Strong programming skills in Python (and optionally Scala or Java)
- Hands-on experience with data storage solutions (e.g., Delta Lake, Parquet, S3, BigQuery)
- Experience with data preparation for transformer-based models or LLMs
- Expertise in working with large-scale data frameworks (e.g., Spark, Kafka, Dask)
- Familiarity with MLOps tools (e.g., MLflow, Weights & Biases, SageMaker Pipelines)

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 3 days ago
5.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Job Title: Business Development Manager – IT Sales
Experience: 5+ years
Location: Indore
Industry: IT / Cloud / Data & AI Services

Company Overview:
Eagle in Cloud specializes in providing advanced IT services including AWS cloud solutions, data engineering, artificial intelligence/machine learning, and data science. Our focus is on delivering scalable and intelligent data-driven solutions to businesses across industries.

Job Summary:
We are seeking an experienced and self-driven Business Development Manager to spearhead sales and lead-generation efforts. The ideal candidate will have a strong background in IT services sales and hands-on experience generating leads and closing deals through platforms such as Upwork, LinkedIn Sales Navigator, Clutch, and GoodFirms.

Key Responsibilities:
- Identify and generate high-quality leads through Upwork, LinkedIn, Clutch, GoodFirms, and other B2B channels.
- Promote company services in AWS, data engineering, AI/ML, and data science to potential clients globally.
- Craft and deliver tailored proposals, presentations, and solutions aligned with client needs.
- Manage the end-to-end sales cycle from prospecting to closure.
- Collaborate with technical and pre-sales teams to prepare solution-oriented sales pitches.
- Build and nurture long-term client relationships to generate recurring business.
- Maintain CRM records and provide regular updates on the lead pipeline and sales performance.

Requirements:
- Bachelor's degree in business, IT, or a related field (MBA is a plus).
- 5+ years of experience in IT service sales or business development.
- Proven experience selling cloud-based solutions (preferably AWS), data engineering, and AI/ML services.
- Familiarity with bidding and freelance platforms like Upwork, Freelancer, etc.
- Proficient with lead generation tools, email campaigns, and CRM software (e.g., HubSpot, Zoho, Salesforce).
- Excellent communication, negotiation, and relationship-building skills.
- Ability to understand technical concepts and translate them into business value.

Preferred Skills:
- Strong network in IT or data-driven industries.
- Understanding of the modern tech stack (AWS, Python, Spark, ML frameworks, etc.).
- Experience working with international clients (US, UK, EU markets preferred).
- Self-motivated, goal-oriented, and capable of working independently.
Posted 3 days ago
5.0 years
0 Lacs
Mysore, Karnataka, India
Remote
Enkefalos Technologies LLP believes in creating a supportive and inclusive environment where innovation thrives. Working with us means collaborating with industry experts who are passionate about AI and next-generation tech solutions. At Enkefalos, you'll find opportunities for career growth, continuous learning, and exciting projects that challenge you to push boundaries. If you're ready to embark on a rewarding career in AI and tech, explore our current job opening and become part of a team that's driving change through advanced GenAI solutions. Together, we can shape the future of industries worldwide.

Databricks Engineer - Spark / PySpark
Location: Remote / Mysore
Joining: Immediate
Experience: 5+ years

Responsibilities:
You will implement all cleansing, transformation, and semantic modeling logic on Databricks using PySpark, targeting financial facts and dimensions from SAP manual dumps.

Requirements:
- PySpark (RDDs, DataFrames, performance tuning)
- Building gold-layer data models for financial reporting
- Experience with complex joins, aggregations, and GL hierarchies
- Version handling (Actuals vs Budget) and currency conversions
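As a rough illustration of the gold-layer modeling this posting describes (GL aggregation, Actuals vs Budget versions, currency conversion), here is a minimal PySpark sketch. The table and column names are hypothetical, not taken from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("finance_gold_layer").getOrCreate()

# Hypothetical silver-layer inputs; names and columns are illustrative assumptions.
gl_lines = spark.table("silver.gl_lines")    # GL postings from SAP dumps
fx_rates = spark.table("silver.fx_rates")    # daily currency rates to USD

# Convert local-currency amounts to a common reporting currency,
# then aggregate Actuals vs Budget by account and period.
gold = (
    gl_lines
    .join(fx_rates, on=["currency", "posting_date"], how="left")
    .withColumn("amount_usd", F.col("amount") * F.col("rate_to_usd"))
    .groupBy("gl_account", "fiscal_period", "version")  # version: Actuals / Budget
    .agg(F.sum("amount_usd").alias("total_usd"))
)

# Write the gold-layer fact table in Delta format for reporting.
gold.write.format("delta").mode("overwrite").saveAsTable("gold.fct_gl_summary")
```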
Posted 3 days ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Req ID: 327296

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP Solution Architect to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Job Description:
Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 8+ years
Total experience: 8+ years
Must have GCP Solution Architect Certification & GKE

Mandatory Skills:

Technical Qualification/Knowledge:
- Expertise in assessing, designing and implementing GCP solutions, including compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning and migration execution
- Prior experience using industry-leading or native discovery, assessment and migration tools
- Good knowledge of cloud technology, different patterns, deployment methods, and application compatibility
- Good knowledge of GCP technologies and associated components and variations:
  - Anthos Application Platform
  - Compute Engine, Compute Engine Managed Instance Groups, Kubernetes
  - Cloud Storage, Cloud Storage for Firebase, Persistent Disk, Local SSD, Filestore, Transfer Service
  - Virtual Private Cloud (VPC), Cloud DNS, Cloud Interconnect, Cloud VPN Gateway, Network Load Balancing, global load balancing, firewall rules, Cloud Armor
  - Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS
  - Cloud Billing, Cloud Console, Stackdriver
  - Cloud SQL, Cloud Spanner, Cloud Bigtable
  - Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP
- Solid understanding of and experience in cloud-computing-based services architecture, technical design and implementation, including IaaS, PaaS, and SaaS
- Design of clients' cloud environments, with a focus mainly on GCP, demonstrating technical cloud architectural knowledge
- Play a vital role in the design of production, staging, QA and development cloud infrastructures running in 24x7 environments
- Deliver customer cloud strategies aligned with the customer's business objectives, with a focus on cloud migrations and DR strategies
- Nurture cloud computing expertise internally and externally to drive cloud adoption
- Deep understanding of the IaaS and PaaS services offered on cloud platforms and how to use them together to build complex solutions
- Ensure that all cloud solutions follow security and compliance controls, including data sovereignty
- Deliver cloud platform architecture documents detailing the vision for how GCP infrastructure and platform services support the overall application architecture, interacting with application, database and testing teams to provide a holistic view to the customer
- Collaborate with application architects and DevOps to modernize infrastructure-as-a-service (IaaS) applications to platform-as-a-service (PaaS)
- Create solutions that support a DevOps approach to delivery and operations of services
- Interact with and advise business representatives of the application regarding functional and non-functional requirements
- Create proofs of concept to demonstrate the viability of solutions under consideration
- Develop enterprise-level conceptual solutions and sponsor consensus/approval for global applications
- Working knowledge of other architecture disciplines, including application, database, infrastructure, and enterprise architecture
- Identify and implement best practices, tools and standards
- Provide consultative support to the DevOps team for production incidents
- Drive and support system reliability, availability, scale, and performance activities
- Evangelize cloud automation and be a thought leader and expert defining standards for building and maintaining cloud platforms
- Knowledgeable about configuration management tools such as Chef/Puppet/Ansible
- Automation skills using CLI scripting in any language (bash, perl, python, ruby, etc.)
- Ability to develop a robust design to meet customer business requirements with scalability, availability, performance and cost-effectiveness using GCP offerings
- Ability to identify and gather requirements to define an architectural solution that can be successfully built and operated on GCP
- Ability to produce high-level and low-level designs for the GCP platform, which may also include data center design as necessary
- Ability to provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project
- Understanding of the significance of the different metrics for monitoring, their threshold values, and the corrective measures to take based on those thresholds
- Knowledge of automation to reduce the number of incidents, or at least repetitive incidents, is preferred
- Good knowledge of cloud center operations, monitoring tools, and backup solutions

GKE:
- Set up monitoring and logging to troubleshoot a cluster or debug a containerized application
- Manage Kubernetes objects: declarative and imperative paradigms for interacting with the Kubernetes API
- Manage Secrets: managing confidential settings data using Secrets
- Configure load balancing, port forwarding, or firewall and DNS configurations to access applications in a cluster
- Configure networking for your cluster
- Hands-on experience with Terraform, including the ability to write reusable Terraform modules
- Hands-on Python and Unix shell scripting is required
- Understanding of CI/CD pipelines in a globally distributed environment using Git, Artifactory, Jenkins, and Docker registry
- Experience with GCP services and writing Cloud Functions
- Hands-on experience deploying and managing Kubernetes infrastructure with Terraform Enterprise
- Certified Kubernetes Administrator (CKA) and/or Certified Kubernetes Application Developer (CKAD) is a plus
- Experience using Docker within container orchestration platforms such as GKE
- Knowledge of setting up Splunk
- Knowledge of Spark on GKE

Certification: GCP Solution Architect & GKE

Process/Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired
- Knowledge of quality processes
- Knowledge of security processes

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 3 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are hiring for one of the IT Big 4 consulting firms.

Designation: Associate/Associate Consultant
Location: Chennai/Gurgaon/Pune

Skills Required:
- AWS (big data services): S3, Glue, Athena, EMR
- Programming: Python, Spark, SQL, MuleSoft, Talend, dbt
- Data warehouse: ETL, Redshift/Snowflake

Key Responsibilities:
- Work with business stakeholders to understand their business needs.
- Create data pipelines that extract, transform, and load (ETL) data from various sources into a usable format in a data warehouse.
- Clean, filter, and validate data to ensure it meets quality and format standards.
- Develop data model objects (tables, views) to transform the data into a unified format for downstream consumption.
- Monitor, control, configure, and maintain processes in the cloud data platform.
- Optimize data pipelines and data storage for performance and efficiency.
- Participate in code reviews and provide meaningful feedback to other team members.
- Provide technical support and troubleshoot issues.

Qualifications:
- Bachelor's degree in computer science, information technology, or a related field, or equivalent work experience.
- Experience working in the AWS cloud platform.
- Data engineer with expertise in developing big data and data warehouse platforms.
- Experience working with structured and semi-structured data.
- Expertise in developing big data solutions and ETL/ELT pipelines for data ingestion, data transformation, and optimization.
- Experience working directly with technical and business teams.
- Able to create technical documentation.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.

Skillset (good to have):
- Experience in data modeling.
- AWS certification for data engineering skills.
- Experience with ITSM processes/tools such as ServiceNow and Jira.
- Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow.
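For illustration only, here is a minimal PySpark ETL sketch in the spirit of this listing: read raw data from S3, clean and validate it, and write a partitioned, Athena-friendly curated layer. The bucket names and schema are assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3_etl").getOrCreate()

# Hypothetical raw landing zone -- an illustrative assumption.
raw = spark.read.json("s3://example-raw-bucket/orders/2024/")

# Clean and validate: drop malformed rows, normalize types.
clean = (
    raw
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
)

# Write curated output partitioned by date so Athena/Glue can query it cheaply.
(
    clean
    .withColumn("dt", F.to_date("order_ts"))
    .write.mode("overwrite")
    .partitionBy("dt")
    .parquet("s3://example-curated-bucket/orders/")
)
```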
Posted 3 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Proficiency in AI tools used to prepare and automate data pipelines and ingestion:
- Apache Spark, especially with MLlib
- PySpark and Dask for distributed data processing
- Pandas and NumPy for local data wrangling
- Apache Airflow to schedule and orchestrate ETL/ELT jobs
- Google Cloud (BigQuery, Vertex AI)
- Python (the most popular language for AI and data tasks)
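As a brief, hedged example of the Spark MLlib data preparation this listing mentions, the sketch below indexes a categorical column, assembles numeric features, and scales them. The table and column names are invented for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import StringIndexer, VectorAssembler, StandardScaler

spark = SparkSession.builder.appName("feature_prep").getOrCreate()

# Hypothetical training table; the column names are assumptions.
df = spark.table("analytics.customer_events")

# Encode a categorical column, assemble numeric features, and scale them --
# a typical MLlib preparation pipeline before model training.
pipeline = Pipeline(stages=[
    StringIndexer(inputCol="channel", outputCol="channel_idx", handleInvalid="keep"),
    VectorAssembler(inputCols=["channel_idx", "sessions", "spend"], outputCol="features_raw"),
    StandardScaler(inputCol="features_raw", outputCol="features"),
])

prepared = pipeline.fit(df).transform(df)
```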
Posted 3 days ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: SDET
Experience: 4 to 8 years
Location: Bangalore

Skills: big data platforms (Spark, Athena, Redshift, EMR), CI/CD pipelines, database testing, Java, QA automation (Selenium), SQL queries, JMeter, Kafka, Kinesis

Project Highlights
The project involves ensuring the quality and reliability of large-scale data systems through the development and automation of comprehensive test frameworks. You will design, automate, and execute tests for large-scale data applications, leveraging tools like Selenium, Great Expectations, and cloud platforms.

Roles and Responsibilities
- Develop and maintain automated test frameworks using Selenium, Java, and data quality tools.
- Automate testing for data platforms (Spark, Athena, Redshift, EMR).
- Collaborate with development teams to resolve quality issues.
- Integrate automated tests into CI/CD pipelines.
- Conduct code reviews and provide feedback on best practices.

Requirements
- 4-8 years in software testing, with 2+ years in a senior role.
- Expertise in Selenium, Java, and data validation tools (Great Expectations).
- Experience with big data platforms (Spark, Athena, Redshift, AWS).
- Knowledge of SQL and database testing.
- Familiarity with CI/CD tools (Jenkins, GitLab CI).

Skills: Amazon Redshift, CI/CD pipelines, big data platforms, Java, EMR, Redshift, SQL queries, SQL, database testing, Kafka, Selenium, big data, JMeter, Athena, CI/CD, Kinesis, QA automation, Spark
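The posting centers on Java/Selenium and Great Expectations; as a language-neutral sketch of the underlying idea of automated data-quality checks, here is a minimal pytest plus PySpark example. The table path and columns are assumptions for illustration only.

```python
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    # Local Spark session for running data-quality tests in CI.
    return SparkSession.builder.master("local[2]").appName("dq_tests").getOrCreate()

def test_orders_have_no_null_keys(spark):
    # Hypothetical curated table; path and columns are assumptions.
    df = spark.read.parquet("/data/curated/orders")
    assert df.filter(df.order_id.isNull()).count() == 0

def test_amounts_are_non_negative(spark):
    df = spark.read.parquet("/data/curated/orders")
    assert df.filter(df.amount < 0).count() == 0
```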
Posted 3 days ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
🚀 Copywriter – Words That Move People (and Metrics)

We're looking for a copywriter who can do more than just write: someone who can think, create, and convert. If you're obsessed with words, ideas, headlines, scroll-stopping content, and performance-backed storytelling, this one's for you.

As our Copywriter, you won't just write copy; you'll build narratives that elevate the brand, spark conversations, and drive results across all things marketing. From punchy social posts to full-funnel ad campaigns, you'll be the voice behind the voice.

What you'll do:
- Develop scroll-stopping, thumb-pausing creative ideas that align with our brand tone (and break the internet occasionally)
- Write copy that sells, tells, compels: across ad campaigns, social, website, emails, WhatsApp, video scripts, and more
- Translate briefs into bold narratives and performance-friendly copy
- Dive into audience mindsets, brand insights, and cultural moments to craft messages that actually matter
- Collaborate with designers, marketers, and growth teams to bring campaigns to life
- Edit, tweak, and polish until every word is tight and every message lands
- Keep testing (what works, what doesn't, and why) and double down on what drives results

What you bring:
- A Bachelor's degree in English, Journalism, Communications, Marketing, or anything that sharpened your pen
- 2+ years of experience slinging copy in fast-moving setups (EdTech or FinTech? Big plus!)
- A portfolio packed with smart, compelling work across digital formats
- A sixth sense for audience behavior and how to speak their language
- Strong command over tone, structure, and storytelling
- Comfort juggling creativity with conversion goals
- Bonus: understanding of SEO, performance marketing, and how to write for both humans and algorithms

If you think brand-first and write conversion-friendly, let's talk. Bring your boldest ideas, your sharpest lines, and let's make some noise.
Posted 3 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles and Responsibilities
- Develop modern data warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with the client architect and team members
- Orchestrate data pipelines via the Airflow scheduler (see the streaming sketch after this listing)

Skills and Qualifications
- Bachelor's and/or master's degree in computer science, or equivalent experience
- Must have 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python and Spark (PySpark)
- Must have experience in the AWS/Azure stack
- Desirable: ETL with batch and streaming (Kinesis)
- Experience building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, e.g., Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python/PySpark/Spark with Azure/AWS Databricks

Skills: Neo4j, Pig, MongoDB, PL/SQL, architect, Terraform, Hadoop, PySpark, Impala, Apache Kafka, ADFS, ETL, data warehouse, Spark, Azure, Databricks, RDBMS, Cassandra, AWS, Unix shell scripting, CircleCI, Python, Azure Synapse, Hive, Git, Kinesis, SQL
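For the Kafka streaming requirement mentioned above, a minimal Spark Structured Streaming sketch that lands a Kafka topic into a bronze Delta table might look like the following. The broker, topic, and paths are placeholder assumptions, and the Delta sink presumes a Databricks or delta-enabled runtime.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

# Hypothetical broker and topic -- illustrative assumptions only.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .select(F.col("value").cast("string").alias("payload"))
)

# Land the raw stream into a Delta table, with checkpointing for recovery.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/chk/events")
    .outputMode("append")
    .start("/delta/bronze/events")
)
```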
Posted 3 days ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles and Responsibilities
- Develop modern data warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with the client architect and team members
- Orchestrate data pipelines via the Airflow scheduler

Skills and Qualifications
- Bachelor's and/or master's degree in computer science, or equivalent experience
- Must have 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python and Spark (PySpark)
- Must have experience in the AWS/Azure stack
- Desirable: ETL with batch and streaming (Kinesis)
- Experience building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, e.g., Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python/PySpark/Spark with Azure/AWS Databricks

Skills: Neo4j, Pig, MongoDB, PL/SQL, architect, Terraform, Hadoop, PySpark, Impala, Apache Kafka, ADFS, ETL, data warehouse, Spark, Azure, Databricks, RDBMS, Cassandra, AWS, Unix shell scripting, CircleCI, Python, Azure Synapse, Hive, Git, Kinesis, SQL
Posted 3 days ago
10.0 years
0 Lacs
Greater Kolkata Area
On-site
Hiring for a Data Governance Lead in a leading bank.
Location: Kolkata

The Data Governance Lead will be responsible for establishing, implementing, and maintaining data governance policies, frameworks, and processes to ensure data quality, compliance, and security within the bank.

Requirements:
- Bachelor's/Master's degree in Computer Science, Information Management, Data Science, Engineering, or a related field
- Overall 10+ years of experience in data governance, data management, or related roles within the banking or financial services sector, with experience in enterprise data integration and management and in working with data warehouse technologies and data governance solutions
- 3+ years of practical experience configuring business glossaries, building dashboards, creating policies, etc., and executing at least 2 large data governance, quality and master data management projects from inception to production as a technology expert
- 3+ years of experience in data standardization: cleansing, transforming and parsing data
- Preferred certifications: CDMP, DAMA, EDM Council, IQINT
- Proficiency in data governance/quality tools (e.g., Collibra, Informatica, Alation)
- Proficient in data warehousing and pipelining; data engineering experience is good to have
- Experience working with Python, big data, Spark, etc., and cloud platforms (AWS, GCP, etc.)
Posted 3 days ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Who we are and what we do
NPST is a fintech company that has been bridging the banking and fintech worlds with its product suite of technology and payments for over 10 years. We provide software and digital payment solutions to the BFSI industry as a technology service provider. We function as a Technology Service Provider (TSP) and a Third-Party Application Provider (TPAP), catering to stakeholders across the financial value chain, including banks, merchant aggregators, merchants, and consumers. We were listed via an SME IPO in Aug 2021 on the NSE Emerge platform, with a market cap of 2000 Cr (as of Mar '24), and became an NPCI-approved Merchant Payment Service Provider, acquiring merchants and facilitating payments. NPST has a marquee clientele of 10 banks and 30+ PAPGs and merchants.

What will you do
We are looking for Java Developers with experience in building high-performing, scalable, enterprise-grade applications for the banking/fintech domain, specifically IMPS/UPI. You will be part of a talented software team. Roles and responsibilities include managing Java/Java EE application development while providing expertise in the full software development lifecycle, from concept and design to testing.

Job responsibilities (immediate joiners only):
- Hands-on experience designing and defining the architecture of complex web-based applications
- Building distributed applications using Core Java, Spring, and Spring Boot
- Experience with ORM frameworks such as Hibernate/JPA
- Working experience with SQL and NoSQL databases such as MySQL, PostgreSQL, Oracle, and MongoDB/Cassandra
- Hands-on experience with microservices and REST
- Continuous-integration tools like Jenkins and GitHub
- Knowledge of cloud platforms such as Amazon Web Services
- Hands-on knowledge of build tools like Maven and Gradle
- Experience working with Git, JUnit, Mockito, JIRA and similar tools
- Experience with Apache Spark and other machine learning tools is an added advantage

What we are looking for:
- Hands-on experience in Spring, Spring Boot, Hibernate, RDBMS, NoSQL, multithreading, Java, REST, XML parsers, web services, MySQL, Oracle, Cassandra, MongoDB
- Entrepreneurial skills: the ability to observe, innovate and own your work
- Detail-oriented and organized, with strong time management skills
- Influencing skills and the ability to create positive working relationships with team members at all levels
- Excellent communication and interpersonal skills
- A collaborative approach, working as a group to achieve organizational goals

Education Qualification: BTech/BCA/MCA/MTech
Experience: 3-5 years
Industry: IT/Software/BFSI/Banking/Fintech
Work arrangement: 5 days working from office
Location: Noida, Bangalore

What do we offer:
- An organization where we strongly believe in one organization, one goal
- A fun workplace which compels us to challenge ourselves and aim higher
- A team that strongly believes in collaboration and celebrating success together
- Benefits that resonate 'We Care'

If this opportunity excites you, we invite you to apply and contribute to our success story. If your resume is shortlisted, you will hear back from us.
Posted 3 days ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Title: Senior Data Engineer (AWS Expert)
Location: Ahmedabad
Experience: 5+ years
Company: IGNEK
Shift Time: 2 PM - 11 PM IST

About IGNEK:
IGNEK is a fast-growing custom software development company with over a decade of industry experience and a passionate team of 25+ experts. We specialize in crafting end-to-end digital solutions that empower businesses to scale efficiently and stay ahead in an ever-evolving digital world. At IGNEK, we believe in quality, innovation, and a people-first approach to solving real-world challenges through technology.

We are looking for a highly skilled and experienced Data Engineer with deep expertise in AWS cloud technologies and strong hands-on experience in backend development, data pipelines, and system design. The ideal candidate will take ownership of delivering robust and scalable solutions while collaborating closely with cross-functional teams and the tech lead.

Key Responsibilities:
- Lead and manage the end-to-end implementation of cloud-native data solutions on AWS.
- Design, build, and maintain scalable data pipelines (PySpark/Spark) and data lake architectures (Delta Lake 3.0 or similar).
- Migrate on-premises systems to modern, scalable AWS-based services.
- Engineer robust relational databases using Postgres or Oracle, with a strong understanding of procedural languages.
- Collaborate with the tech lead to understand business requirements and deliver practical, scalable solutions.
- Integrate newly developed features following defined SDLC standards, using CI/CD pipelines.
- Develop orchestration and automation workflows using tools like Apache Airflow.
- Ensure all solutions comply with security best practices, performance benchmarks, and cloud architecture standards.
- Monitor, debug, and troubleshoot issues across multiple environments.
- Stay current with new AWS features, services, and trends to drive continuous platform improvement.

Required Skills and Experience:
- 5+ years of professional experience in data engineering and backend development.
- Strong expertise in Python, Scala, and PySpark.
- Deep knowledge of AWS services: EC2, S3, Lambda, RDS, Kinesis, IAM, API Gateway, and others.
- Hands-on experience with Postgres or Oracle and building relational data stores.
- Experience with Spark clusters, Delta Lake, the Glue Catalog, and large-scale data processing.
- Proven track record of end-to-end project delivery and third-party system integrations.
- Solid understanding of microservices, serverless architectures, and distributed computing.
- Skilled in Java, Bash scripting, and search tools like Elasticsearch.
- Proficient with CI/CD tools (e.g., GitLab, GitHub, AWS CodePipeline).
- Experience working with Infrastructure as Code (IaC) using Terraform.
- Hands-on experience with Docker, containerization, and cloud-native deployments.

Preferred Qualifications:
- AWS certifications (e.g., AWS Certified Solutions Architect or similar).
- Exposure to Agile/Scrum project methodologies.
- Familiarity with Kubernetes, advanced networking, and cloud security practices.
- Experience managing or collaborating with onshore/offshore teams.

Soft Skills:
- Excellent communication and stakeholder management.
- Strong leadership and problem-solving abilities.
- Team player with a collaborative mindset.
- High ownership and accountability in delivering quality outcomes.

Why Join IGNEK?
- Work on exciting, large-scale digital transformation projects.
- Be part of a people-centric, innovation-driven culture.
- A flexible work environment and opportunities for continuous learning.

How to Apply:
Please send your resume and a cover letter detailing your experience to hr@ignek.com
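As a small, hedged illustration of working with one of the AWS services this role names (Kinesis via boto3), the snippet below publishes an event to a stream for downstream consumers. The stream name, region, and event shape are assumptions for illustration only.

```python
import json
import boto3

# Hypothetical stream name and region -- assumptions for illustration.
kinesis = boto3.client("kinesis", region_name="ap-south-1")

def publish_event(event: dict) -> None:
    """Push one event onto a Kinesis stream for downstream Spark consumers."""
    kinesis.put_record(
        StreamName="example-events",
        Data=json.dumps(event).encode("utf-8"),
        # Partition key controls shard assignment; keyed by user here.
        PartitionKey=str(event.get("user_id", "anonymous")),
    )

publish_event({"user_id": 42, "action": "signup"})
```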
Posted 3 days ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Hiring: Chief Technology Officer (CTO)
Location: Pune, India | Type: Full-Time

We are a fast-growing AdTech company seeking a visionary Chief Technology Officer (CTO) to lead our technology strategy and drive innovation across our platforms. This is a unique opportunity to shape the future of programmatic advertising and real-time bidding (RTB) by leading high-impact engineering and product teams.

As CTO, you will define and execute the technology roadmap, oversee the architecture and scaling of RTB, DSP, and SSP systems, and drive advancements in AI/ML, big data, and real-time analytics. You'll collaborate closely with leadership and cross-functional teams to deliver scalable, secure, and high-performance tech solutions.

Requirements:
- 10+ years of experience in software engineering, with 5+ years in leadership roles
- Deep expertise in AdTech (RTB, DSP, SSP, header bidding, OpenRTB)
- Strong background in AI/ML, cloud platforms (AWS/GCP), and big data technologies (Kafka, Spark, Hadoop)
- Proven track record in building scalable backend systems and leading agile teams

Why Join Us:
- Competitive compensation with equity options
- Lead the tech vision of a high-growth company
- Work on cutting-edge products with global impact
Posted 3 days ago
10.0 years
0 Lacs
Barmer, Rajasthan, India
On-site
Attention Rig Drillers! Flash Tech Consulting is on the hunt for a seasoned Rig Driller for our onshore operations. This is your gateway to an exciting opportunity where your expertise drives excellence in the field.

What We're Looking For:
- 10+ years of experience, including 4-5 years as a Driller: show us your deep-rooted skills and a robust track record in drilling.
- Proven land rig experience: you've mastered the challenges of onshore drilling.
- High-pressure and high-temperature expertise: you know how to safely and efficiently handle the toughest well conditions.
- Cutting-edge technology savvy: experience with the Amphion Cyber Chair and NOV operating system is a must.

How to Apply:
1. Prepare your resume: highlight your drilling accomplishments and technical expertise.
2. Detail your experience: explain how you've handled high-pressure and high-temperature wells on land rigs.
3. Showcase your tech know-how: tell us about your experience with industry-leading tools like the Amphion Cyber Chair and NOV operating system.

Step up and seize this opportunity to be part of our dynamic team. Apply now and take your career to new heights!

#India #FlashTechConsulting #Hiring #RigDriller #Driller #onshore #ApplyNow

Elevate your journey in the drilling world: your expertise is the spark that ignites our future!
Posted 3 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our technology services client is seeking multiple Azure Data & Analytics Engineers to join their team on a contract basis. These positions offer strong potential for conversion to full-time employment upon completion of the initial contract period. Further details about the role are below.

Role: Azure Data & Analytics Engineer
Mandatory Skills: Agile methodologies, Python, Databricks, Azure Cloud, Data Factory, data validations
Experience: 5-8 years
Location: PAN India
Notice Period: 0-15 days

Job Description:
- 5+ years of software solution development using an agile, DevOps, product-model approach, including designing, developing, and implementing large-scale applications or data engineering solutions
- 5+ years of data analytics experience using SQL
- 5+ years of full-stack development experience, preferably in Azure
- 5+ years of cloud development (Microsoft Azure preferred), including Azure Event Hubs, Azure Data Factory, Azure Functions, ADX, ASA, Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Power Apps, and Power BI
- 1+ years of FastAPI experience is a plus
- Airline industry experience

Skills, Licenses & Certifications:
- Expertise with the Azure technology stack for data management, data ingestion, capture, processing, and curation, and for creating consumption layers
- Azure Development Track certification (preferred)
- Spark certification (preferred)

If you are interested, share your updated resume with sushmitha.r@s3staff.com
Posted 3 days ago
6.0 years
0 Lacs
India
Remote
Job Title: Azure Data Engineer (6 Years' Experience)
Location: Remote
Employment Type: Full-time
Experience Required: 6 years

Job Summary:
We are seeking an experienced Azure Data Engineer to join our data team. The ideal candidate will have strong expertise in designing and implementing scalable data solutions on Microsoft Azure, with a solid foundation in data integration, data warehousing, and ETL processes.

Key Responsibilities:
- Design, build, and manage scalable data pipelines and data integration solutions in Azure
- Develop and optimize data lake and data warehouse solutions using Azure Data Lake, Azure Synapse, and Azure SQL
- Create ETL/ELT processes using Azure Data Factory
- Implement data modeling and data architecture best practices
- Collaborate with data analysts, data scientists, and business stakeholders to deliver reliable and efficient data solutions
- Monitor and troubleshoot data pipelines for performance and reliability
- Ensure data security, privacy, and compliance with organizational standards
- Automate data workflows and optimize performance using cloud-native tools

Required Skills:
- 6 years of experience in data engineering roles
- Strong experience with Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, and Azure SQL
- Proficiency in SQL, T-SQL, and scripting languages (Python or PowerShell)
- Hands-on experience with data ingestion, transformation, and orchestration
- Good understanding of data modeling (dimensional, star/snowflake schema)
- Experience with version control (Git) and CI/CD in data projects
- Familiarity with Azure DevOps and monitoring tools

Preferred Skills:
- Experience with Databricks or Spark on Azure
- Knowledge of data governance and metadata management tools
- Understanding of big data technologies and architecture
- Microsoft Certified: Azure Data Engineer Associate (preferred)
Posted 3 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles and Responsibilities
- Develop modern data warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with the client architect and team members
- Orchestrate data pipelines via the Airflow scheduler

Skills and Qualifications
- Bachelor's and/or master's degree in computer science, or equivalent experience
- Must have 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python and Spark (PySpark)
- Must have experience in the AWS/Azure stack
- Desirable: ETL with batch and streaming (Kinesis)
- Experience building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, e.g., Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python/PySpark/Spark with Azure/AWS Databricks

Skills: Neo4j, Pig, MongoDB, PL/SQL, architect, Terraform, Hadoop, PySpark, Impala, Apache Kafka, ADFS, ETL, data warehouse, Spark, Azure, Databricks, RDBMS, Cassandra, AWS, Unix shell scripting, CircleCI, Python, Azure Synapse, Hive, Git, Kinesis, SQL
Posted 3 days ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles and Responsibilities
- Develop modern data warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with the client architect and team members
- Orchestrate data pipelines via the Airflow scheduler

Skills and Qualifications
- Bachelor's and/or master's degree in computer science, or equivalent experience
- Must have 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python and Spark (PySpark)
- Must have experience in the AWS/Azure stack
- Desirable: ETL with batch and streaming (Kinesis)
- Experience building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming/event-based data
- Experience with other open-source big data products, e.g., Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Mandatory Skills: Python/PySpark/Spark with Azure/AWS Databricks

Skills: Neo4j, Pig, MongoDB, PL/SQL, architect, Terraform, Hadoop, PySpark, Impala, Apache Kafka, ADFS, ETL, data warehouse, Spark, Azure, Databricks, RDBMS, Cassandra, AWS, Unix shell scripting, CircleCI, Python, Azure Synapse, Hive, Git, Kinesis, SQL
Posted 3 days ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from Infosys BPM Ltd.

Exclusive women's walk-in drive. We are hiring for Content and Technical Writer, ETL DB Testing, ETL Testing Automation, .NET, and Python Developer skills. Please walk in for an interview on 20th June 2025 at our Chennai location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application, and mention the Candidate ID on top of your resume.

https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140

Interview details
Interview date: 20th June 2025
Interview time: 10 AM till 1 PM
Interview venue: TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, Tamil Nadu

Please find the job descriptions below for your reference:
- Work from office
- Rotational shifts
- A minimum of 2 years of project experience is mandatory

Job Description: Content and Technical Writer
- Develop high-quality technical documents, including user manuals, guides, and release notes.
- Collaborate with cross-functional teams to gather requirements and create accurate documentation.
- Conduct functional and manual testing to ensure compliance with FDA regulations.
- Ensure adherence to ISO standards and maintain a clean, organized document management system.
- Strong understanding of the infrastructure domain.
- A technical writer who can convert complex technical concepts into easy-to-consume documents for the targeted audience, and who will also mentor the team on technical writing.

Job Description: ETL DB Testing
- Strong experience in ETL testing, data warehousing, and business intelligence.
- Strong proficiency in SQL.
- Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
- Solid understanding of data warehousing concepts, database systems, and quality assurance.
- Experience with test planning, test case development, and test execution.
- Experience writing complex SQL queries and using SQL tools is a must; exposure to various data analytical functions.
- Familiarity with defect tracking tools (e.g., Jira).
- Experience with cloud platforms like AWS, Azure, or GCP is a plus.
- Experience with Python or other scripting languages for test automation is a plus.
- Experience with data quality tools is a plus.
- Experience in testing large datasets.
- Experience in agile development is a must.
- Understanding of Oracle Database and UNIX/VMS systems is a must.

Job Description: ETL Testing Automation
- Strong experience in ETL testing and automation.
- Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server).
- Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark).
- Hands-on experience developing and maintaining test automation frameworks.
- Proficiency in at least one programming language (e.g., Python, Java).
- Experience with test automation tools (e.g., Selenium, PyTest, JUnit).
- Strong understanding of data warehousing concepts and methodologies.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Experience with cloud-based data warehouses like Snowflake, Redshift, or BigQuery is a plus.
- Experience with data quality tools is a plus.

Job Description: .NET
- Should have worked on a .NET development/implementation/support project.
- Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, and Azure.
- Must have experience in web services, Web API, REST services, HTML, and CSS3.
- Understand architecture requirements and ensure effective design, development, validation and support activities.

REGISTRATION PROCESS:
The Candidate ID and SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to successfully complete the registration. (Candidates without registration and assessment will not be allowed into the interview.)

Candidate ID registration process:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared to the registered email ID.

SHL Test (AMCAT ID) registration process:
This assessment is proctored, and talent is evaluated on basic analytics, English comprehension, and writex (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, please make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots in the top-right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
- Personal details: name, email address, mobile number, PAN number.
- Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
- Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
- Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.

Interview mode: Walk-in

Attempt all questions in the SHL Assessment app. The assessment is proctored, so choose a quiet environment, and use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more toggles, multiple faces detected, face not detected, or any malpractice will result in rejection. Once you've finished, submit the assessment and make a note of the AMCAT ID (15 digits) used for the assessment.

Documents to carry:
- Please have a note of your Candidate ID and AMCAT ID, along with your registered email ID.
- Please carry 2 sets of your updated resume/CV (hard copy).
- Please carry original ID proof for security clearance.
- Please carry individual headphones/Bluetooth for the interview.

Pointers to note:
- Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.
- An original government ID card is a must for security clearance.

Regards,
Infosys BPM Recruitment team.
Posted 3 days ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description
We are seeking an experienced Social Media Marketing Manager to lead and execute our social media strategies across various platforms. This role will focus on creating engaging content, growing kPaisa's social media presence, and building meaningful connections with our audience to drive brand awareness, user acquisition, and customer retention.

Roles and Responsibilities
- Develop and implement social media strategies to increase brand visibility and drive user engagement.
- Create, curate, and manage content across all social media platforms (Facebook, Instagram, Twitter, LinkedIn, YouTube, etc.).
- Collaborate with the marketing team to align social media campaigns with broader marketing goals and objectives.
- Monitor social media trends and adapt strategies to stay ahead of competitors and emerging trends.
- Engage with followers and respond to comments, messages, and mentions to maintain a positive brand image.
- Analyze social media performance using analytics tools and provide regular reports on key metrics (reach, engagement, conversion).
- Work closely with influencers and partners on collaborations and cross-promotions.
- Run paid social media campaigns to drive traffic, increase brand awareness, and boost conversions.

Skills Required
- 2-5 years of experience in social media marketing or content management, ideally in fintech or BFSI.
- Strong understanding of social media platforms, their algorithms, and trends.
- Excellent written and verbal communication skills.
- Experience with social media management tools (e.g., Hootsuite, Buffer, Sprout Social) and analytics tools (e.g., Google Analytics, Facebook Insights).
- Ability to create visually appealing content and work with design tools (e.g., Canva, Adobe Spark).
- Knowledge of paid social media advertising and running campaigns on Facebook Ads, LinkedIn Ads, etc.
- Strong creative mindset with an ability to think outside the box and engage audiences in a crowded digital space.
- Good project management skills, with an ability to multitask and meet deadlines.

What we have to offer
- Flexible work hours
- First-hand fintech development opportunity
- Meritocracy-driven, candid startup culture
Posted 3 days ago
The demand for professionals with expertise in Spark is on the rise in India. Spark, an open-source distributed computing system, is widely used for big data processing and analytics. Job seekers in India looking to explore opportunities in Spark can find a variety of roles in different industries.
Major tech hubs, including the cities featured in the listings above such as Bengaluru, Chennai, Hyderabad, Pune, and the Delhi NCR, have a high concentration of tech companies and startups actively hiring for Spark roles.
The average salary range for Spark professionals in India varies by experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-25 lakhs per annum
Salaries may vary based on the company, location, and specific job requirements.
In the field of Spark, a typical career progression may look like:
- Junior Developer
- Senior Developer
- Tech Lead
- Architect
Advancing in this career path often requires gaining experience, acquiring additional skills, and taking on more responsibilities.
Apart from proficiency in Spark, professionals in this field are often expected to have knowledge of or experience with:
- Hadoop
- Java or Scala programming
- Data processing and analytics
- SQL databases
Having a combination of these skills can make a candidate more competitive in the job market.
As you explore opportunities in Spark jobs in India, remember to prepare thoroughly for interviews and showcase your expertise confidently. With the right skills and knowledge, you can excel in this growing field and advance your career in the tech industry. Good luck with your job search!