
GCP Data Engineer

Experience

5 - 10 years

Salary

5 - 15 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode

Hybrid

Job Type

Full Time

Job Description

IntraEdge is looking for Big Data engineers/developers who will collect, store, process, and analyze very large data sets and integrate them with the architecture used across the company.

Responsibilities

• Select and integrate the Big Data tools and frameworks required to provide requested capabilities.
• Partner with architects and other senior leads to address data needs.
• Partner with Data Scientists and product teams to build and deploy machine learning models that unlock growth.
• Build custom integrations and data pipelines between cloud-based systems using APIs.
• Write complex, efficient code to transform raw data sources into easily accessible models, using languages such as Python, Scala, or SQL.
• Design, develop, and test large-scale, custom distributed software systems using the latest Java, Scala, and Big Data technologies.
• Actively contribute to defining the technology strategy (design, architecture, and interfaces) to respond effectively to our clients' business needs.
• Participate in technology watch and the definition of standards to ensure that our systems and data warehouses are efficient, resilient, and durable.
• Use Informatica or similar products, with an understanding of heterogeneous data replication techniques.
• Build data expertise and own data quality for the pipelines you create.

Skills and Qualifications

• Bachelor's/Master's degree in Computer Science, Management of Information Systems, or equivalent.
• 4 or more years of relevant software engineering experience (Big Data: Hive, Spark, Kafka, Cassandra, Scala, Python, SQL) in a data-focused role.
• Experience in GCP building batch/streaming ETL pipelines with frameworks such as Spark, Spark Streaming, and Apache Beam, and working with messaging systems such as Pub/Sub and Kafka (an illustrative sketch follows this description).
• Working experience with Java tools or Apache Camel.
• Experience designing and building highly scalable and reliable data pipelines using Big Data tools (Airflow, Python, Redshift/Snowflake).
• Software development experience with proficiency in Python, Java, Scala, or another language.
• Good knowledge of Big Data querying tools such as Hive, and experience with Spark/PySpark.
• Good knowledge of SQL and Python.
• Ability to analyze complex/large data sets and derive insights from them.
• Design and develop high-performing SQL Server database objects.

Experience: 5-10 years
Notice period: Serving notice period / immediate joiners / maximum 30 days
Location: Gurugram / Bangalore / Pune / Remote
Salary: Decent hike on current CTC
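
The following is a minimal, hypothetical PySpark sketch of the kind of batch ETL work described above; the bucket paths, column names, and aggregate are illustrative assumptions, not IntraEdge's actual pipelines.

    from pyspark.sql import SparkSession, functions as F

    # Minimal batch ETL sketch: raw JSON events -> aggregated Parquet model.
    # Paths, column names, and the aggregate are hypothetical examples;
    # reading gs:// paths assumes the GCS connector is available on the cluster.
    spark = SparkSession.builder.appName("raw-events-to-daily-model").getOrCreate()

    raw = spark.read.json("gs://example-bucket/raw/events/")

    model = (
        raw
        .filter(F.col("event_type").isNotNull())          # drop malformed rows
        .withColumn("event_date", F.to_date("event_ts"))  # derive a partition column
        .groupBy("event_date", "event_type")
        .agg(F.count("*").alias("event_count"))           # simple daily aggregate
    )

    model.write.mode("overwrite").partitionBy("event_date").parquet(
        "gs://example-bucket/models/daily_event_counts/"
    )

    spark.stop()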

IntraEdge Technology

Information Technology

Phoenix

51-200 Employees

54 Jobs

