At Apple, outstanding ideas have a way of becoming phenomenal products, services, and customer experiences very quickly. Bring passion and dedication to your job and there’s no telling what you could accomplish. Would you like to work in a fast-paced environment where your technical abilities will be challenged on a day-to-day basis?
If you are passionate about building end-to-end, large-scale data solutions, Apple's Global Business Intelligence team is looking for a seasoned Data Warehouse Solutions Engineer with a deep understanding of ETL and data modeling concepts. Apple's Enterprise Data Warehouse landscape caters to a wide variety of real-time, near-real-time, and batch analytical solutions. These solutions are an integral part of business functions like Sales, Operations, Finance, AppleCare, Marketing, and Internet Services, enabling business leaders to make critical decisions.
In this role, you will be part of a large development team designing and building systems across diverse big data solutions: MPP databases (Teradata, Vertica), Hadoop, NoSQL solutions (Cassandra), and other big data technologies such as Kafka and Spark. You will define standards and methodologies and help drive adoption of our latest frameworks. You will be directly responsible and accountable for critical data solutions across various business functions.
- Bachelor's or Master's degree, or equivalent
- In-depth understanding of data structures, algorithms, and end-to-end solution design
- Experience managing and processing large data sets on multi-server, distributed systems, from inception to execution; experience with columnar, NoSQL, and key-value databases at big data scale preferred
- Big data ecosystem programming experience highly desirable, especially with Java, Spark, and Kafka
- Architecting and implementing serverless architectures and containerized microservices from scratch is a big plus
- Experience in designing and building dimensional data models to improve accessibility, efficiency, and quality of data
- Programming experience building high-quality software; Java or Scala preferred
- Experience designing and developing ETL data pipelines; proficiency in writing advanced SQL and expertise in SQL performance tuning
- Expert knowledge of distributed computing, parallel programming, concurrency control, and transaction processing
- Strong understanding of development processes and agile methodologies
- Strong analytical and communication skills
- Self-driven, highly motivated, and able to learn quickly
- Experience with, or advanced coursework in, data science and machine learning is a plus
- Work/project experience with big data and advanced programming languages is a plus
As a Data Warehouse Solutions Engineer, you will:
- Drive, design, and develop data processing pipelines, applications, and tools that promote product stability, reliability, and maintainability
- Design and build data structures on MPP platforms like Teradata or Hadoop to provide efficient reporting and analytics capabilities
- Design and build highly scalable data pipelines using new generation tools and technologies like Spark and Kafka.
- Lead innovative efforts in processing data accurately, at scale.
- Build, deploy, and support production services in distributed environments; mentor other developers, define standards and best practices, and help drive adoption
- Benchmark application performance and continue to tune and scale to accommodate growth
- Self-starter and visionary with strong leadership capabilities; able to communicate effectively, both in writing and verbally, with technical and non-technical cross-functional teams
- You will interact with many other internal groups to lead and deliver best-in-class products in an exciting, fast-paced environment
- Dynamic, smart people and inspiring, innovative technologies are the norm here.
Job/Req. ID: 114185582
Company: Apple Inc.
Job Category: Software/IT