Principal Engineer I-Advanced Analytics
Posted on: June 19, 2017
Client Reference Code: 196035
Charter Communications is America's fastest growing TV, internet and voice company. We're committed to integrating the highest quality service with superior entertainment and communications products. Charter is at the intersection of technology and entertainment, facilitating essential communications that connect 24 million residential and business customers in 41 states. Our commitment to serving customers and exceeding their expectations is the bedrock of Charter's business strategy and it's the philosophy that guides our 90,000 employees.
Advanced Analytics has implemented and is operating a new advanced Big Data analytics platform that has enabled new self-service analytics, decision engineering support, machine learning, modeling, forecasting, and optimizations. It is anticipated that by the end of 2017, there will be 2.6 petabytes of complex analytics data sets supporting Charter's Advanced Engineering organization. This position is responsible for creating and maintaining scalable, reliable, consistent and repeatable systems that support data operations for Advanced Analytics by gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.). The role profiles data to measure quality, integrity, accuracy, and completeness, and delivers solutions by developing, testing, and implementing code and scripts via (but not limited to) Python, Perl, shell scripts, etc. It also manages the life cycle of multiple data sources and increases speed to delivery by implementing workload/workflow automation solutions.
MAJOR DUTIES AND RESPONSIBILITIES
Create and maintain scalable, reliable, consistent and repeatable systems that support data operations for Advanced Analytics.
Gather and process raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.)
Profile data to measure quality, integrity, accuracy, and completeness
Develop and implement tools, scripts, queries, and applications for ETL/ELT and data operations.
Use a wide variety of open source technologies and cloud services.
Deliver solutions by developing, testing, and implementing code and scripts via (but not limited to) Python, Perl, and shell scripts.
IP, DNS, DHCP, and security configuration and administration experience on Linux/Unix/CentOS and Windows.
Produces reports and upholds data delivery schedules
Manages life cycle of multiple data sources
Works closely with stakeholders on the data demand side (analysts and data scientists).
Works closely with stakeholders on the data supply side (domain experts on source systems of the data).
Builds self-monitoring, robust, scalable interfaces and data pipelines for 24/7 operations.
Creates highly reusable code modules and packages that can be leveraged across the data pipeline.
Increases speed to delivery by implementing workload/workflow automation solutions.
REQUIRED QUALIFICATIONS
A focus on key business results: demonstrates care for customers, is metrics-driven, and can communicate the costs and tradeoffs of ideas to top management.
Strong experience with SQL, MySQL, and other database technologies with NoSQL and columnar data store experience a plus.
Strong background in Linux/Unix/CentOS installation and administration; Windows experience a plus.
Expertise in data storage that demonstrates knowledge of when to use a file system, relational database, or NoSQL variant.
Ability to identify and resolve end-to-end performance, network, server, and platform issues.
Keen attention to detail with the ability to effectively prioritize and execute multiple tasks.
Ability to read, write, speak and understand English.
Familiarity with data workflow/data prep platforms, such as Alteryx, Pentaho, or KNIME.
Familiarity with automation/configuration management using either Puppet, Chef or an equivalent.
Knowledge of best practices and IT operations in an always-up, always-available service.
Experience receiving, converting, and cleansing big data.
Experience with visualization or BI tools, such as Tableau, Zoomdata, MicroStrategy, or Microsoft Power BI.
Experience creating proof-of-concept experiments for analytics, machine learning, or visualization tools, including hypotheses, test plans, and outcome analysis.
EDUCATION
Bachelor's degree in an engineering discipline or computer science.
Master of Science in an engineering discipline, computer engineering, or computer science.
RELATED WORK EXPERIENCE
7-10+ years of Linux/Unix/CentOS system administration; Windows experience a plus.
7-10+ years of hands-on working experience with RDBMS, SQL, scripting, and coding.
Experience delivering at least one major system for which the candidate was responsible for designing the architecture as well as implementation, operation, and support.
Charter Technical Engineering Center
Highly collaborative and innovative workspace
Charter is an equal opportunity employer that complies with the laws and regulations set forth in the EEO Is the Law poster.
Charter is committed to diversity, and values the ways in which we are different.
Job Code : TWCEGN330 Principal Engineer I Exempt
Keywords: Spectrum, Fort Collins, Principal Engineer I-Advanced Analytics, Engineering, Englewood , Colorado