San Francisco Hadoop Users Message Board › Charles Schwab / Sr Staff - Software Application Engineer (Hadoop/Data Intelligence)

Charles Schwab / Sr Staff - Software Application Engineer (Hadoop/Data Intelligence) / San Francisco

A former member
Post #: 8
To apply for this opportunity please follow this link:
Charles Schwab Sr Software Application Engineer

Charles Schwab has been a leader in financial services for nearly four decades, working to make investing more affordable, accessible, and understandable to all. Driven by our purpose to champion every client's goals with passion and integrity, we're committed to providing an environment that respects and appreciates the diversity of our employees, our clients, and the communities we serve. Our goal, as seen through clients' eyes, is that Schwab continuously improves as a premier financial services provider through best-in-class service, technology, products, people, and advice.

Organizational Objective/Purpose:

Hadoop Information Platform Solutions is seeking a Java Developer to support our Hadoop initiatives. This role supports data intelligence efforts at Schwab that will evolve Schwab's marketing and analytics capabilities, and it is a critical part of helping our business partners make informed business decisions that lead to best-of-breed client communications and analytical direction.

Brief Description of Role:

In this role, you will:
• Partner in designing/architecting new solutions for the Hadoop platform, including real-time processing
• Regularly interface with technical, marketing, analytics, and client experience partners to provide clean, reusable data of the highest quality, then process it to help them develop analytical direction for marketing
• Play a lead role in Hadoop development, data analysis, production control, and quality assurance
• Help us fine-tune our data quality control processes to ensure pristine data accuracy and quality
In addition to highly efficient and accurate Hadoop, Perl, and SQL programming skills (and ideally some strong Linux/UNIX shell scripting skills as well), this role also requires customer support skills.

Technical/Functional Qualifications:

To be successful in this role, you should have:
• Hadoop/Pig/Hive development experience
• Data science or data intelligence experience
• Strong SQL programming, data mining, and data analysis experience
• Strong Java, Perl, shell scripting, UNIX, and Linux skills
• Experience monitoring complex production systems, with an emphasis on scheduling and reporting
• The ability to carry out complex assignments requiring both data analysis skills and a solid understanding of relevant business issues
• Strong data mining and data analytics expertise, capable of researching Schwab Systems of Record and databases independently and confidently
• A knack for developing solid processes through efficient allocation of resources, good design and quality control techniques
• Bachelor's degree in Computer Science, Business, Engineering, Mathematics or equivalent
