Job Description

Position Summary

Senior-level Hadoop Tech Manager responsible for leading multiple projects in the Hadoop data lake, including Hadoop technical deliverables as per business needs. Derive specifications from business requirements, design solutions that support core organizational functions, and ensure their high availability and other performance goals. Provide integrated infrastructure-related technical expertise across the organization, from conceptualization and project planning through post-implementation support. Responsible for assisting in the complete life cycle of a Hadoop project implementation. Assist in creating big data architecture patterns/frameworks for ETL activities and analytics.

Essential Duties and Responsibilities:

The following is a summary of the essential functions for this job. Other duties, both major and minor, may be performed that are not mentioned below. Specific activities may change from time to time.

- Sound understanding of and experience with the Hadoop ecosystem (Cloudera).
- Able to understand and explore the constantly evolving tools within the Hadoop ecosystem and apply them appropriately to the relevant problems at hand.
- Experience working with a Big Data implementation in a production environment.
- Experience with Java, Spark, MapReduce, RDBMS, Hive, Pig, Impala, Scala, Kafka, Flume, and Linux/Unix technologies.
- Sound knowledge of relational databases (SQL) and experience with large SQL-based systems.
- Strong IT consulting experience in various data warehousing engagements, handling large data volumes, and architecting big data environments.
- Deep understanding of algorithms, data structures, performance optimization techniques, and software development in a team environment.
- Benchmark and debug critical issues with algorithms and software as they arise.
- Lead and assist with the technical design/architecture and implementation of the big data cluster in various environments.
- Able to guide/mentor the development team, for example to create custom common utilities/libraries that can be reused across multiple big data development efforts.
- Exposure to ETL tools (e.g., DataStage) and NoSQL databases (HBase, Cassandra, MongoDB).
- Provide infrastructure system expertise, requirements, and assistance to Systems and Database Administrators, other System Architects, and application development teams.
- Work with line of business (LOB) personnel, external vendors, and the internal Data Services team to develop system specifications in compliance with corporate standards for architecture adherence and performance guidelines.
- Provide technical resources to assist in the design, testing, and implementation of software code and infrastructure to support data infrastructure and governance activities.
- Assist with both external and internal audit questionnaires and application assessments.
- Assess the current technical architecture and estimate system capacity to meet near- and long-term processing requirements.
- Evaluate, select, test, and optimize hardware and software products.
- Document the Bank's existing solution architecture and technology portfolio and make recommendations for improvements and/or alternatives.
- Liaise with Enterprise Architects to conduct research on emerging technologies and recommend technologies that will increase operational efficiency, infrastructure flexibility, and operational stability.
- Develop, document, communicate, and enforce a policy for standardizing systems and software, as necessary.
- Instruct, direct, and mentor other members of the team.
- Support multiple projects with competing deadlines.
- Good understanding of Lambda Architecture, along with its advantages and drawbacks.

Required Skills:

- Strong leadership and Hadoop technical/architecture understanding to interact with and manage a cross-functional team of architects, data engineers, information analysts, and platform teams.
- Experience managing direct reports and handling career management of technical resources.
- Knowledge of Big Data on the cloud.
- Experience collaborating with Big Data partners such as Cloudera, Hortonworks, etc. for upgrades, issue/escalation management, etc.
- Experience managing SOWs and vendors.
- Bachelor's degree in a technical or business-related field, or equivalent education and related training.
- Ten years of experience in data warehousing architectural approaches and a minimum of four years in big data (Cloudera).
- Exposure to and strong working knowledge of distributed systems.
- Excellent understanding of client-service models and customer orientation in service delivery.
- Ability to derive technical specifications from business requirements and express complex technical concepts, both verbally and in writing, in terms that are understandable to multi-disciplinary teams.
- Ability to grasp the 'big picture' for a solution by considering all potential options in the impacted area.
- Aptitude to understand and adapt to newer technologies.
- Demonstrated proficiency in basic computer applications, such as Microsoft Office software products.
- Provide technical architecture and consulting support.
- Assist in the evaluation of new solutions for integration into the Hadoop Roadmap/Strategy.
- Participate in deep architectural discussions to build confidence and ensure customer success when building new applications and migrating existing applications, software, and services onto the platform.
- Motivate internal and external resources to deliver on project commitments.
- The desire to learn new soft and technology skills, and the desire to coach, mentor, and train peers throughout the organization.
- The ability to work with teammates in a collaborative manner to achieve a mission.
- Presentation skills to prepare and present to large and small groups on technical and functional topics.
- Ability to travel, occasionally overnight.

Desired Skills:

- Previous experience in the financial services industry.
- Broad BofA technical experience, a good understanding of existing testing/operational processes, and an open mind on how to enhance them.
- Understanding of industry trends and relevant application technologies.
- Experience designing and implementing analytical environments and business intelligence solutions.