Q&A with Baylor Medical College HGSC CISO B. Kim Andrews, Ph.D.

Higher education cybersecurity expert shares his take on keeping the college secure

TalaTek recently sat down with B. Kim Andrews, CISO of the Baylor Medical College Human Genome Sequencing Center and a TalaTek client, to discuss his views on cybersecurity.

Dr. Andrews shared his take on the pressing cyber issues affecting higher education, given the diverse and often conflicting requirements he and his team encounter in serving a broad range of constituents.

Here’s what he had to say:

Q: What cyber threats should faculty/staff be aware of as they do their jobs?

  • Cyberattacks happen everywhere, all the time, and are increasing. Denial reflects ignorance, since:
    • Hackers/attackers evolve and adapt their methods after each successful defense against their malware, which means:
    • Any security in place at any point in time necessarily becomes obsolete if it cannot manage change and adapt.
  • Academic campuses are attractive targets (honey-pots) for attackers relative to government and industrial IT environments because:
    • Executive leadership is faculty/center-focused, and their widely diverse needs push back on IT staff mandates to establish and maintain standard practices.
    • Identifying and establishing security as an institutional priority must come from the top down. However, a successfully implemented plan requires that faculty/staff are willing to work with their IT security staff to adapt their workflows/processes to the new paradigm, and with security professionals who can execute this migration with minimal impact on their work and workflows.
    • Campuses generally are required to maintain a more open computing environment, restricting activities only when mandated or necessary.
    • They typically have older operating systems and libraries that are locked into community-developed discipline-specific software as well as (particularly for laboratory instruments) operating systems that are anchored to proprietary and expensive component drivers.

Q: What are the consequences of not following best practices, and not being proactive?

  • The most obvious consequence is being hacked through a system vulnerability, which could result in the usual litany of issues:
    • Loss of data
    • Loss of reputation
    • Interruption of business
    • Financial loss for the institution
    • Financial loss for individuals within the institution
    • Future stringent, mandated compliance
  • There is also an intrinsic value to following best practices because they help protect you from an even more likely and insidious evil – human error.
  • Almost every system admin, software developer or software engineer I have taken through the process of implementing a compliance security framework has admitted that every compliance activity (outside the administrative burden of reporting) added value to their workflows and simply made good sense to practice.
  • The main consequence is what I call “anchor and drift”. Hardware and operating systems must continually evolve, or drift forward, to remain secure from hackers. However, they are “anchored” to the requirements of the software, libraries and operating systems (OSs) that are used to conduct the enterprise. Unless the entire cyberinfrastructure, as a whole, drifts forward together, then the entire system will drift apart:
    • If software applications become anchored, a critical subset of vulnerable/unmanaged software can neither be secured nor survive a security challenge, requiring the institution to maintain business impact analyses (BIAs) in perpetuity to accommodate unsupportable, mission-critical vulnerable applications.
    • If, much worse, the cyberinfrastructure is dedicated to anchored software, new OS security patches are now unable to be applied because they “break” applications and libraries for the anchored software.
      • Once the OS is anchored to this software, new applications cannot be ported to it, because the OS must remain at the version required by the critical older software.
    • Old software becomes unsupportable on new machines because its code is anchored to an older OS, and it must be relegated to network isolation or (if possible) to insecure VMs that mimic the old OS in perpetuity.


Q: What actionable steps can institutions take to protect themselves (best practices, tools or software they can use, etc.)?

I’ll answer this in two parts. First, a few specific operational tools that we use, many of which are free; in academia, you need as much free software as you can get. We deploy a mix of commercial and open source industry-standard software to:

  • Manage configuration (Puppet, Chef, Red Hat Satellite)
  • Monitor resources (Nagios, SolarWinds, Cacti, Splunk)
  • Schedule batch jobs (Moab, Torque)
  • Monitor system component availability and utilization (Ganglia plus internally developed applications)
  • Track problems (Request Tracker)
  • Manage hardware and software assets (RackView)
  • Manage resource allocation and reporting (Splunk, Moab Accounting Manager)
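To illustrate the monitoring side, Nagios-style checks are simply scripts that report status through their exit code (0=OK, 1=WARNING, 2=CRITICAL, 3=UNKNOWN). The sketch below is a hypothetical minimal disk-usage check in Python, not one of the HGSC’s actual plugins; the thresholds and path are illustrative assumptions.

```python
#!/usr/bin/env python3
"""Minimal sketch of a Nagios-style check plugin (hypothetical example).

Nagios interprets the plugin's exit code: 0=OK, 1=WARNING, 2=CRITICAL.
"""
import shutil
import sys


def check_disk(path="/", warn_pct=80, crit_pct=90):
    """Return (exit_code, message) based on disk usage of `path`."""
    usage = shutil.disk_usage(path)
    used_pct = 100 * usage.used / usage.total
    if used_pct >= crit_pct:
        return 2, f"CRITICAL - {path} is {used_pct:.1f}% used"
    if used_pct >= warn_pct:
        return 1, f"WARNING - {path} is {used_pct:.1f}% used"
    return 0, f"OK - {path} is {used_pct:.1f}% used"


if __name__ == "__main__":
    code, message = check_disk()
    print(message)  # Nagios shows the first line of output in its UI
    sys.exit(code)
```

A check like this would be referenced from a Nagios `command` definition and scheduled against each host; the same exit-code convention works with most Nagios-compatible monitors.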

Second, if you haven’t already begun, here are some strategic and tactical steps to begin the enculturation of security across your institution.

Perform a risk assessment (preferably through a third-party vendor to maximize efficacy and avoid politics). In any environment, you need to gather hard data and use this information to:

  • Inform and educate executive leadership as to potential security risks and the consequences to the institution of ignoring them.
  • Drive engagements between IT security advocates and the institution’s business and financial decision-makers that can reveal risk posture and show real value in adopting a more secure environment.

Help build consensus with the faculty and staff by communicating these gaps and how one can close them.

NOTE: I assume that government and industry IT environments have a fundamentally different paradigm and attitude toward IT security than is found in academia. There are many reasons for this, but the difference can easily be seen in something as simple as the pushback from academic faculty/staff when IT attempts to install security client software on a faculty member’s PC, relative to the mandated, fully managed PC environments found in corporate and government institutions.