    MTU Hosts International Collegiate Programming Contest (ICPC)

    On Saturday, Oct. 28, Michigan Tech hosted a site of the North Central North American (NCNA) region of the International Collegiate Programming Contest (ICPC). Locally, 11 teams competed: 7 from Michigan Tech and 4 from NMU. Across the region, 207 teams competed. The top team in the region, from the South Dakota School of Mines, solved 8 of the 10 problems in the five-hour competition. The top three teams from Michigan Tech, each solving 4 problems, were as follows (full standings are available at https://ncna17.kattis.com/standings):
    • MTU White, region rank 12
      Anthony Marcich, 4th year Math major
      Nick Olinger, 3rd year Math major
      Jay Honnold, 4th year CS major
    • MTU Red, region rank 13
      Justin Evankovich, 4th year EE major
      Nicolas Muggio, 4th year Software Engineering major
      Antony Duda, 4th year CE major
    • MTU Purple, region rank 16
      Michael Lay, 3rd year Software Engineering major
      Marcus Stojcevich, 3rd year CS major
      Parker Russcher, 3rd year CS major

    Two other teams, MTU Orange (Evan de Jesus, Paul Wrubel, Dylan Gaines) and MTU Black (Isaac Smith, Austin Walhof, Ryan Philipps), finished in the top 50 teams of the region.

    Congratulations to all participants in this year’s event.
    ~Laura Brown, Associate Professor, Computer Science

    Microdevice for Rapid Blood Typing without Reagents and Hematocrit Determination – STTR: Phase II

    Michigan Tech Associate Professor Laura Brown (co-PI) and Robert Minerick (PI) of Microdevice Engineering, Inc. received a new award from the National Science Foundation. The award supports the broader impact and commercial potential of developing a portable, low-cost blood typing and anemia screening device for use in blood donation centers, hospitals, humanitarian efforts, and the military.

    The device makes it possible to pre-screen donors by blood type and selectively direct the donation process (e.g., plasma or red cells) to reduce blood product waste and better match supply with hospital demand. The portable technology could also be deployed in remote geographic locations for disaster relief.

    The proposed project will advance knowledge across multiple fields, including microfluidics and the use of electric fields to characterize cells, in order to identify the molecular expression on blood cells responsible for ABO-Rh blood type and to rapidly measure cell concentration. The project also includes the development of software for real-time tracking of cell population motion, and it adapts advanced pattern recognition tools, such as machine learning and statistical analysis, to identify features and predict blood types.
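
    To make the pattern-recognition piece concrete, here is a minimal sketch of the kind of classifier involved, assuming a handful of measured cell-response features. The feature names, synthetic data, and model choice below are illustrative assumptions, not the project’s actual pipeline.

    ```python
    # Hypothetical sketch: predict ABO-Rh blood type from cell-response features.
    # All features and data here are synthetic placeholders.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    # Placeholder features, e.g., cell response under an applied electric field,
    # measured cell velocity, and estimated cell concentration.
    n_samples = 400
    X = rng.normal(size=(n_samples, 3))
    y = rng.choice(["A+", "A-", "B+", "O+"], size=n_samples)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
    ```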


    Power Grids and People

    Today’s infrastructure is connected in ways that often go unnoticed until problems arise: extreme weather, disease outbreaks, major accidents, terrorism, or cyber threats.

    Say fuel delivery will be delayed. What can be done?

    Sixteen critical infrastructure sectors (including water, gas, energy, communications, and transportation) are linked and interdependent. The National Science Foundation is supporting new fundamental research to transform infrastructure from physical structures into responsive systems. The Critical Resilient Interdependent Infrastructure Systems and Processes (CRISP) program supports a collaborative project for Laura Brown, along with Wayne Weaver and Chee-Wooi Ten, associate professors of electrical and computer engineering at Michigan Tech, and colleagues from the University of New Mexico, Texas Tech University, the University of Tennessee–Knoxville, and the Fraunhofer USA Center for Sustainable Energy Systems.


    Motivated by distributed renewable resources like solar panels and wind turbines, Brown and her research partners seek to ensure the resiliency of three interdependent networks: the electrical grid, telecommunications, and related socio-economic behavior. The team will look at how people react to power management in extreme conditions. Understanding and modeling human responses is necessary in the design of intelligent systems and programs embedded in devices that control and consume power.


    Tiny Microgrids, Fiercely Important

    A microgrid is a standalone power grid that requires generation capabilities (often generators, batteries, or renewable resources) plus control methods to maintain power flow. Electronics, appliances, and heating or cooling consume that power. In this project, Laura Brown and other Michigan Tech researchers are investigating a control system for such microgrids that is autonomous, able to work in isolation, and agile, able to adapt to rapid changes in the grid’s configuration as sources and consumers of power come and go.


    The world of microgrids is layered, and each layer has a different purpose and speed. For stable power, the controls for a microgrid are organized hierarchically: low-level control responds to the fastest events and keeps voltages and currents in the system stable and regulated, while the upper layer of control handles power distribution, optimization, and longer-term planning and prediction of resource availability and use. Brown’s work focuses on this high-level analysis, predicting resources at several timescales: the next few minutes, the next hours, the next days. What if a generator is out of service for maintenance? What can be done? Brown applies artificial intelligence and machine learning, together with expertise from other domains, to decide when to turn off non-critical resources or bring new power sources online.
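
    As a rough illustration of that upper prediction layer, the sketch below forecasts available generation at several horizons from recent history. The synthetic solar series, the lagged-feature setup, and the linear model are assumptions for illustration only, not the project’s control system.

    ```python
    # Hypothetical sketch: multi-horizon forecasting of a resource (here, a
    # synthetic solar-output series sampled every 15 minutes).
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(1)
    t = np.arange(2000)
    solar = np.clip(np.sin(2 * np.pi * t / 96), 0, None) + 0.05 * rng.normal(size=t.size)

    def lagged(series, n_lags, horizon):
        """Build (X, y) where X holds the last n_lags values and y the value
        `horizon` steps ahead."""
        X, y = [], []
        for i in range(n_lags, len(series) - horizon):
            X.append(series[i - n_lags:i])
            y.append(series[i + horizon])
        return np.array(X), np.array(y)

    for horizon, label in [(1, "next 15 min"), (4, "next hour"), (96, "next day")]:
        X, y = lagged(solar, n_lags=96, horizon=horizon)
        split = int(0.8 * len(X))
        model = Ridge().fit(X[:split], y[:split])
        err = np.mean(np.abs(model.predict(X[split:]) - y[split:]))
        print(f"{label}: mean absolute error {err:.3f}")
    ```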

    The United States Department of Defense and the Army Research Lab seek the expertise of interdisciplinary Michigan Tech researchers to solve, prevent, and adapt to these potential real-world scenarios.


    Transfer Learning in Data Centers

    Faster apps. More memory. Laura Brown and Zhenlin Wang bring efficiency to Big Data.

    What memory resources will be available if applications A, B, and C all run together?

    Big companies like Amazon and Google have even bigger data centers. Think 30 data centers each with 50,000 to 80,000 servers. And the underlying computer processors are not all identical; each year new improvements are integrated and added. Brown, Wang, and computer science colleagues from Western Michigan University are digging deep into the management of memory resources in these larger-than-life data centers.

    The researchers use machine-learning techniques to create models that predict the cache and memory requirements of an application.
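
    As a sketch of what such a model might look like, the snippet below fits a regression from hypothetical run-time counters to an application’s peak memory use. The counter names, synthetic data, and model are assumptions for illustration, not the researchers’ actual system.

    ```python
    # Hypothetical sketch: predict an application's peak memory use from
    # observable run-time counters (all synthetic here).
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    n_runs = 500
    # Placeholder counters: instructions retired, cache miss rate, working-set estimate.
    counters = rng.uniform(size=(n_runs, 3))
    peak_memory_gb = 2.0 + 8.0 * counters[:, 2] + 0.5 * rng.normal(size=n_runs)

    model = GradientBoostingRegressor(random_state=0)
    scores = cross_val_score(model, counters, peak_memory_gb,
                             scoring="neg_mean_absolute_error", cv=5)
    print("cross-validated MAE (GB):", round(-scores.mean(), 3))
    ```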


    The challenge is making accurate predictions given the massive variety of applications that use a data center and the different computers each application may run on. Applications might include Netflix streaming a movie, Airbnb running database queries, or NASA processing satellite images. Apps are not run in isolation on dedicated machines; to maximize resources, data centers may run two or more applications on a single machine.

    “If we learn the memory requirements of application A on computer X, what if the same app runs on machine Y or machine Z? Or, what memory resources will be available if A, B, and C all run together?” Brown asks.
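
    A minimal sketch of that transfer idea, under assumed synthetic data: a model trained on plentiful runs from one machine type is adapted to a new machine type using only a few measurements. The machines, features, and continued-training strategy are illustrative assumptions, not the project’s method.

    ```python
    # Hypothetical sketch: adapt a memory-use model from "machine X" to
    # "machine Y" by continuing training on a handful of target samples.
    import numpy as np
    from sklearn.linear_model import SGDRegressor

    rng = np.random.default_rng(3)

    def runs(n, slope, noise=0.2):
        """Synthetic (features, memory-use) pairs for one machine type."""
        X = rng.uniform(size=(n, 2))
        y = slope[0] * X[:, 0] + slope[1] * X[:, 1] + noise * rng.normal(size=n)
        return X, y

    X_src, y_src = runs(1000, slope=(4.0, 1.0))   # machine X: lots of data
    X_tgt, y_tgt = runs(30, slope=(5.0, 1.5))     # machine Y: only a few runs

    model = SGDRegressor(random_state=0)
    model.fit(X_src, y_src)                       # learn on machine X
    for _ in range(20):                           # brief adaptation to machine Y
        model.partial_fit(X_tgt[:10], y_tgt[:10])

    err = np.mean(np.abs(model.predict(X_tgt[10:]) - y_tgt[10:]))
    print("adapted model MAE on machine Y:", round(err, 3))
    ```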