Category: Uncategorized

  • Tom Wolf PhD ’81 receives the 2023 Robert A. Muh Alumni Award

<p>The MIT School of Humanities, Arts, and Social Sciences (SHASS) has announced that former Pennsylvania Governor Tom Wolf PhD ’81 has been recognized with the 2023 <a href="https://shass.mit.edu/about/honors/robert-muh-alumni-award">Robert A. Muh Alumni Award</a>.</p>

    <p>The biennial Muh Alumni Award recognizes the tremendous achievements of MIT degree holders who are leaders in one of the Institute’s humanities, arts, or social science fields. The prize was founded in 2000 by Robert Muh ’59 and his wife Berit, on the occasion of the school’s 50th anniversary. This year’s award recognizes Wolf’s distinguished political career.&nbsp;</p>

    <p>Wolf is a 1981 graduate of MIT’s PhD program in the Department of Political Science. He also earned a graduate degree from the University of London and an undergraduate degree from Dartmouth College.</p>

    <p>“Governor Wolf has dedicated his life to public service and improving the lives of Pennsylvanians during his time in office,” says Agustin Rayo, the Kenan Sahin Dean of SHASS. “The Muh Award is a tribute to one of our most distinguished alumni.”</p>

    <p>Wolf will deliver a Muh Award Lecture on Tuesday, March 21, titled “Collective Action: The Essence of Politics.” The event begins at 5 p.m. in the MIT Samberg Conference Center, 6th floor of the Chang Building. The event is free and open to the public.&nbsp;</p>

    <p>After serving as Pennsylvania’s secretary of revenue from 2007 to 2008, Wolf was elected as the state’s 47th governor. He served two terms, holding office from January 2015 until January 2023. During his time as governor, Wolf prioritized investing more in education, expanding access to health care, reforming the criminal justice system, and protecting the environment. Wolf led the state through a period of tumultuous political division, as well as the global Covid-19 pandemic.</p>

<p>Wolf returned to the MIT campus last fall, when he <a href="https://news.mit.edu/2022/tom-wolf-manufacturing-jobs-1017">delivered a talk</a> as part of the Manufacturing@MIT distinguished speaker series. Wolf discussed reviving American manufacturing and outlined the policy approach he used in his own state.&nbsp;</p>

    <p>Prior to becoming governor, Wolf was owner of the Wolf Organization, a distributor of lumber and other building products. Wolf bought the family business when it was on the brink of bankruptcy, overseeing the eventual growth and transformation of the company. Beyond politics, Wolf has served on and led the boards of numerous organizations. He also interrupted his undergraduate studies at Dartmouth to serve in the Peace Corps, spending two years in a small village in India.</p>

  • Report: CHIPS Act just the first step in addressing threats to US leadership in advanced computing

    <p>When Liu He, a Chinese economist, politician, and “chip czar,” was tapped to lead the charge in a chipmaking arms race with the United States, his message lingered in the air, leaving behind a dewy glaze of tension: “For our country, technology is not just for growth… it is a matter of survival.”</p>

    <p>Once upon a time, the United States’ early technological prowess positioned the nation to outpace foreign rivals and cultivate a competitive advantage for domestic businesses. Yet, 30 years later, America’s lead in advanced computing is continuing to wane. What happened?</p>

<p>A <a href="https://gppreview.com/2023/02/28/americas-lead-in-advanced-computing-is-almost-gonepart-1-systems-and-capabilities/" target="_blank">new report</a> from an MIT researcher and two colleagues sheds light on the decline in U.S. leadership. The scientists looked at high-level measures to examine the shrinkage: overall capabilities, supercomputers, applied algorithms, and semiconductor manufacturing. Through their analysis, they found that not only has China closed the computing gap with the U.S., but nearly 80 percent of American leaders in the field believe that their Chinese competitors are improving capabilities faster — which, the team says, suggests a “broad threat to U.S. competitiveness.”</p>

<p>To delve deeply into the fray, the scientists conducted the Advanced Computing Users Survey, sampling 120 top-tier organizations, including universities, national labs, federal agencies, and industry. The team estimates that this group comprises between one-third and one-half of all the most significant computing users in the United States.</p>

    <p>“Advanced computing is crucial to scientific improvement, economic growth and the competitiveness of U.S. companies,” says Neil Thompson, director of the FutureTech Research Project at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), who helped lead the study.</p>

    <p>Thompson, who is also a principal investigator at MIT’s Initiative on the Digital Economy, wrote the paper with Chad Evans, executive vice president and secretary and treasurer to the board at the Council on Competitiveness, and Daniel Armbrust, who is the co-founder, initial CEO, and member of the board of directors at Silicon Catalyst and former president of SEMATECH, the semiconductor consortium that developed industry roadmaps.</p>

    <p><strong>The semiconductor, supercomputer, and algorithm bonanza</strong></p>

<p>Supercomputers — the room-sized, “giant calculators” of the hardware world — are an industry no longer dominated by the United States. Through 2015, about half of the most powerful computers were sitting firmly in the U.S., and China was growing slowly from a very low base. But in the past six years, China has swiftly caught up, reaching near parity with America.</p>

    <p>This disappearing lead matters. Eighty-four percent of U.S. survey respondents said they’re computationally constrained in running essential programs. “This result was telling, given who our respondents are: the vanguard of American research enterprises and academic institutions with privileged access to advanced national supercomputing resources,” says Thompson.&nbsp;</p>

<p>With regard to advanced algorithms, the U.S. has historically fronted the charge, with two-thirds of all significant improvements dominated by U.S.-born inventors. But in recent decades, U.S. dominance in algorithms has relied on bringing in foreign talent to work in the U.S., which the researchers say is now in jeopardy. China has outpaced the U.S. and many other countries in churning out PhDs in STEM fields since 2007, with one report projecting that by 2025 China will be home to nearly twice as many STEM PhDs as the U.S. China’s rise in algorithms can also be seen with the “Gordon Bell Prize,” an achievement for outstanding work in harnessing the power of supercomputers in varied applications. U.S. winners historically dominated the prize, but China has equaled or surpassed Americans’ performance in the past five years.</p>

    <p>While the researchers note the CHIPS and Science Act of 2022 is a critical step in re-establishing the foundation of success for advanced computing, they propose recommendations to the U.S. Office of Science and Technology Policy.&nbsp;</p>

<p>First, they suggest democratizing access to U.S. supercomputing by building more mid-tier systems that push boundaries for many users, as well as building tools so users scaling up computations can have less up-front resource investment. They also recommend increasing the pool of innovators by funding the training of many more electrical engineers and computer scientists, paired with longer-term U.S. residency incentives and scholarships. Finally, in addition to this new framework, the scientists urge taking advantage of what already exists, by providing the private sector access to experimentation with high-performance computing through supercomputing sites in academia and national labs.</p>

    <p><strong>All that and a bag of chips</strong></p>

<p>Computing improvements depend on continuous advances in transistor density and performance, but creating robust, new chips necessitates a harmonious blend of design and manufacturing.</p>

<p>Until recently, China was not known for designing noteworthy chips; in fact, over the past five decades, the U.S. designed most of them. But this changed in the past six years, when China created the HiSilicon Kirin 9000, propelling itself to the international frontier. This success was mainly obtained through partnerships with leading global chip designers that began in the 2000s. China now has 14 companies among the world’s top 50 fabless designers. A decade ago, there was only one.&nbsp;</p>

<p>Competitive semiconductor manufacturing has been more mixed: U.S.-led policies and internal execution issues have slowed China’s rise, but as of July 2022, the Semiconductor Manufacturing International Corporation (SMIC) has shown evidence of 7-nanometer logic, which was not expected until much later. However, with extreme ultraviolet export restrictions, progress below 7 nm would require expensive domestic technology development. Currently, China is only at parity or better in two out of 12 segments of the semiconductor supply chain. Still, with government policy and investments, the team expects a whopping increase to seven segments in 10 years. So, for the moment, the U.S. retains leadership in hardware manufacturing, but with fewer dimensions of advantage.</p>

    <p>The authors recommend that the White House Office of Science and Technology Policy work with key national agencies, such as the U.S. Department of Defense, U.S. Department of Energy, and the National Science Foundation, to define initiatives to build the hardware and software systems needed for important computing paradigms and workloads critical for economic and security goals. “It is crucial that American enterprises can get the benefit of faster computers,” says Thompson. “With Moore’s Law slowing down, the best way to do this is to create a portfolio of specialized chips (or “accelerators”) that are customized to our needs.”</p>

    <p>The scientists further believe that to lead the next generation of computing, four areas must be addressed. First, by issuing grand challenges to the CHIPS Act National Semiconductor Technology Center, researchers and startups would be motivated to invest in research and development and to seek startup capital for new technologies in areas such as spintronics, neuromorphics, optical and quantum computing, and optical interconnect fabrics. By supporting allies in passing similar acts, overall investment in these technologies would increase, and supply chains would become more aligned and secure. Establishing test beds for researchers to test algorithms on new computing architectures and hardware would provide an essential platform for innovation and discovery. Finally, planning for post-exascale systems that achieve higher levels of performance through next-generation advances would ensure that current commercial technologies don’t limit future computing systems.</p>

    <p>“The advanced computing landscape is in rapid flux — technologically, economically, and politically, with both new opportunities for innovation and rising global rivalries,” says Daniel Reed, Presidential Professor and professor of computer science and electrical and computer engineering at the University of Utah. “The transformational insights from both deep learning and computational modeling depend on both continued semiconductor advances and their instantiation in leading edge, large-scale computing systems — hyperscale clouds and high-performance computing systems. Although the U.S. has historically led the world in both advanced semiconductors and high-performance computing, other nations have recognized that these capabilities are integral to 21st century economic competitiveness and national security, and they are investing heavily.”</p>

<p>The research was funded, in part, through Thompson’s grant from Good Ventures, which supports his FutureTech Research Group. The <a href="https://gppreview.com/2023/02/28/americas-lead-in-advanced-computing-is-almost-gonepart-1-systems-and-capabilities/" target="_blank">paper</a> is being published by the <em>Georgetown Public Policy Review.</em></p>

  • Using combustion to make better batteries

    <p>For more than a century, much of the world has run on the combustion of fossil fuels. Now, to avert the threat of climate change, the energy system is changing. Notably, solar and wind systems are replacing fossil fuel combustion for generating electricity and heat, and batteries are replacing the internal combustion engine for powering vehicles. As the energy transition progresses, researchers worldwide are tackling the many challenges that arise.</p>

    <p>Sili Deng has spent her career thinking about combustion. Now an assistant professor in the MIT Department of Mechanical Engineering and the Class of 1954 Career Development Professor, Deng leads a group that, among other things, develops theoretical models to help understand and control combustion systems to make them more efficient and to control the formation of emissions, including particles of soot.</p>

    <p>“So we thought, given our background in combustion, what’s the best way we can contribute to the energy transition?” says Deng. In considering the possibilities, she notes that combustion refers only to the process — not to what’s burning. “While we generally think of fossil fuels when we think of combustion, the term ‘combustion’ encompasses many high-temperature chemical reactions that involve oxygen and typically emit light and large amounts of heat,” she says.</p>

    <p>Given that definition, she saw another role for the expertise she and her team have developed: They could explore the use of combustion to make materials for the energy transition. Under carefully controlled conditions, combusting flames can be used to produce not polluting soot, but rather valuable materials, including some that are critical in the manufacture of lithium-ion batteries.</p>

    <p><strong>Improving the lithium-ion battery by lowering costs</strong></p>

    <p>The demand for lithium-ion batteries is projected to skyrocket in the coming decades. Batteries will be needed to power the growing fleet of electric cars and to store the electricity produced by solar and wind systems so it can be delivered later when those sources aren’t generating. Some experts project that the global demand for lithium-ion batteries may increase tenfold or more in the next decade.</p>

    <p>Given such projections, many researchers are looking for ways to improve the lithium-ion battery technology. Deng and her group aren’t materials scientists, so they don’t focus on making new and better battery chemistries. Instead, their goal is to find a way to lower the high cost of making all of those batteries. And much of the cost of making a lithium-ion battery can be traced to the manufacture of materials used to make one of its two electrodes — the cathode.</p>

    <p>The MIT researchers began their search for cost savings by considering the methods now used to produce cathode materials. The raw materials are typically salts of several metals, including lithium, which provides ions — the electrically charged particles that move when the battery is charged and discharged. The processing technology aims to produce tiny particles, each one made up of a mixture of those ingredients, with the atoms arranged in the specific crystalline structure that will deliver the best performance in the finished battery.</p>

    <p>For the past several decades, companies have manufactured those cathode materials using a two-stage process called coprecipitation. In the first stage, the metal salts — excluding the lithium — are dissolved in water and thoroughly mixed inside a chemical reactor. Chemicals are added to change the acidity (the pH) of the mixture, and particles made up of the combined salts precipitate out of the solution. The particles are then removed, dried, ground up, and put through a sieve.</p>

    <p>A change in pH won’t cause lithium to precipitate, so it is added in the second stage. Solid lithium is ground together with the particles from the first stage until lithium atoms permeate the particles. The resulting material is then heated, or “annealed,” to ensure complete mixing and to achieve the targeted crystalline structure. Finally, the particles go through a “deagglomerator” that separates any particles that have joined together, and the cathode material emerges.</p>

    <p>Coprecipitation produces the needed materials, but the process is time-consuming. The first stage takes about 10 hours, and the second stage requires about 13 hours of annealing at a relatively low temperature (750 degrees Celsius). In addition, to prevent cracking during annealing, the temperature is gradually “ramped” up and down, which takes another 11 hours. The process is thus not only time-consuming but also energy-intensive and costly.</p>
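<p>Summing the stage durations quoted above gives a rough sense of the coprecipitation time budget (illustrative arithmetic only, not the researchers’ own accounting):</p>

```python
# Rough time budget for the two-stage coprecipitation route, using the
# stage durations quoted in the article (illustrative arithmetic only).
stage1_hours = 10   # precipitation, drying, grinding, sieving
anneal_hours = 13   # annealing at 750 degrees Celsius
ramp_hours = 11     # gradual temperature ramp up and down

total_hours = stage1_hours + anneal_hours + ramp_hours
print(f"Total coprecipitation time: ~{total_hours} hours")  # ~34 hours
```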

    <p>For the past two years, Deng and her group have been exploring better ways to make the cathode material. “Combustion is very effective at oxidizing things, and the materials for lithium-ion batteries are generally mixtures of metal oxides,” says Deng. That being the case, they thought this could be an opportunity to use a combustion-based process called flame synthesis.</p>

    <p><strong>A new way of making a high-performance cathode material</strong></p>

    <p>The first task for Deng and her team — mechanical engineering postdoc Jianan Zhang, Valerie L. Muldoon ’20, SM ’22, and current graduate students Maanasa Bhat and Chuwei Zhang — was to choose a target material for their study. They decided to focus on a mixture of metal oxides consisting of nickel, cobalt, and manganese plus lithium. Known as “NCM811,” this material is widely used and has been shown to produce cathodes for batteries that deliver high performance; in an electric vehicle, that means a long driving range, rapid discharge and recharge, and a long lifetime. To better define their target, the researchers examined the literature to determine the composition and crystalline structure of NCM811 that has been shown to deliver the best performance as a cathode material.</p>

    <p>They then considered three possible approaches to improving on the coprecipitation process for synthesizing NCM811: They could simplify the system (to cut capital costs), speed up the process, or cut the energy required.</p>

    <p>“Our first thought was, what if we can mix together all of the substances — including the lithium — at the beginning?” says Deng. “Then we would not need to have the two stages” — a clear simplification over coprecipitation.</p>

    <p><strong>Introducing FASP</strong></p>

<p>One process widely used in the chemical and other industries to fabricate nanoparticles is a type of flame synthesis called flame-assisted spray pyrolysis, or FASP. Deng’s <a href="https://www.sciencedirect.com/science/article/abs/pii/S0378775322002622?via%3Dihub">concept</a> for using FASP to make their targeted cathode powders proceeds as follows.</p>

<p>The precursor materials — the metal salts (including the lithium) — are mixed with water, and the resulting solution is sprayed as fine droplets by an atomizer into a combustion chamber. There, a flame of burning methane heats up the mixture. The water evaporates, leaving the precursor materials to decompose, oxidize, and solidify to form the powder product. A cyclone separates particles of different sizes, and a baghouse filters out those that aren’t useful. The collected particles would then be annealed and deagglomerated.</p>

    <p>To investigate and optimize this concept, the researchers developed a lab-scale FASP setup consisting of a homemade ultrasonic nebulizer, a preheating section, a burner, a filter, and a vacuum pump that withdraws the powders that form. Using that system, they could control the details of the heating process: The preheating section replicates conditions as the material first enters the combustion chamber, and the burner replicates conditions as it passes the flame. That setup allowed the team to explore operating conditions that would give the best results.</p>

<p>Their experiments showed marked benefits over coprecipitation. The nebulizer breaks up the liquid solution into fine droplets, ensuring atomic-level mixing. The water simply evaporates, so there’s no need to change the pH or to separate the solids from a liquid. As Deng notes, “You just let the gas go, and you’re left with the particles, which is what you want.” With lithium included at the outset, there’s no need for mixing solids with solids, which is neither efficient nor effective.</p>

<p>They could even control the structure, or “morphology,” of the particles that formed. In one series of experiments, they tried exposing the incoming spray to different rates of temperature change over time. They found that the temperature “history” has a direct impact on morphology. With no preheating, the particles burst apart, and with rapid preheating, the particles were hollow. The best outcomes came when they used preheating temperatures ranging from 175 to 225 degrees Celsius. Experiments with coin-cell batteries (laboratory devices used for testing battery materials) confirmed that by adjusting the preheating temperature, they could achieve a particle morphology that would optimize the performance of their materials.</p>

<p>Best of all, the particles formed in seconds. Even including the time needed for conventional annealing and deagglomeration, the new setup could synthesize the finished cathode material in half the total time needed for coprecipitation. Moreover, the first stage of the coprecipitation system is replaced by a far simpler setup — a savings in capital costs.</p>

    <p>“We were very happy,” says Deng. “But then we thought, if we’ve changed the precursor side so the lithium is mixed well with the salts, do we need to have the same process for the second stage? Maybe not!”</p>

    <p><strong>Improving the second stage</strong></p>

    <p>The key time- and energy-consuming step in the second stage is the annealing. In today’s coprecipitation process, the strategy is to anneal at a low temperature for a long time, giving the operator time to manipulate and control the process. But running a furnace for some 20 hours — even at a low temperature — consumes a lot of energy.</p>

    <p>Based on their studies thus far, Deng thought, “What if we slightly increase the temperature but reduce the annealing time by orders of magnitude? Then we could cut energy consumption, and we might still achieve the desired crystal structure.”</p>

    <p>However, experiments at slightly elevated temperatures and short treatment times didn’t bring the results they had hoped for. In transmission electron microscope (TEM) images, the particles that formed had clouds of light-looking nanoscale particles attached to their surfaces. When the researchers performed the same experiments without adding the lithium, those nanoparticles didn’t appear. Based on that and other tests, they concluded that the nanoparticles were pure lithium. So, it seemed like long-duration annealing would be needed to ensure that the lithium made its way inside the particles.</p>

    <p>But they then came up with a different solution to the lithium-distribution problem. They added a small amount — just 1 percent by weight — of an inexpensive compound called urea to their mixture. In TEM images of the particles formed, the “undesirable nanoparticles were largely gone,” says Deng.</p>

    <p>Experiments in the laboratory coin cells showed that the addition of urea significantly altered the response to changes in the annealing temperature. When the urea was absent, raising the annealing temperature led to a dramatic decline in performance of the cathode material that formed. But with the urea present, the performance of the material that formed was unaffected by any temperature change.</p>

    <p>That result meant that — as long as the urea was added with the other precursors — they could push up the temperature, shrink the annealing time, and omit the gradual ramp-up and cool-down process. Further imaging studies confirmed that their approach yields the desired crystal structure and the homogeneous elemental distribution of the cobalt, nickel, manganese, and lithium within the particles. Moreover, in tests of various performance measures, their materials did as well as materials produced by coprecipitation or by other methods using long-time heat treatment. Indeed, the performance was comparable to that of commercial batteries with cathodes made of NCM811.</p>

    <p>So now the long and expensive second stage required in standard coprecipitation could be replaced by just 20 minutes of annealing at about 870 C plus 20 minutes of cooling down at room temperature.</p>
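<p>For the second stage alone, the figures quoted above imply the following comparison (an illustrative sketch; the accounting is ours, not the researchers’):</p>

```python
# Second-stage heat treatment: standard coprecipitation vs. the
# urea-assisted route, using the durations quoted in the article.
copre_anneal_min = 13 * 60   # ~13 h annealing at 750 degrees Celsius
copre_ramp_min = 11 * 60     # ~11 h gradual ramp up and down
fasp_anneal_min = 20         # 20 min annealing at about 870 degrees Celsius
fasp_cool_min = 20           # 20 min cool-down at room temperature

copre_total_min = copre_anneal_min + copre_ramp_min   # 1440 min = 24 h
fasp_total_min = fasp_anneal_min + fasp_cool_min      # 40 min
speedup = copre_total_min / fasp_total_min
print(f"Second stage: {copre_total_min / 60:.0f} h -> {fasp_total_min} min "
      f"(~{speedup:.0f}x faster)")  # Second stage: 24 h -> 40 min (~36x faster)
```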

    <p><strong>Theory, continuing work, and planning for scale-up</strong></p>

    <p>While experimental evidence supports their approach, Deng and her group are now working to understand why it works. “Getting the underlying physics right will help us design the process to control the morphology and to scale up the process,” says Deng. And they have a hypothesis for why the lithium nanoparticles in their flame synthesis process end up on the surfaces of the larger particles — and why the presence of urea solves that problem.</p>

    <p>According to their theory, without the added urea, the metal and lithium atoms are initially well-mixed within the droplet. But as heating progresses, the lithium diffuses to the surface and ends up as nanoparticles attached to the solidified particle. As a result, a long annealing process is needed to move the lithium in among the other atoms.</p>

    <p>When the urea is present, it starts out mixed with the lithium and other atoms inside the droplet. As temperatures rise, the urea decomposes, forming bubbles. As heating progresses, the bubbles burst, increasing circulation, which keeps the lithium from diffusing to the surface. The lithium ends up uniformly distributed, so the final heat treatment can be very short.</p>

    <p>The researchers are now designing a system to suspend a droplet of their mixture so they can observe the circulation inside it, with and without the urea present. They’re also developing experiments to examine how droplets vaporize, employing tools and methods they have used in the past to study how hydrocarbons vaporize inside internal combustion engines.</p>

    <p>They also have ideas about how to streamline and scale up their process. In coprecipitation, the first stage takes 10 to 20 hours, so one batch at a time moves on to the second stage to be annealed. In contrast, the novel FASP process generates particles in 20 minutes or less — a rate that’s consistent with continuous processing. In their design for an “integrated synthesis system,” the particles coming out of the baghouse are deposited on a belt that carries them for 10 or 20 minutes through a furnace. A deagglomerator then breaks any attached particles apart, and the cathode powder emerges, ready to be fabricated into a high-performance cathode for a lithium-ion battery. The cathode powders for high-performance lithium-ion batteries would thus be manufactured at unprecedented speed, low cost, and low energy use.</p>

    <p>Deng notes that every component in their integrated system is already used in industry, generally at a large scale and high flow-through rate. “That’s why we see great potential for our technology to be commercialized and scaled up,” she says. “Where our expertise comes into play is in designing the combustion chamber to control the temperature and heating rate so as to produce particles with the desired morphology.” And while a detailed economic analysis has yet to be performed, it seems clear that their technique will be faster, the equipment simpler, and the energy use lower than other methods of manufacturing cathode materials for lithium-ion batteries — potentially a major contribution to the ongoing energy transition.</p>

    <p>This research was supported by the MIT Department of Mechanical Engineering.</p>

<p><em>This article appears in the&nbsp;</em><a href="https://energy.mit.edu/energy-futures/winter-2023/"><em>Winter 2023</em></a><em>&nbsp;issue of&nbsp;</em>Energy Futures<em>, the magazine of the MIT Energy Initiative.</em></p>

  • Preparing students for the new nuclear

    <p>As nuclear power has gained greater recognition as a zero-emission energy source, the MIT Leaders for Global Operations (LGO) program has taken notice.</p>

    <p>Two years ago, LGO began a collaboration with MIT’s Department of Nuclear Science and Engineering (NSE) as a way to showcase the vital contribution of both business savvy and scientific rigor that LGO’s dual-degree graduates can offer this growing field.</p>

    <p>“We saw that the future of fission and fusion required business acumen and management acumen,” says Professor Anne White, NSE department head. “People who are going to be leaders in our discipline, and leaders in the nuclear enterprise, are going to need all of the technical pieces of the puzzle that our engineering department can provide in terms of education and training. But they’re also going to need a much broader perspective on how the technology connects with society through the lens of business.”</p>

<p>The response has been positive: “Companies are seeing the value of nuclear technology for their operations,” White says, and this often happens in unexpected ways.</p>

<p>For example, graduate student Santiago Andrade recently completed a research project at Caterpillar Inc., a preeminent manufacturer of mining and construction equipment. Caterpillar is one of more than 20 major companies that partner with the LGO program, offering six-month internships to each student. On the surface, it seemed like an improbable pairing: what could Andrade, who was pursuing his master’s in nuclear science and engineering, do for a manufacturing company? However, Caterpillar wanted to understand the technical and commercial feasibility of using nuclear energy to power mining sites and data centers when wind and solar weren’t viable.</p>

    <p>“They are leaving no stone unturned in the search of financially smart solutions that can support the transition to a clean energy dependency,” Andrade says. “My project, along with many others’, is part of this effort.”</p>

    <p>“The research done through the LGO program with Santiago is enabling Caterpillar to understand how alternative technologies, like the nuclear microreactor, could participate in these markets in the future,” says Brian George, product manager for large electric power solutions at Caterpillar. “Our ability to connect our customers with the research will provide for a more accurate understanding of the potential opportunity, and helps provide exposure for our customers to emerging technologies.”</p>

    <p>With looming threats of climate change, White says, “We’re going to require more opportunities for nuclear technologies to step in and be part of those solutions. A cohort of LGO graduates will come through this program with technical expertise — a master’s degree in nuclear engineering — and an MBA. There’s going to be a tremendous talent pool out there to help companies and governments.”</p>

    <p>Andrade, who completed an undergraduate degree in chemical engineering and had a strong background in thermodynamics, applied to LGO unsure of which track to choose, but he knew he wanted to confront the world’s energy challenge. When MIT Admissions suggested that he join LGO’s new nuclear track, he was intrigued by how it could further his career.</p>

    <p>“Since the NSE department offers opportunities ranging from energy to health care and from quantum engineering to regulatory policy, the possibilities of career tracks after graduation are countless,” he says.</p>

    <p>He was also inspired by the fact that, as he says, “Nuclear is one of the less-popular solutions in terms of our energy transition journey. One of the things that attracted me is that it’s not one of the most popular, but it’s one of the most useful.”</p>

    <p>In addition to his work at Caterpillar, Andrade connected deeply with professors. He worked closely with professors Jacopo Buongiorno and John Parsons as a research assistant, helping them develop a business model to support the deployment of nuclear microreactors. After graduation, he plans to work in the clean energy sector with an eye to innovations in the nuclear energy technology space.</p>

    <p>His LGO classmate, Lindsey Kennington, a control systems engineer, echoes his sentiments: This is a revolutionary time for nuclear technology.</p>

    <p>“Before MIT, I worked on a lot of nuclear waste or nuclear weapons-related projects. All of them were fission-related. I got disillusioned because of all the bureaucracy and the regulation,” Kennington says. “However, now there are a lot of new nuclear technologies coming straight out of MIT. Commonwealth Fusion Systems, a fusion startup, represents a prime example of MIT’s close relationship to new nuclear tech. Small modular reactors are another emerging technology being developed by MIT. Exposure to these cutting-edge technologies was the main sell factor for me.”</p>

    <p>Kennington conducted an internship with National Grid, where she used her expertise to evaluate how existing nuclear power plants could generate hydrogen. At MIT, she studied nuclear and energy policy, which offered her additional perspective that traditional engineering classes might not have provided. Because nuclear power has long been a hot-button issue, Kennington was able to gain nuanced insight about the pathways and roadblocks to its implementation.</p>

    <p>“I don’t think that other engineering departments emphasize that focus on policy quite as much. [Those classes] have been one of the most enriching parts of being in the nuclear department,” she says.</p>

    <p>Most of all, she says, it’s a pivotal time to be part of a new, blossoming program at the forefront of clean energy, especially as fusion research grows more prevalent.</p>

    <p>“We’re at an inflection point,” she says. “Whether or not we figure out fusion in the next five, 10, or 20 years, people are going to be working on it — and it’s a really exciting time to not only work on the science but to actually help the funding and business side grow.”</p>

    <p>White puts it simply.</p>

    <p>“This is not your parents’ nuclear,” she says. “It’s something totally different. Our discipline is evolving so rapidly that people who have technical expertise in nuclear will have a huge advantage in this next generation.”</p>

  • Wiring the organization for exceptional performance

    <p>Steven Spear SM ’93 analyzes the framework of relationships and interactions by which an organization runs, to better harness the intellectual horsepower distributed throughout the enterprise.</p>

    <p>In almost any industry, a few companies dramatically outperform their peers and near-peers, generating and delivering far more value to society by getting far more yield out of the resources and opportunities available to them, says Spear. These companies do so because they far better manage the organization’s “social circuitry” — the overlay of processes, procedures, and routines by which the work of individual specialists is integrated into collective action toward common purpose.</p>

    <p>Getting social circuitry right means far less distraction for people trying to figure out where they fit in and how their efforts must coordinate with others, leaving much more of their intellectual horsepower to focus on hard technical and social problems, Spear says.</p>

    <p>”The very best enterprises give a lot of attention and have uncommon skill in creating the conditions in which all brains can be engaged consistently on the hard technical and social problems in front of them, without the huge overburdens of communication, coordination, and the like that too often suck up those cognitive resources. As a result, the very best predictably outperform those who did not manage in this way,” says Spear, a senior lecturer at the MIT Sloan School of Management and principal of the See to Solve software and management advisory consulting firm in Newton, Massachusetts.</p>
    <p>In one recent example, Spear worked with scientists at a pharmaceutical company to harmonize efforts in an early-stage phase of drug development called hit-to-lead: taking molecules that have been identified by high-throughput screening as potentially effective treatments and figuring out how to turn them into bona fide candidate compounds warranting further development.</p>

    <p>“By mapping better who should be in conversation with whom, about what, when, in what fashion, we were able to get that flow of ideas from gestation through maturation to delivery, a process that typically took over a year, and get it accomplished under six months,” Spear says. “And the candidates passed on as leads for development were better.”</p>

    <p>A similar earlier effort, described in Spear’s book “The High Velocity Edge,” helped Pratt &amp; Whitney cut its jet engine development time by a quarter while simultaneously delivering unmatched performance. After losing a series of contests for both commercial and military programs, the company won the contract for the F-35, a fifth-generation fighter jet for the U.S. Air Force, Navy, and Marine Corps.</p>

    <p>This emphasis on getting right the social circuitry of an organization is so critical because all work is knowledge work, even in the most industrial and technology-intensive situations, he emphasizes. “The job of leaders is in part to create conditions in which everyone can give fullest expression to their knowledge and skills. This whole engagement of everyone’s brain — individually and in collaboration — is so critically important because in most of the problems we encounter, across any societal or economic vertical and across any phase of value creation, we’re engaged in a process of discovery,” Spear stresses. “We have to figure out what problem we really need to solve, why it’s actually occurring, what corrective actions will address it effectively, and how to bring those ideas into action.”</p>

    <p><strong>Cutting the cost of coordination</strong></p>

    <p>Organizations that fail to continuously develop, deploy, and fix this social overlay face grave consequences, says Spear. People end up stumbling on tasks when they don’t know how to reach out quickly and effectively to the counterparts who can respond with needed information, insights, or decisions. Instead, getting things done too often requires those doing front-line work to reach up five, six, or seven layers of administration, hoping that those at the top will cut across to some other silo and then reach down several layers to effect change.</p>

    <p>This was exactly the problem engineers faced at Pratt &amp; Whitney, before the changes effected during its pilot — having to escalate any problem to the most senior engineers before getting resolution, Spear says. This issue impacts everyone, be they mechanics in large-scale industrial operations, clinicians in medical practices, or software engineers in the most “cutting-edge” tech firms.</p>

    <p>Economists have a term for this problem: the cost of coordination. Practitioners in such situations have a less esoteric term, says Spear: It’s an [expletive] headache. The cost can be huge in terms of aggravation, time, and lost creativity, and the cost grows steadily with the ultra-fast evolution of technologies and markets and the increased complexities of the systems we have to design, deploy, and operate, he says.</p>
    <p>His fascination with the ways in which managers and leaders can create the conditions in which individual employees can give much fuller expression to their intellectual potential goes back to Spear’s arrival at MIT, where he graduated in 1993 with master’s degrees in management and in mechanical engineering.</p>

    <p>“American industry was getting clobbered by Japanese companies,” he remembers, “and there was this Institute-wide commitment to figure out why and do something about it.” MIT launched several efforts through the Industrial Liaison Program, the Japan Program, Leaders for Global Operations, and other groups, all aimed at better understanding the challenging work of managing people responsible for inventing, producing, and utilizing complex technical systems.</p>

    <p>“What kept coming up was that the competitive advantage wasn’t gained in a first-order fashion from the engineered objects on which people worked — those were ultimately outputs people generated,” Spear says. “Nor was long-term advantage contained in the technical equipment through which people worked — that too was the output of a collaborative design and development experience. Instead, advantage was the result of the management systems by which those tools and those objects were invented, with innovative ideas created about their best possible use. The interaction of social systems overlaid on and generating technical systems was (and is) the key.”</p>

    <p><strong>Clearing the way for discovery</strong></p>

    <p>Spear’s client engagements follow the MIT ethos for discovery and exploration. “We take quick, small, non-disruptive steps first, to get feedback that’s fast, that’s frequent, that has high fidelity to the operating environment, that has high accuracy and that’s easy to interpret,” he says. “When we get confidence from iterating at one scale, we can scale up to the next. In effect, we try to bring the discipline and rigor of laboratory sciences into the field of social systems.”</p>

    <p>Starting with the pilot, Spear works with his clients to dive deep into the nitty-gritty details of the individual component functions that must be performed in order to meet the given goal. “Then we start figuring out the dependencies amongst these different tasks, the natural sequencing of work, out of which the enterprise’s ‘architecture’ emerges, defined not only by role and responsibilities, but also by relationships.”</p>

    <p>That part is critical. Most organizations, he says, give too little attention to relationships and the practical questions of who depends on whom to get work done, and how those interdependencies should be managed in terms of work flow, communication, and coordination.</p>

    <p>“Once we start getting into these questions of roles, responsibilities and especially relationships, then we can start to see what the circuitry looks like,” Spear says. “And then, once it emerges out of this recursive identification of interdependencies, we try to formalize the circuitry, so people are in the right collaborative, creative conversations with the right other people. And if the circuitry isn’t working, and coordination and collaboration become viscous again, we rewire it.”</p>

    <p>Spear has proven this approach works in many high-payoff engagements with manufacturers, military organizations, and dozens of other large enterprises. “The time and energy required to coordinate goes way down,” he says. “How your work and my work fit together becomes natural, obvious, and intuitive. And we can use our creative energy to focus on the problems for which we assembled in the first place to solve.”</p>

    <p>After 30-some years, Spear’s enthusiasm about creating outstanding management systems is greater than ever. “Whatever people can contribute towards common purpose, all of those individual contributions can harmonize and integrate in such beautiful fashion to generate so much more value that can be delivered into society than otherwise would be the case,” he remarks. “Managers have this huge opportunity to provide their colleagues with much richer, more rewarding experiences each and every day and, in doing so, contribute to society’s well-being. Who wouldn’t want to get out of bed in the morning with that as a possibility?”</p>
