Grading the Digital School

Mooresville’s Shining Example (It’s not just about the laptops)

NY Times

Published: February 12, 2012

A version of this article appeared in print on February 13, 2012, on page A10 of the New York edition.


MOORESVILLE, N.C. — Sixty educators from across the nation roamed the halls and ringed the rooms of East Mooresville Intermediate School, searching for the secret formula. They found it in Erin Holsinger’s fifth-grade math class.

There, a boy peering into his school-issued MacBook blitzed through fractions by himself, determined to reach sixth-grade work by winter. Three desks away, a girl was struggling with basic multiplication — only 29 percent right, her screen said — and Ms. Holsinger knelt beside her to assist. Curiosity was fed and embarrassment avoided, as teacher connected with student through emotion far more than Wi-Fi.

“This is not about the technology,” Mark Edwards, superintendent of Mooresville Graded School District, would tell the visitors later over lunch. “It’s not about the box. It’s about changing the culture of instruction — preparing students for their future, not our past.”

As debate continues over whether schools invest wisely in technology — and whether it measurably improves student achievement — Mooresville, a modest community about 20 miles north of Charlotte best known as home to several Nascar teams and drivers, has quietly emerged as the de facto national model of the digital school.

Mr. Edwards spoke on a White House panel in September, and federal Department of Education officials often cite Mooresville as a symbolic success. Overwhelmed by requests to view the programs in action, the district now herds visitors into groups of 60 for monthly demonstrations; the waiting list stretches to April. What they are looking for is an explanation for the steady gains Mooresville has made since issuing laptops three years ago to the 4,400 4th through 12th graders in five schools (three K-3 schools are not part of the program).

The district’s graduation rate was 91 percent in 2011, up from 80 percent in 2008. On state tests in reading, math and science, an average of 88 percent of students across grades and subjects met proficiency standards, compared with 73 percent three years ago. Attendance is up, dropouts are down. Mooresville ranks 100th out of 115 districts in North Carolina in terms of dollars spent per student — $7,415.89 a year — but it is now third in test scores and second in graduation rates.

“Other districts are doing things, but what we see in Mooresville is the whole package: using the budget, innovating, using data, involvement with the community and leadership,” said Karen Cator, a former Apple executive who is director of educational technology for the United States Department of Education. “There are lessons to be learned.”

Start with math lessons: each student’s MacBook Air is leased from Apple for $215 a year, including warranty, for a total of $1 million; an additional $100,000 a year goes for software. Terry Haas, the district’s chief financial officer, said the money was freed up through “incredibly tough decisions.”
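A quick back-of-the-envelope check of those figures (a simple per-student multiplication using the enrollment number given earlier in the article; the actual lease contract terms beyond the stated $215 are not spelled out here):

```python
students = 4400              # 4th through 12th graders issued laptops
lease_per_year = 215         # dollars per MacBook Air per year, warranty included
software_per_year = 100_000  # additional annual software cost

laptop_total = students * lease_per_year
print(laptop_total)                      # 946000 -- roughly the $1 million cited
print(laptop_total + software_per_year)  # 1046000 total annual technology outlay
```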

Sixty-five jobs were eliminated, including 37 teachers, which resulted in larger class sizes — in middle schools, it is 30 instead of 18 — but district officials say they can be more efficiently managed because of the technology. Some costly items had become obsolete (like computer labs), though getting rid of others tested the willingness of teachers to embrace the new day: who needs globes in the age of Google Earth?

Families pay $50 a year to subsidize computer repairs, though the fee is waived for those who cannot afford it, about 18 percent of them. Similarly, the district has negotiated a deal so that those without broadband Internet access can buy it for $9.99 a month. Mr. Edwards said the technology had helped close racial performance gaps in a district where 27 percent of the students are minorities and 40 percent are poor enough to receive free or reduced-price lunches.

Others see broader economic benefits.

“Even in the downturn, we’re a seller’s market — people want to buy homes here,” said Kent Temple, a real estate agent in town. “Families say, ‘This is a chance for my child to compete with families that have more money than me.’ Six years from now, you’ll see how many from disadvantaged backgrounds go to college and make it.”

Mooresville’s laptops perform the same tasks as those in hundreds of other districts: they correct worksheets, assemble progress data for teachers, allow for compelling multimedia lessons, and let students work at their own pace or in groups, rather than all listening to one teacher. The difference, teachers and administrators here said, is that they value computers not for the newest content they can deliver, but for how they tap into the oldest of student emotions — curiosity, boredom, embarrassment, angst — and help educators deliver what only people can. Technology, here, is cold used to warm.

Mooresville frequently tests students in various subjects to inform teachers where each needs help. Every quarter, department heads and principals present summary data to Mr. Edwards, who uses it to assess where teachers need improvement. Special emphasis goes to identifying students who are only a few correct answers away from passing state proficiency standards. They are then told how close they are and, Mr. Edwards said, “You can, you can, you can.”

Many classrooms have moved from lecture to lattice, where students collaborate in small groups with the teacher swooping in for consultation. Rather than tell her 11th-grade English students the definition of transcendentalism one recent day, Katheryn Higgins had them crowd-source their own — quite Thoreauly, it turned out — using Google Docs. Back in September, Ms. Higgins had the more outgoing students make presentations on the Declaration of Independence, while shy ones discussed it in an online chat room, which she monitored.

“I’m not a very social person, but I have no problem typing on a keyboard,” said one of those shy ones, Chase Wilson. “It connected me with other students — opened me up and helped me with talking in public.”

In math, students used individualized software modules, with teachers stopping by occasionally to answer questions. (“It’s like having a personal tutor,” said Ethan Jones, the fifth grader zooming toward sixth-grade material.) Teachers apportion their time based on the need of students, without the weaker ones having to struggle at the blackboard in front of the class; this dynamic has helped children with learning disabilities to participate and succeed in mainstream classes.

“There are students who might not have graduated five years ago who have graduated,” said Melody Morrison, a case manager for Mooresville High School’s special education programs. “They’re not just our kids anymore. They’re everybody’s kids — all teachers throughout the school. The digital conversion has evened the playing field.”

Many students adapted to the overhaul more easily than their teachers, some of whom resented having beloved tools — scripted lectures, printed textbooks and a predictable flow through the curriculum — vanish. The layoffs in 2009 and 2010, of about 10 percent of the district’s teachers, helped weed out the most reluctant, Mr. Edwards said; others he was able to convince that the technology would actually allow for more personal and enjoyable interaction with students.

“You have to trust kids more than you’ve ever trusted them,” he said. “Your teachers have to be willing to give up control.”

That was the primary concern that the 60 visitors expressed during their daylong sojourn to Mooresville in November. “I’m not sure our kids can be trusted the way these are,” one teacher from the Midwest said, speaking on the condition of anonymity to avoid trouble back home.

Thomas Bertrand, superintendent of schools in Rochester, Ill., said he was struck by the “culture of collaboration among staff and kids” in Mooresville and would emphasize that as his district considered its own conversion.

“There’s a tendency in teaching to try to control things, like a parent,” said Scott Allen, a high school chemistry teacher in South Granville, N.C. “But I learn best at my own pace, and you have to realize that students learn best at their own pace, too.”

Mooresville still has some growing pains. In one ninth-grade social studies class, a video that easily could have been shown on a large screen instead went through the students’ laptops, several of which balked, “Unable to find proxy server.” One fourth grader, having to complete 10 multiplication questions in two minutes for the software to let her move on, simply consulted her times tables, making the lesson more about speed typing than mathematics. And those concerned about corporate encroachment on public schools would blanch at the number of Apple logos in the hallways, and at the district’s unofficial slogan: “iBelieve, iCan, iWill.”

Mooresville’s tremendous focus on one data point — the percentage of students passing proficiency exams — has its pitfalls as well. At November’s quarterly data meeting, there were kudos for several numbers whose rise or dip was not statistically significant, and no recognition that the students who passed by one or two questions could very well fail by one or two the next time around. Several colorful pie charts used metrics that were meaningless.

“I realize the fallacy of looking at one measure,” Mr. Edwards said in an interview afterward. “We look at scholarships, A.P. courses taken, honors courses, SAT scores. But the measure that we use is what the state posts, and what parents look at when they’re comparing schools moving here.”

After three years of computers permeating every area of their schooling, Mooresville students barely remember life before the transformation and are somewhat puzzled by the gaggle of visitors who watch them every month. (“At times it’s kind of like being a lab rat,” one 11th grader said.) But Mooresville understands its growing fame in the world of education, much of which has yet to find the balance between old tricks and new technology.

“So,” Ms. Higgins asked her English class after the bell rang, “you think you’re going to like transcendentalism?”

“Only if you’re a nonconformist,” a student cracked.

Categories: Science/Technology

In China, Buick’s for the Chic

An interesting article from the NY Times last year that I just found while cleaning off my desk.  It shows the power of perspective; in other words, how perceptions of things change depending on where you were born, how you were raised, your life experiences, etc.  

Published: November 14, 2011

BEIJING — Cars in the United States tend to come fully equipped with stereotypes. Ford Crown Victoria: law enforcement professional. Toyota Prius: upscale yuppie environmentalist. Hummer: gas-guzzling egotist.

In China, where the market for imported passenger cars dates back only about three decades, an entirely alternate set of stereotypes is taking root — and the stakes have never been higher for foreign carmakers.

Take, for example, Mercedes-Benz, a brand that in much of the world suggests moneyed respectability. In China, many people think Mercedes-Benz is the domain of the retiree.

The Buick, long associated in the United States with drivers who have a soft spot for the early-bird special, is by contrast one of the hottest luxury cars in China.

But no vehicle in China has developed as ironclad a reputation as the Audi A6, the semiofficial choice of Chinese bureaucrats. From the country’s southern reaches to its northern capital, the A6’s slick frame and invariably tinted windows exude an aura of state privilege, authority and, to many ordinary citizens, a whiff of corruption.

“Audi is still the de facto car for government officials,” said Wang Zhi, a Beijing taxi driver who has been plying the capital’s gridlocked streets for 18 years. “It’s always best to yield to an Audi — you never know who you’re messing with, but chances are it’s someone self-important.”

With annual growth hovering above 30 percent in recent years, the Chinese auto market is rapidly surpassing the United States’ as the world’s most lucrative and strategically important. Last year alone, the Chinese bought an estimated 13.8 million passenger vehicles, handily topping the 11.6 million units sold in the United States. Foreign-origin brands, most of which are manufactured in China through joint ventures, accounted for 64 percent of total sales in 2010, according to the China Association of Automobile Manufacturers.

Even if Chinese brand associations can seem remote and perhaps amusing to those outside the country, Zhang Yu, managing director of Automotive Foresight, a Shanghai industry consultancy, says they will prove decisive to sales in coming decades. “China is already the largest automobile market in the world. No car company can afford to overlook its Chinese brand,” he said.

The lower rungs of the Chinese market are still dominated by domestic brands like Chery, whose name and numerous models suggest more than a passing resemblance to Chevy. The affluent, however, are flocking to an increasingly diverse array of foreign luxury offerings. The rapid market expansion has presented some foreign carmakers with a chance for brand reinvention, while posing public relations challenges to others.

“Because the market is so young, brand perceptions and a car’s face” — an idiom meaning prestige or repute — “are both critical,” said Mr. Zhang, pointing out that 80 percent of car purchases are made by first-time buyers.

Audi’s party-technocrat associations are partly a result of the car’s early market entry and its longstanding place on the government’s coveted purchasing list. Audi, the German automaker, gained access to the Chinese market in 1988 when its owner, Volkswagen, struck a joint venture with Yiqi, a Chinese carmaker. By contrast, BMW’s first domestic factory opened in 2003, giving Audi 15 years to establish itself as the premier vehicle for China’s elite.

This early advantage has helped Audi to dominate China’s lucrative government-car market, with 20 percent of its China revenue in 2009 drawn directly from government sales. Each year, the Procurement Center of the Central People’s Government releases a list of the cars and models acceptable for government purchase. While the A6 has long been a mainstay on the list, which had 38 brands in 2010, BMW made the cut only in 2009.

“When people see government officials in BMWs, they automatically suspect corruption or malfeasance — but Audis are to be expected,” said Jessica Wu, a public relations professional with almost a decade of experience in the Chinese car industry. A basic model Audi A6 costs 355,000 renminbi, or $56,000, while the BMW 5 series Li costs about 428,000 renminbi, or $67,520.

Such market positioning has brought significant financial results for Audi — in 2010, the company sold 227,938 vehicles in China, more than double the number in the United States.

The Munich-based automaker BMW, on the other hand, has found itself in a contrary position. Since entering the Chinese market, BMW has acquired a reputation as a vehicle for the arrogant and the rash, making it largely off-limits to wealthy officials who prefer a low-key public image.

Part of this stereotype is rooted in a 2003 incident in which a young female driver in the northeastern city of Harbin intentionally ran over and killed an impoverished man who had accidentally dented her BMW X5. Despite the transparent nature of the case — a clear motive and numerous eyewitnesses — the case was settled out of court for $11,000. The incident was seen as driving a wedge between China’s rich and poor, damaging BMW’s nascent image.

More recently, a driver in a BMW M6 struck and killed a pedestrian in May during an illegal street race in the city of Nanjing, setting off a public outcry.

“If it hadn’t been a BMW, I don’t think it would have been as big of a deal,” said a young man who had taken part in the race and spoke on the condition of anonymity because he was awaiting trial. “Had it been all Toyotas, Mitsubishis or even Audis, I don’t think it would have provoked as dire a reaction.”

Despite such public relations travails, BMW has posted strong sales in China, selling 121,614 units in the first two quarters of 2011, or 27 percent of the company’s total sales during that period.

The American carmaker General Motors has found the Chinese market to be a life-saving opportunity for the reinvention of the Buick brand. Since 2005, when Bob Lutz, the vice chairman of G.M., famously declared Buick a “damaged brand,” America’s oldest surviving automobile make has successfully positioned itself in China as a top-tier luxury carmaker.

Largely the result of effective marketing and remodeling, China’s romance with the Buick also has historical roots. The last Chinese emperor, Pu Yi, was the proud owner of two Buicks, as was the country’s first provisional president, Dr. Sun Yat-sen. The black Buick 8 driven by a onetime premier, Zhou Enlai, is still displayed at his former residence in Shanghai, now a museum.

In 2010, Buick sold over 550,000 cars in China, more than triple its sales in the United States.

“We joke that our market revived Buick from the dead — it’s only partly a joke,” said Liu Wen, a reporter for China Auto News.

On Sina Weibo, the country’s most popular microblogging service, a recent posting tried to sum up the car clichés. “A gathering of Mercedes indicates a get-together for old folks,” the writer said. “A group of BMWs means young nouveaux riches are about to run someone over and have a party; several Audis, and you know it’s a government meeting.”


Categories: Random Info, Science/Technology

Why Do So Many Have Trouble Believing In Evolution?

So, I was just sitting here drinking my coffee, trying to push off the start of the school day as much as possible, when I came across this article.  It addresses an issue that I find puzzling:  Why is it so hard for people to believe in a creator god and the natural process of evolution at the same time?  I personally feel that the processes of evolution do not scientifically need a grand designer to function, but honestly, I do not see why  the two must stand at odds with each other (but then again, maybe that’s just the mediator in me coming to the surface…)

I think that the main reason so many people are against evolution is that they simply do not understand it. And for many, the idea that “we came from monkeys” is absurd and a disgrace. Eh, even if that were as true and simple as it sounds, I wouldn’t really have a problem with it. But framing it as “we came from monkeys” vastly oversimplifies the different evolutionary processes that contribute to the progress of our species.

Here’s one situation that tests my patience: Someone and I are discussing life, and evolution comes up:

Other person: “Oh yeah – if we came from monkeys, HOW COME THERE ARE STILL MONKEYS?” (And then they sit there with a smug smile on their face, as if they have just issued an intellectual check-mate, boo-ya! Now whatcha gonna say, Mr. Smarty McScience-pants?!)

And then I usually get a smug smile on my face, for two reasons: #1, to try to hide the fact that I, too, once thought this was the Achilles’ heel of evolutionary theory; #2, because that statement reveals that the person doesn’t understand even the basics of evolution. (Of course, you might not be able to blame them – they could have been like me – I went to a school in which evolution was hastily mentioned one day in science class, when the teacher told us, “Just so you know, there is a theory out there called evolution. If you want to know about it, you’ll have to read about it on your own; we’re not going to learn about it here because it goes against the Bible.”)

To simplify what I’ve learned after high school: Evolution is not a process of replacement; it is made up of different, branching processes.  That’s why humans did not “come from” monkeys to “replace” them.  Different apes and primates share a common ancestor, which we all branched from.  Sometime in the past, we branched off from that ancestor because we had evolved some type of advantageous difference.  That’s why humans and apes/monkeys can coexist.

And to address the other “slam dunk” against evolution: It’s just a “theory.” Well, let it suffice to say that a scientific theory is much different from your theory of where the other sock goes in the dryer. Without even going into all of the experiments, results, facts, etc. that the scientific process uses to back up its theories, here’s a little example to show just how solid “theories” can be. Evolution is a theory just as gravity is a theory. WHAT? GRAVITY? THEORY? Yeah. No one can prove that one day we won’t drop the apple and it will fall up instead of down. And since that can’t be proven, gravity is a theory. Evolution shares the same “theoretical” standing as the theory that is keeping us glued to the Earth right now.

AND, I’ve just realized that I’m up on my soapbox and halfway through a rant, which was not my intention (this coffee must be stronger than I thought!). So, I’m going to stand down now and leave you with two things. First is a short YouTube video in which a scientist explains the process of change known as evolution. It helped me understand it a lot better; maybe you can benefit too.

And as far as the evolution vs. God thing: I have my opinion, but I guess that’s a personal decision.  I’m not sure how anyone can look at the evidence and not have to rethink how they’ve been interpreting the Bible…and I’m not sure why people think God couldn’t or wouldn’t use a million-year long, intricate, beautiful, and awe-inspiring process to ensure that life on the planet progressed…Anyways, I digress.  On to the video and the NPR article!

And now the NPR article: 

by MARCELO GLEISER, 1/19/12

The evidence is clear. A February 2009 Gallup Poll, taken on the eve of the 200th anniversary of Charles Darwin’s birth, reported that only 39 percent of Americans say they “believe in the theory of evolution,” while a quarter say they do not believe in the theory, and another 36 percent don’t have an opinion either way.

The same poll correlated belief in evolution with educational level: 21 percent of people with a high school education or less believed in evolution. That number rose to 41 percent for people with some college attendance, 53 percent for college graduates, and 74 percent for people with a postgraduate education.

Clearly, the level of education has an impact on how people feel about evolution.

Another variable investigated by the same poll was how belief in evolution correlates with church attendance. Of those who believe in evolution, 24 percent go to church weekly, 30 percent go nearly weekly/monthly, and 55 percent seldom or never go.

Not surprisingly, and rather unfortunately, religious belief interferes with people’s understanding of what the theory of evolution says.

The evidence for evolution is overwhelming. It’s in the fossil record, carefully dated using radioactivity, the release of particles from radioactive isotopic decay, which works like a very precise clock. Rocks from volcanic eruptions (igneous rocks) buried near a fossil carry certain amounts of radioactive material, unstable atomic nuclei that emit different kinds of radiation, like tiny bullets. The most common is Uranium-235, which decays into Lead-207. By analyzing the ratio of Uranium-235 to Lead-207 in a sample, and knowing how quickly Uranium-235 decays (its half-life is 704 million years, the time it takes for half of a sample to decay into Lead-207), scientists can get a very accurate measure of the age of a fossil.
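The dating arithmetic behind that clock can be sketched in a few lines (a simplified illustration, assuming a sample that started with no Lead-207; real radiometric dating also corrects for initial lead and cross-checks with companion isotopes):

```python
import math

HALF_LIFE_U235 = 704e6  # years: time for half of a U-235 sample to decay

def age_from_ratio(pb207_atoms, u235_atoms):
    """Estimate a sample's age from its Lead-207 to Uranium-235 ratio,
    assuming every Pb-207 atom started out as U-235."""
    # Remaining U-235: N(t) = N0 * (1/2)**(t / half_life), with N0 = U + Pb.
    # Solving for t gives t = half_life * log2(1 + Pb/U).
    return HALF_LIFE_U235 * math.log2(1 + pb207_atoms / u235_atoms)

# Equal amounts of U-235 and Pb-207 mean exactly one half-life has elapsed:
print(age_from_ratio(500, 500))  # 704000000.0 years
# Seven Pb-207 atoms per U-235 atom mean three half-lives (~2.1 billion years):
print(age_from_ratio(700, 100))  # 2112000000.0 years
```

The logarithm simply inverts the exponential decay law, which is why the ratio works like a clock over hundreds of millions of years.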

But evidence for evolution is also much more palpable, for example in the risks of overprescribing antibiotics: the more we (and farm animals) take antibiotics, the higher the chance that a microbe will mutate into one resistant to the drug. This is in-your-face evolution, species mutating at the genetic level and adapting to a new environment (in this case, an environment contaminated with antibiotics). The proof of this can be easily achieved in the laboratory (see link above), by comparing original strands of bacteria with those subjected to different doses of antibiotics. It’s simple and conclusive, since the changes in the genetic code of the resistant mutant can be identified and studied.

However, there are creationist scientists who claim that mutation is not the true mechanism of resistance. Instead, they claim that bacteria already had those genes in some sort of dormant state, which were then activated by their exposure to antibiotics. For example, Dr. Georgia Purdom argues that this inbuilt mechanism is “a testimony to the wonderful design God gave bacteria, master adapters and survivors in a sin-cursed world.” I couldn’t identify any data to back her hypothesis that bacterial resistance to antibiotics comes from horizontal gene swap and not mutation.

Does evolution really need to be such a stumbling block for so many? Is it really that bad that we descended from monkeys? Doesn’t that make us even more amazing, primates that can write poetry and design scientific experiments? Behind this strong resistance to evolution there is a deep dislike for a scientific understanding of how nature works. The problem seems to be related to the age-old God-of-the-Gaps agenda, that the more we understand of the world the less room there is for a creator God. This is bad theology, as it links belief to the development of science.

Categories: Religion, Science/Technology

TED Talks

Here’s a link to a website that I love: TED Talks. It’s a simply awesome collection of speeches, talks, and conversations that last anywhere from 5 to 30 minutes – perfect for our culture, whose attention span has diminished since the onslaught of Twitter, Facebook statuses and 2-minute sound bites.

The topics of the talks run the gamut – from technology, to religion, to business, to adventure and compassion. You click on a topic, and it will bring up all the videos related to that theme. And then you sit back and absorb some wisdom from some of the planet’s most brilliant, creative (and perhaps idealistic, but then again, what’s wrong with being idealistic?) minds. I’m hoping I can find one or two that fit into my class this semester so I can share them with my students.

The website began by offering just the twenty-minute clips, but now there are TED conventions, TED Conversations, and even TED grants and fellowships. TED also offers free iTunes podcasts: Here.

From the About TED webpage:

“TED is a nonprofit devoted to Ideas Worth Spreading. It started out (in 1984) as a conference bringing together people from three worlds: Technology, Entertainment, Design. Since then its scope has become ever broader. Along with two annual conferences — the TED Conference in Long Beach and Palm Springs each spring, and the TEDGlobal conference in Edinburgh UK each summer — TED includes the award-winning TEDTalks video site, the Open Translation Project and TED Conversations, the inspiring TED Fellows and TEDx programs, and the annual TED Prize.”


Here are a couple of TED Talks that I found interesting this morning while drinking my tea:

Lakshmi Pratury on Letter Writing: 

For any of you out there who still cling to the idea that Islam is only a religion of violence, please educate yourself. Also, these 16 and a half minutes may shed some light on just how central a role Jesus plays in Islam. As Imam Rauf implies, it’s time to let go of our egos and practice Compassion…


Categories: Ideas & Philosophy, Politics/Current Events, Religion, Science/Technology

Exploring Stephen Hawking’s ‘Unfettered Mind’

NPR – January 3, 2012

By NPR staff

Make a list of the world’s most popular scientists, and Stephen Hawking’s name will be at or near the very top.

Hawking, the author of A Brief History of Time and a professor at the University of Cambridge, is known as much for his contributions to theoretical cosmology and quantum gravity as for his willingness to make science accessible for the general public, says science writer Kitty Ferguson.

“It’s not dumbing down [science]; it’s really making it accessible, hopefully, to a lot of people,” she tells Fresh Air‘s Terry Gross.

Ferguson, who helped Hawking edit his 2001 book The Universe in a Nutshell, is the author of a new Hawking biography, Stephen Hawking: An Unfettered Mind. Written with Hawking’s blessing, the book traces his life from childhood to Oxford, and then to his graduate work at Cambridge in the early 1960s, where he was diagnosed with a motor neuron disease and given less than two years to live.

But Hawking’s disease has progressed slowly, while his personal and professional life has flourished. He celebrates his 70th birthday this January, says Ferguson, and continues to work on projects despite having very limited use of his physical body. (He communicates using a voice synthesizer, which he controls using a muscle in his cheek.)

“It’s just so interesting to see how he came to terms with [his illness],” says Ferguson. “What he says is that it wasn’t courage. [He says] ‘I just did what I had to do.’ … He took to listening to a lot of Richard Wagner, thinking of himself as a rather tragic hero. His mind went through all kinds of ways of dealing with that type of problem, but eventually, I think, he realized that theoretical physics was kind of a great escape from it.”

Science writer Kitty Ferguson sits next to Stephen Hawking in this undated photograph. Ferguson is the author of several books about physics, including Stephen Hawking: Quest for a Theory of Everything and Black Holes in Spacetime.

Hawking Radiation And A Unified Theory Of Everything

In the 1970s, Hawking discovered what is now called “Hawking radiation.” At the time, his discovery was controversial because many scientists — including Hawking — had believed that nothing could ever come out of a black hole, so a black hole could never get any smaller.

But Hawking postulated that if two individual particles were right at the edge of a black hole, and one of them happened to fall into the black hole, then the other particle could escape out into space, and appear as radiation being emitted from the black hole. Therefore, black holes could lose both mass and energy — and could, in fact, grow smaller.

Hawking’s discovery raised many questions about what goes on inside black holes and our universe itself, says Ferguson.

“[His discovery raised questions like] ‘What happens to the star that collapsed that formed the black hole? What happens to all of that when the black hole disappears entirely?’ ” she says. “And does this mean that this information is completely lost to our universe? And if it is … to physics, that’s a huge problem. Because if information can be lost from the universe, that’s a violation of a law that says it can’t disappear.”

Hawking has also pursued what is called a “theory of everything”: the idea that there should be one theory from which everything else in the universe can be explained or derived.

“He has been predicting for most of his career that we will find it,” says Ferguson. “Recently he has decided that it’s probably going to be impossible for anybody, ever, to find the theory of everything. And this is a huge turnaround. He thinks we’ll come up with some theories that are approximations … but we’ll never be able to know the underlying mysterious theory that would really explain the entire universe.”

Interview Highlights

On time travel

“Someone recently asked him, ‘If time travel were possible, what would you go back to in your life?’ And you would expect him to say his discovery of Hawking radiation or a big prize [he’d won]. What he said was he would go back to the birth of his first child, his son Robert.”

On how Hawking communicates

“When he sees the part of the screen that has the word that he’s looking for, he punches a little mouse. Then the screen changes and we see lines of words scrolling down, and those are the words from that part of the screen. Then when he sees the word he wants, he activates his little switch again. Then you see the screen changing again and you see the words, and when he sees the next word he wants, he punches the device again. Then that word goes across the bottom of the screen. And he builds his sentence at the bottom of a screen. When he gets the sentence completed, he makes another movement, which indicates that his synthetic voice should speak that sentence. … It sounds simple, but it’s not simple. It moves at the speed of a video game, and very often he misses a word or misses the line, and then the whole thing has to start over. What that means is that working with him can be frustrating. Very often, you know what word he’s after. You know what word he wants to capture. But protocol says you do not second-guess him. You do not move ahead and say, ‘Stephen, I know what you’re trying to say.’ You let him finish. Because he’s going to finish anyway. It would be impolite, as it would be to interrupt anybody talking.”
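For the curious, the two-stage selection Ferguson describes (highlight a part of the screen, press the switch to pick it, then press again to pick a word within it) can be sketched in a few lines of Python. The word groups and press positions below are invented for illustration; this is not Hawking’s actual software.

```python
# A minimal sketch of two-stage scanning selection: the interface
# highlights groups of words in turn; a switch press picks the
# highlighted group, then words in that group are highlighted one by
# one until a second press picks the word. The sentence is built one
# selected word at a time, just as Ferguson describes.

def scan_select(groups, presses):
    """Simulate one word selection.

    groups:  list of word lists shown on screen.
    presses: pair of highlight indices at which the user pressed the
             switch (first press picks a group, second picks a word).
    """
    press = iter(presses)
    group = groups[next(press) % len(groups)]   # stage 1: choose a group
    word = group[next(press) % len(group)]      # stage 2: choose a word
    return word

def build_sentence(groups, press_pairs):
    # Each selected word is appended at the bottom of the screen.
    return " ".join(scan_select(groups, pair) for pair in press_pairs)

groups = [["my", "name", "is"], ["black", "hole", "radiation"]]
print(build_sentence(groups, [(0, 1), (1, 2)]))  # prints "name radiation"
```

A missed press simply means re-running the scan from the top, which is why the process Ferguson describes is so slow and so easy to derail.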

On Hawking’s singularity theory and no-boundary proposal

“He likes to describe that as though you were traveling backward on a globe of the Earth. When you get to the South Pole, the concept of ‘south’ no longer means anything. You don’t say an airplane flew south of the South Pole. So it’s the same thing — time becomes meaningless. Now when you start to think about that, first of all, Hawking says that relieves us of the need for a creator. There doesn’t have to be a creation. It just started. And it was all kind of space dimensions, no time dimensions. What I find really interesting about that is that it’s not a new idea. It’s an idea you find in early Christian and Jewish philosophers like Philo of Alexandria or Augustine. They both conceived of a universe in which time didn’t exist outside of our creation. Time was part of the creator. And God exists outside of time in the eternal now. It’s the same idea. It was not new to theology, not new to philosophy, but very new to physics.”

On Hawking’s popularity

“He is popular because he deals with things right on the border of human knowledge. The origin of the universe, black holes — these are questions that are out on the edge, on the frontier between the known and unknown and the possibly unknowable. I love the phrase of what Wheeler, the American physicist, called the flaming ramparts of the world. [Hawking] tries to take us with him on this adventure, and it is fun and it’s mind-blowing and wonderful.”

On the Large Hadron Collider and the Higgs boson particle

“He has placed a bet that the Higgs particle will not be found. … One of the mysteries in physics is what gives elementary particles — electrons, quarks — their mass. Mass, we often define as how many matter particles there are in an object. That becomes a little stupid when we’re talking about a thing that is just one matter particle itself. So there’s another definition for mass, which is: the resistance you feel if you push against something … and where does that resistance come from? That’s the mystery that the Higgs particle would solve. … That’s what we’re looking for, this Higgs field. The Large Hadron Collider, what it does is accelerate particles to nearly the speed of light and then slams them together in these head-on collisions, and hopefully in the debris of one of those collisions — just in a split second — a minuscule part of the Higgs field will break away and that will be the Higgs particle. … So far, they have pretty good evidence that they may have seen it, but it’s not definitive yet.”


Science is the Poetry of Reality

A friend shared this awesome YouTube video called the Symphony of Science.

“The story of humans is the story of ideas that shed light into dark corners”


The Ethics of Mind Reading


Editor’s Note: Paul Root Wolpe, Ph.D., is director of Emory University’s Center for Ethics.

By Paul Root Wolpe, Special to CNN


“My thoughts, they roam freely. Who can ever guess them?”

So goes an old German folk song. But imagine living in a world where someone can guess your thoughts, or even know them for certain. A world where science can reach into the deep recesses of your brain and pull out information that you thought was private and inaccessible.

Would that worry you?

If so, then start worrying. The age of mind reading is upon us.

Neuroscience is advancing so rapidly that, under certain conditions, scientists can use sophisticated brain imaging technology to scan your brain and determine whether you can read a particular language, what word you are thinking of, even what you are dreaming about while you are asleep.

The research is still new, and the kinds of information scientists can find through brain imaging are still simple. But the recent pace of progress in neuroscience has been startling and new studies are being published all the time.

In one experiment, researchers at Carnegie Mellon looked at images of people’s brains when they were thinking of some common objects – animals, body parts, tools, vegetables – and recorded which areas of their brains activated when they thought about each object.

The scientists studied patterns of brain activity while subjects thought about 58 such objects. Then they predicted what the person’s brain would look like if researchers gave them a brand new object, like “celery.”

The scientists’ predictions were surprisingly accurate.
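A toy sketch of how such a prediction can work: treat each brain-activity pattern as a linear combination of simple semantic features of the word, fit the weights on the studied words, then predict the pattern for a word never scanned. The features and “brain” vectors below are invented stand-ins for real fMRI data.

```python
# Fit a linear map from word features to activity patterns, then
# predict the pattern for a new word and match it against candidates.
# All numbers here are invented for illustration.
import numpy as np

# Rows: training words; columns: hypothetical semantic features
# (e.g. "is it edible?", "can you hold it?", "is it alive?").
features = np.array([
    [1.0, 1.0, 1.0],   # "carrot"
    [0.0, 1.0, 0.0],   # "hammer"
    [0.0, 0.0, 1.0],   # "dog"
])

# One activity vector per training word (stand-in for voxel activations).
activity = np.array([
    [0.9, 0.1, 0.8],
    [0.1, 0.9, 0.0],
    [0.2, 0.1, 0.9],
])

# Least-squares fit: weights mapping features -> activity pattern.
weights, *_ = np.linalg.lstsq(features, activity, rcond=None)

def predict(feat):
    """Predicted activity pattern for a word's feature vector."""
    return np.asarray(feat, dtype=float) @ weights

def best_match(feat, candidates):
    """Index of the candidate pattern closest to the prediction."""
    pred = predict(feat)
    return int(np.argmin([np.linalg.norm(pred - c) for c in candidates]))
```

In the published work the features came from how words co-occur in text and the patterns from actual fMRI voxels; this sketch only shows the shape of the method.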

Many scholars predicted as recently as a few years ago that we would never get this far. Now we have to ask: If we can tell what words you are thinking of, is it much longer before we will be able to read complex thoughts?

In another experiment, researchers at the Max Planck Institute of Psychiatry in Munich, Germany, sought out a group of “lucid dreamers” – people who remain aware that they are dreaming and even maintain some control over their dreams while they sleep.

The researchers asked the subjects to clench either their right hand or left hand in their dreams, then scanned their brain while they slept. The subjects’ motor cortex, the part of the brain that controls movement, lit up in the same manner it would if a person clenched their left hand while awake – even though the actual hand of the sleeping subjects never moved.

The images revealed that the subjects were dreaming of clenching their left fists.

Throughout human history, the inner workings of our minds were impenetrable, known only to us and, perhaps, to God. No one could see what you were thinking, or know what you were feeling, unless you chose to reveal it to them.

In fact, the idea of being able to decipher what is going on in that three pounds of grey mush between our ears seemed an impossible task even a couple of decades ago.

Now, for the first time in human history, we are peering into the labyrinth of the mind and pulling out information, perhaps even information you would rather we did not know.

Neuroscientists are actively developing technologies to create more effective lie detectors, to determine if people have been at a crime scene, or to predict who may be more likely to engage in violent crime.

As the accuracy and reliability of these experiments continue to improve, the temptation will be strong to use these techniques in counter-terrorism, in the courtroom, perhaps even at airports.

And if brain imaging for lie detection is shown to be reliable, intelligence agencies may want to use it to discover moles, employers may want to use it to screen employees, schools to uncover vandals or cheaters.

But should we allow it?

I believe not. The ability to read our thoughts threatens the last absolute bastion of privacy that we have. If my right to privacy means anything, it must mean the right to keep my innermost thoughts safe from the prying eyes of the state, the military or my employer.

My mind must remain mine alone, and my skull an inviolable zone of privacy.

Right now, our right to privacy – even the privacy of our bodies – ends when a judge issues a warrant. The court can order your house searched, your computer files exposed, and your diary read. It can also order you to submit to a blood test, take a drug screen, or to provide a DNA sample.

There is no reason, right now, that it could not also order a brain scan.

Right now, the technology is not reliable enough for the courts to order such tests. But the time is coming, and soon.

Eventually, courts will have to decide whether it is allowable to order a defendant to get a brain scan. There is even an interesting question of whether forcing me to reveal my inner thoughts through a brain scan might violate my Fifth Amendment protection against self-incrimination.

But not even a court order should be enough to violate your right to a private inner life. The musings of my mind and heart are the most precious and private possessions that I have, the one thing no one can take away from me.

Let them search my house, if they must, or take some blood, if that will help solve a case. But allowing the state to probe our minds ends even the illusion of individual liberty, and gives government power that is far too easy to abuse.


Fear of the TwitterBook

I’ll go ahead and say in advance that, since the new academic year has started, I will have less and less time to do any actual writing of my own for this site, which means that more and more of the content will be random tidbits from the Net, or articles that I have come across during the day.

Maybe you’ll find them as interesting as I do.

Not too long ago, I wrote a post about technology (It’s a Small World After All).  This NPR article goes hand in hand with that one.

And it makes me think of a little bit of info that I learned in my Introduction to Cultural Anthropology course several years ago.  It is a phenomenon called habitus.  Essentially (if I remember it correctly), it is the brain rewiring (“softly”…as in it doesn’t necessarily become “hard”wired) itself to accommodate learned actions.  Let me explain:  there are some things that are hard-wired into our brains:  breathing, or ducking when a baseball is coming at our face, for example.  But there are other “reflexes” that can actually be learned.  We aren’t born with the knowledge of using a remote, yet our thumbs know just the right spot to hit (once they’ve been taught).  Another example:  keyboards.  Once you teach yourself where the keys are, your brain can “softly” rewire itself to accommodate that action.  But it’s more than just learning an action.  To your brain it’s almost as if typing on a keyboard is a reflex, because it has set itself up that way.  So, typing actually becomes like second nature to you.

This is an example of culture (outside) affecting neurology (inside).

Now, for the NPR article.  It has nothing to do with habitus directly – but it does have to do with the relationship between technology and culture.  (For example, how the invention of the clock made everyone “aware” of time – that is, they constantly thought about minutes and seconds and hours – whereas before, such concepts of “on time” “late”, or even minutes, and seconds and hours, meant nothing.)


I may not have invented the Internet but it’s possible I was the first guy to find out he was gonna be a dad through it. (It’s a long, 1980s NSFnet kind of story.) I was also, without a doubt, the first guy in Nerdville with a PalmPilot. My whole professional life has demanded early tech adoption: everything from file-transfer software to 3-D visualization to mobile computing. It was, however, only a week or so ago that I sent my first tweet.

I held off for a long time on Facebook and Twitter. Now that I’m getting deep into both, I have to ask: Why did I wait so long? More broadly and more importantly, what takes any culture or any individual “so long” in adopting new technologies?

For a world both blessed and battered by innovation, what forces govern the adoption of new technologies? What leads us, as individuals, to opt into new technological modalities at particular moments in their development curve, from “hot new thing” to “everyone has one”?

And what about opting out? What happens when individuals decide to completely step away from a technological modality the rest of the culture has embraced? And how about cultures as a whole? Have entire societies ever completely dismissed a burgeoning new technological capacity?

Sometimes a technology sweeps across culture with a force that simply cannot be avoided. The first public mechanical clock appeared in Orvieto, Italy, in 1307. One hundred years later, public clocks had become the standard even in smaller settlements. The human experience and organization of time had become wedded to the new technology of mechanical time metering, and the world was never the same.

In our technology-saturated world, however, innovations come and go. Some, such as email and iPods, spread with the speed of an epidemic and alter the genetic (memetic) code of culture. Others, like MySpace or Sony’s failed MiniDisc, flare and fade, or just fail entirely. But what is the role of individual choice, embedded as it is in its cultural background of adoption or dismissal? What price do we pay as individuals if we decide never to pick up a technology, or to opt out once it has risen to prominence? What reasons shape these choices?

The “videophone” presents an interesting ongoing example of adoption and opting out. Pairing video and voice communication seemed the pinnacle of future-tech in Stanley Kubrick’s film 2001: A Space Odyssey. Now, in 2011, the technological capacity for video calls has existed for years. As super-cool as the idea appears, the reality is that people remain lukewarm about its use. Skype and other platforms have made the service insanely simple. Still, many folks simply don’t want to be “seen” on every call and won’t use the technology unless forced into it (such as in work-related video conferencing, where you are supposed to look nice anyway).

At some point video calls may become so prevalent that rejecting one will seem as Luddite as not having a telephone in your house was 30 years ago. But at this moment in videophone history, it is still possible to opt out.

Which brings me back to my avoidance of Facebook. To be honest, I may have held back because of tech snobbery. I can still remember how horrified I was back in 1992 when I saw .com appearing at the end of a Web address. “You can’t use the Internet to sell things,” I thought. “It’s for learning!” (Obviously you don’t want me as your investment advisor.) And to be truthful, I may have avoided it (and Twitter) because, at 49, I am just getting old and crusty. But in reality, I stayed away because I did not see the point. Overwhelmed with email and texts and my omnipresent iPhone, I could not see why I wanted another node of electronic contact. And Twitter? 140 characters? Really?

But my kids forced me onto Facebook a year ago (I demanded they friend me in the name of transparency) and my publisher pushed me to Twitter as part of the launch of my new book (shameless plug here). In both cases I could immediately see I had blinded myself to how and why these platforms had launched such powerful reconfigurations of the tech-enabled cultural imagination. As my editor at NPR, Wright Bryan, puts it: “It’s the insane flexibility of these platforms that gives them so much power.”

It is the open-ended brilliance of Facebook and (as I am learning) Twitter in creating ever-shifting, ever-nested webs of connection that takes them beyond themselves. Both sites may eventually be replaced by something newer. But by creating technological norms for a particular kind of connectivity, the electronic social networks they embody are transforming our historical moment as completely as mechanical time metering changed life in the 15th century.

Culture sees itself and the cosmos as a whole through the lens of its technological capabilities. That fact may explain when adoption grows beyond mere choice. Once a technology settles in to the point where it begins shaping the dominant metaphors of a society (the 17th century’s “clockwork universe” for example), then there is no going back, no opting out. You and everyone you know will be assimilated.

Until that moment, however, you may still have time to hit “delete all” and quietly walk away.


It’s a Small World After All

I just put down my phone.  I was texting one of my friends, just checking in and seeing what was going on in his life today.  He’s in Germany, 4,000 miles away.  Even in the Internet age, the fact that we can have a real-time conversation across an ocean, on two separate continents, still blows my mind.

It makes me think of something that I talked about with my students last year.  For most of our history, it seemed to Earth’s inhabitants that the world was constantly growing.  Explorers and outcasts alike spread from their homes to find new lands and new peoples.  The horizons kept stretching further and further into the distance.  The discovery of the Americas by other peoples seemed to double the world’s size – at least to the outsiders’ eye.

But once the globe had been fully explored and charted, the horizons became fixed, and within generations, they seemed to begin rushing inwards.  The reason for this?  Technology.  Technology in general allowed us to see the world from a much larger viewpoint, but the technological advances in communication have done the most to advance this feeling of a shrinking world.

What used to take weeks or months to deliver a message can now be done with the press of a button.  But let’s think back even within the last 25 years – or even 15 years.  Remember when the only phone we had was a land-line at home?  And it had a long, curly cord attaching it to the wall?  Yeah, I remember that.  And I remember when cordless telephones came out, and we all felt like we were the Jetsons.  And then, eventually, the unthinkable happened:  they came out with phones that could go in your cars.  Beam me up, Scotty!  You remember the first mobile phones, right?  They were as big as your house phones (and included the curly cord) and were mounted right onto your dash.  Or, if you were really mobile, you had a “bag phone” – a big, bulky mobile phone that you could actually carry in and out of your car.  I remember when we got one, I thought we might as well have been the President, since our bag phone was so fancy.

But what did those bulky phones do?  They allowed for more chances of communication between people.  We no longer had to wait around the kitchen for a call.  Wives were spared the annoyance of forgetting to tell their husbands to get milk.  They could simply call them on the way to the grocery store!

And then!  And then, cell phones came out.  And it has been non-stop from there!  Suddenly, people were connected to each other, no matter where they were. Our ability to communicate with each other no longer relied on cords, or time, or even space.  We each had a device that would allow us to instantly contact another.

And let’s not forget the Internet.  Oh, the Internet.  The only people who can really understand how the Internet changed the world are the ones who lived before its invention.  I actually remember not having the Internet.  Actually, I remember not having a computer in our house (mainly because I had to do my whole 4th grade Georgia History project by hand…and when my brother came to fourth grade a couple of years later, he got to type it all!).  And I don’t have to go into all of the cool things that the Internet can do, or more accurately, can allow us to do.  We all know that.

What I think is the coolest of all is what the Internet does on a larger scale.  Let me back up for just a minute.  For most of civilization’s history, access to knowledge was very limited.  If you wanted to learn a trade, you went to live with a master of that trade and you became his apprentice.  Books were scarce, and people who could even read them were rarer still.  That’s why if you wanted a formal education, you picked up and moved your life to live at a university.  And there weren’t thousands of students; there were a handful, and each of them studied personally under one, two or three professors.  Even if you spent your entire life studying, you could only learn what your professor taught you, plus what you could read.  Information was limited.

Now think about the Internet:  You have a question about something?  As long as you have an Internet connection, you can have 4 million posts about your inquiry within 0.23687 seconds.

Now for the part that really makes my head swim:  look at smart phones (iPhones, BlackBerrys, Androids, etc.).  They all have access to the Internet.  What does that mean?  It means that, at any given point, we now have access to more information than ALL of humanity in the past ever had, combined… all in the palm of our hand.  We don’t have to be content with not knowing what the Pythagorean Theorem is.  Simply open your phone’s browser and Google it.  Each phone is now a portal to the vast ocean of shared knowledge that is the Internet.

Of course, that doesn’t mean that the information that we have access to is all good knowledge.  I don’t know the stats, but I’m sure that most of the information out there is pure crap.  Anyone with a keyboard can put whatever they want into the Internet.  And that will come up when you search.  So, you have to be smart about sifting through information.

But think about what the Internet has done beyond giving us instant access to vast amounts of knowledge.  Think about what it has done for communication.  We can now email people across the globe and they will receive our message instantly.  Researchers in France can collaborate with scientists in California to come up with a new life-saving drug.  Engineers can troubleshoot a robot an ocean away that will be used by a doctor in another state to perform a delicate surgery.

Skype allowed me not only to call home as often as I wanted to when I lived in Germany, but it allowed me to see my family via video chat.  For free.  And now, I have a free Skype app for my phone, so I can pick up and call my friend in Germany just as quickly and easily as I can my brother.

And don’t even get me started on technology’s impact on transportation (which is the other major reason the world seems smaller).  What used to take a month – or several months – in a dangerous, rocky, wind-powered ship, now takes merely hours.  I can wake up in Frankfurt and then go to sleep in Cuthbert, Georgia that same night.  I may live 1,400 miles away from my family, but I can fly down for my grandfather’s 80th birthday, and it’s no big deal.

And we’ve all seen the beauty of a sunrise and even a moonrise.  But can you imagine the feeling that those NASA astronauts had as they became the first humans in history to witness an earthrise?  All of humanity’s conflicts, all of the wars, the Roman, Greek, Chinese, Ottoman, Egyptian, European and American empires, all of our achievements – all of those took place on a little blue and white orb that those astronauts saw in their line of vision.


That’s why the world is smaller.  China no longer seems so far away when you can video chat live with a student there.  Marburg can really be a home away from home when I get constant updates from my friends there.

Yep, it’s a small world after all. 

That’s why it sometimes makes all of our disagreements seem silly.  If our entire world is so small, then our differences and problems are even smaller.  We should focus on working and living together…on making peace, and on saving our environments.


Following are two articles that I shared with my students.  They both deal with technology and the power it has to shape the way we interact with the world, the way we think, and the way we (re)act to different situations.

The first is Google, a Giant in Mobile Search, Seeks New Ways to Make it Pay by Claire Cain Miller and was published in the New York Times on April 24, 2011.  It has to do with the technology that Google is creating to improve searching for information on mobile devices such as smart phones.  Here are a couple of excerpts from the full article:

Google trained its computers to learn spoken language based on troves of voice recordings. “Even if you’re from Brooklyn and you drop all your R’s when you park your car, it’s heard plenty of people from Brooklyn and it can do well,” said Mike Cohen, head of Google’s speech technology team. 

At first, Google engineers thought people would talk to its voice search service as if they were talking to a person — “you know, it’s my anniversary, and I’d love to take my wife somewhere really romantic to eat, do you have any ideas?” — so it taught the service to filter out unnecessary words. But it turned out that Google had already trained people into thinking in keywords, so they knew to search “romantic restaurants” even when speaking instead of typing.

People can now snap photos of landmarks or wine labels to search for them using Google Goggles, speak to their phones using voice search and, on Android phones, translate spoken conversations between English and Spanish.

People can also snap a photo to translate a menu in a foreign country, and speak English to hear the Spanish translation. Someday Google hopes to be able to translate both sides of a phone conversation as it happens, said Franz Och, head of Google’s machine translation group.

This second article, The World to the Rescue, written by Steve Sternberg, is from the April 12, 2011, USA Today, and it is the more amazing of the two.  It shows how “for-fun” social networking sites (like Facebook) can be utilized in the most unlikely of cases – like disaster response.  Again, some excerpts to pique your attention:

Japan’s disaster has spotlighted the critical role that social media websites such as Twitter, Facebook, Google, YouTube and Skype increasingly are playing in responses to crises around the world. They may have been designed largely for online socializing and fun, but such sites and others have empowered people caught up in crises and others wanting to help to share vivid, unfiltered images, audio and text reports before governments or more traditional media can do so.

“Often, it’s not the experts who know something; it’s someone in the crowd,” says Sree Sreenivasan, a social media specialist at the Columbia Graduate School of Journalism. 

The emergency managers and military officers who planned X-24 say the idea was to tap the potential of social media to create video and text channels of communication that offer more immediacy and flexibility than the standard command-and-control operation anchored in a government war room. 

“We’re trying to reconceptualize emergency response around resources that didn’t exist five years ago,” says Craig Fugate, director of the Federal Emergency Management Agency (FEMA).

One example of the technology in the article is emergency maps that anyone can create as a free app.  Here is an excerpt describing how a Boston resident was able to help save people in Haiti after their horrible earthquake by designing such a map:

Meier put out a call for volunteers.  They began creating a crisis map in his Boston apartment.  “We were all crammed into my living room,” he says. “It was snowing outside. Here we were on a live Skype call with search-and-rescue teams in Port-au-Prince.”  Soon the Marine Corps and Coast Guard were using the program to stage relief efforts. The World Food Program sent Meier’s team a list of displaced-person camps along with a request for GPS coordinates so volunteers could locate them.

 In Japan, Meier says, colleagues familiar with the Ushahidi approach launched their own crisis map “within a couple of hours.”  It may be the largest crisis map ever created, containing more than 8,000 reports from social media detailing such items as shelters, food stores, open gas stations, road closures, building damage assessments and cellphone charging centers, he says.
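At its core, a crisis map like the ones described above boils down to structured crowd reports: each message becomes a record with a location and a category that responders can filter and count. A minimal sketch (the data and field names are invented, not the Ushahidi API):

```python
# Crowd reports normalized into records with a location and category,
# which responders can filter (to plot as map pins) and summarize.
# All data and field names are invented for illustration.
from collections import Counter

reports = [
    {"text": "shelter open at city hall", "lat": 38.32, "lon": 141.02, "category": "shelter"},
    {"text": "road blocked near the bridge", "lat": 38.30, "lon": 141.00, "category": "road closure"},
    {"text": "gas station open on Route 45", "lat": 38.35, "lon": 141.05, "category": "fuel"},
    {"text": "second shelter at the school", "lat": 38.31, "lon": 141.03, "category": "shelter"},
]

def by_category(reports, category):
    """All reports in one category, ready to plot as map pins."""
    return [r for r in reports if r["category"] == category]

def summary(reports):
    """Counts per category, like the sidebar of a crisis map."""
    return Counter(r["category"] for r in reports)
```

The hard parts in practice are the ones the article hints at: verifying reports, geolocating free-text messages, and doing it all fast enough to matter.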

You want an idea just how helpful social media sites can be in helping with disaster relief?  Check out these stats:

As many as 4 billion people worldwide — and 84% of Americans — now use mobile phones.  Twitter’s traffic is just as eye-popping, says spokesman Matt Graves.  “Right now, on any given day, people are sending 140 million messages,” he says, “a billion tweets every eight days.”  About 200 million people a day watch videos on their mobile phones, triple the number of a year ago, she says.

Compared with social media, information moves at a relative snail’s pace even in today’s post-9/11 war rooms, with their vast Internet bandwidth and huge TV screens, says Blanchard, a former deputy director of the U.S. government’s preparedness website. “Currently, situation reports aren’t real-time,” she says. “They can be up to six to eight hours old.”

Social media can bridge that gap, she says, but emergency managers must overcome longstanding hurdles, such as policies that restrict them from acting on information that doesn’t flow from official sources.

Read the whole article here.

The article begins and ends with a remarkable story about a US Ambassador who kept up with his Twitter feed during and after the Japanese catastrophe.  He received two tweets, each about 100 characters long, from a Japanese worker, asking for help in evacuating a hospital.  It shows just how interconnected we have become:

Officials at Kameda [Japan] turned to [Ambassador] Roos. The ambassador alerted the U.S. Embassy’s defense attache, who passed it down through the U.S. military chain of command.  An hour or so later, Fuller, Roos’ aide, says, “we got a note back,” saying the patients would be evacuated by Japan’s Ground Self-Defense Forces.

Two tweets had mobilized troops.


How Much Would You Pay for the Universe?

