Colleges face pressure to answer a basic question: What are students learning?

Students have returned to college campuses this fall with fresh possibilities ahead of them.

So how much will they really learn?

That’s a seemingly obvious question some universities and colleges are struggling to answer — and, in some cases, trying to avoid.

“When you look at college mission statements, they’re loaded with grand pronouncements about the skills and habits of mind they’re going to inspire in their students,” said Alexander McCormick, an associate professor of educational leadership and policy studies at Indiana University Bloomington. Yet “even as they teach their students to back up their claims with evidence, they don’t have much evidence to back up those claims.”

Higher education now finds itself under increasing pressure to change that, just as happened in elementary and secondary schools, where a battery of standardized tests constantly checks what students know. The push is coming from policymakers and consumers who want to know the return on their investments in college.

“We’re starting to see a lot more interest in this area” from governors and legislators, said Robert Anderson, president of the State Higher Education Executive Officers Association.

“We’ve never really had to demonstrate our learning,” said McCormick. “It’s on the agenda now, I think because of the escalating cost.”

Things have moved at a crawl, however, and not always forward.

The National Survey of Student Engagement, or NSSE (pronounced “Nessie”), which McCormick now runs, was conceived in 2000 to track the ways students spend their time in college, including how and how much they study, meet with faculty and participate in class discussions. But 18 years after its launch, only about half of the 630 institutions that participate in NSSE make their performance public.

The federal Spellings Commission on the Future of Higher Education, convened by then-U.S. Secretary of Education Margaret Spellings, called in 2006 for learning to be measured and reported publicly. In response, the two principal associations of public universities launched the Voluntary System of Accountability, or VSA, which invited colleges to administer standardized tests of student knowledge and publish the results.

Twelve years later, the VSA, too, has instead quietly become “more internally facing,” as a spokesman put it, collecting data unrelated to learning that universities can use to compare their performance against one another’s but that is not made public.

Also in 2006, ministers of education from Organization for Economic Co-operation and Development member nations agreed in Athens to develop a worldwide measure of learning called the Assessment of Higher Education Learning Outcomes. But that project, which was to have included U.S. universities, has been put on hold because of resistance from governments and institutions, the OECD’s director for education and skills said.

Eighteen universities and colleges have been at work more recently on what they call the Voluntary Institutional Metrics Project, or VIMS, to provide a college-by-college comparison of cost, graduation rates, employment, student debt, and what students learn. After two years, consultants setting up VIMS managed to obtain all of those measures — except for student learning, about which “limited or no data” could be collected.

Colleges are comparatively resistant to publicizing whether and what their students learn because that’s “less important than having a winning football team if you want to stay alive, in the scheme of things,” former Stanford Graduate School of Education Dean Richard Shavelson, who has studied this topic, noted wryly.

Popular rankings on which families rely when they’re picking colleges do little to track what students learn once they enroll; they’re based largely on such things as the high school grade-point averages and SAT scores of the applicants who are admitted.

What is known publicly about what and whether college students learn is mixed.

The Spellings Commission found evidence that “the quality of student learning at U.S. colleges and universities is inadequate and, in some cases, declining.” Fifty-seven percent of college graduates failed a civic literacy exam. Bar exam passing rates over the last 10 years have declined. Only 42 percent of alumni in a Gallup survey strongly agreed that they were challenged academically in college, and only 11 percent of business leaders said colleges and universities effectively prepare graduates for the workforce.

The VALUE project, which stands for Valid Assessment of Learning in Undergraduate Education, found that college students nationwide generally scored well in written communication, but less well in using evidence to support their written arguments. They could explain issues but had trouble drawing conclusions about them. And they could do math but not necessarily apply it to real-world problems.

“We’re an industry whose primary value is prestige. And in part you get prestige by who you attract — what are the SAT scores of your incoming students,” said Richard Moore, a professor of management at California State University, Northridge, who has collected 15 years of data about where graduates of seven Cal State campuses ended up, to create a soon-to-debut public website showing how they did. “The game is set up around prestige. It’s not about effectiveness.”

There’s little doubt that students are learning in college. Not even the harshest critics say they’re not. The problem is that what they learn — presumably the principal reason most of them enroll — is not reported in a way that lets them or anyone else judge institutions by that measure.

“It is very curious,” said Carol D’Amico, a former assistant U.S. secretary of education and now executive vice president of the Strada Education Network.

“With all of the increased scrutiny of the value proposition of higher education generally, you would think people would want to be doing more of this,” said David Anderson, president of St. Olaf College, a rare example of a school that does publicize its NSSE findings and its students’ performance on a standardized test of thinking and communication skills called the Collegiate Learning Assessment.

Measuring learning in college, and reporting the results, is surprisingly hard to do. Translating those results into terms laypeople can understand is even harder, obscured as they are by the dense language, acronyms and footnotes for which higher education is often caricatured.

Disagreement about what students should learn in college has also gotten in the way. Nearly one in five colleges and universities has no school-wide “intended learning outcomes,” according to the National Institute for Learning Outcomes Assessment (NILOA), housed at the University of Illinois and Indiana University, though some have lists of things they expect students to learn in particular majors.

History also plays a part in this. Not the academic subject; the history of higher education, which traditionally served a far more homogeneous student body. Now students are considerably more diverse, arriving with vastly different levels of preparation and experience, and choosing from an equally broad variety of programs, courses and majors for any number of intended careers.

“What students are supposed to be doing or learning diverges wildly,” said Nate Johnson, founder and principal consultant of the firm Postsecondary Analytics, who follows this. “You have students majoring in everything from philosophy to heating and air-conditioning repair to accounting. Even if you had measurable assessments in all those different areas, adding them up to say students made X amount of progress isn’t the same as what you can say about 9-year-olds or 10-year-olds hitting certain benchmarks in reading.”

But letting colleges say the public should just trust them to make sure their students learn the right things lets them off the hook, said Shavelson of Stanford.

“They say, ‘What we do is just far too complicated and the kinds of outcomes we produce in higher education may take years to understand,’” he said. “That’s basically the attitude. Their attitude is, don’t mess with something that’s too hard to know.”

The process is complex, as VALUE shows. A project of the Association of American Colleges & Universities and other higher education groups, VALUE required two years and 288 faculty reviewers to study 21,189 pieces of student work and rate those students’ critical and creative thinking, quantitative literacy and written and verbal communication.

“It’s not that we don’t know what our students are learning as much as it is that we’re having to talk about it, document it, represent it in a broader way for a larger audience,” said Terrel Rhodes, VALUE’s director.

As tough as that is, said Johnson, “A lot of what academics do is come up with ingenious ways to measure things like how far away is the edge of the galaxy or what’s the most effective treatment for heart disease. This seems like one where you’ve got all the intellectual resources that you need, in all our institutions.”

Whether they agree with that or not, colleges are facing more and more pressure to do this kind of work, to address concerns about whether graduates have the basic skills they need to get and change jobs throughout their careers.

“This has moved from being a fad to, ‘This needs to be part of our daily practice,’” said NILOA director Natasha Jankowski.

Thirteen states are now pushing for this sort of disclosure. Massachusetts, among the farthest along, has already made the VALUE results publicly available (above the national average in writing, below in math) from four of its universities and two community colleges; the rest are to be added later.

Even there, those results are provided for the schools collectively, and not by campus. Nor does VALUE disclose its findings on an institution-by-institution level.

That’s the next big challenge, said McCormick and others. Students and families “don’t just want to know what do learning outcomes look like at St. Olaf. They want to be able to compare it to other colleges where they apply. But once you make that information public, the calculus gets much trickier for an institutional leader. They start focusing on protecting their image, protecting their brand.”

Several other efforts are under way to set targets for, and measure, learning. NILOA has devised a Degree Qualifications Profile outlining what students should know at any degree level and in any major. “While some colleges and universities have defined their own expected student learning outcomes, what they have done has been largely invisible to students, policy leaders, the public and employers,” it asserts. (The Degree Qualifications Profile is funded by the Lumina Foundation and VALUE and VIMS by Lumina and the Bill & Melinda Gates Foundation, among others; both also support The Hechinger Report, which produced this story.)

Technology companies may come up with new ways of measuring learning, Johnson said. “I would not be at all surprised if there were some developments of a platform for assessing learning that was both sophisticated and simple at the same time,” he said.

The public “should have this information. They should be expecting it,” said Rhodes. “And this work is actually percolating across the country. But it’s new terrain.”

Editor’s Note: A paragraph was changed so as not to suggest that colleges opted out of the VSA to avoid making learning outcomes public. The institutions that opted out did so for a variety of reasons.

This story about what students learn in college was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.