Tag Archives: Testing

Dishonesty on the part of adults in schools

1 Apr

Rhonda Cook reports in the Atlanta Journal-Constitution article, APS officials to begin surrendering, about the recent example of adults cheating to produce higher test scores:

Thirty-five former Atlanta public school employees were named in a 65-count indictment returned Friday alleging racketeering, false statements and writings and other charges related to alleged cheating on standardized test scores and the covering up of those actions.

Retired Atlanta school Superintendent Beverly Hall, some of her top deputies, principals, teachers and a secretary have until Tuesday to turn themselves in. Once processed in the jail, they will have to go before a magistrate, where bond is discussed. The grand jury said Hall’s bond should be set at $7.5 million, but the judge can set a lesser amount. http://www.ajc.com/news/news/aps-officials-to-begin-surrendering/nW72c/

See, Standardized Test Cheating http://www.huffingtonpost.com/news/standardized-test-cheating

Moi wrote about cheating teachers in ACT to assess college readiness for 3rd-10th Grades:

There have been a number of cheating scandals over the past couple of years. Benjamin Herold has a riveting blog post at The Notebook which describes itself as “An independent voice for parents, educators, students, and friends of Philadelphia Public Schools.” In the post, Confession of A Cheating Teacher Herold reports:

She said she knows she’s a good teacher.

But she still helped her students cheat.

“What I did was wrong, but I don’t feel guilty about it,” said a veteran Philadelphia English teacher who shared her story with the Notebook/NewsWorks.

During a series of recent interviews, the teacher said she regularly provided prohibited assistance on the Pennsylvania System of School Assessment (PSSA) exams to 11th graders at a city neighborhood high school. At various times, she said, she gave the students definitions for unfamiliar words, discussed with students reading passages they didn’t understand, and commented on their writing samples.

On a few occasions, she said, she even pointed them to the correct answers on difficult questions.

“They’d have a hard time, and I’d break it down for them,” said the teacher matter-of-factly.

Such actions are possible grounds for termination. As a result, the Notebook/NewsWorks agreed to protect her identity.

The teacher came forward following the recent publication of a 2009 report that identified dozens of schools across Pennsylvania and Philadelphia that had statistically suspicious test results. Though her school was not among those flagged, she claims that adult cheating there was “rampant.”

The Notebook/NewsWorks is also withholding the name of her former school because the details of her account have been only partially corroborated.

But her story seems worth telling.

During multiple conversations with the Notebook/NewsWorks, both on the phone and in person, the teacher provided a detailed, consistent account of her own actions to abet cheating. Her compelling personal testimonial highlighted frequently shared concerns about the conditions that high-stakes testing have created in urban public schools. The Notebook and NewsWorks believe that her confession sheds important light on the recent spate of cheating scandals across the country….


One might ask what the confessions of a cheating teacher have to do with ACT’s announcement that it will begin offering a series of assessments to measure skills needed in high school and college. Although the assessment is in the early stages of development, one could question whether it will turn into a high-stakes test that puts pressure on students, teachers, and schools. Admittedly, it is early. https://drwilda.com/2012/07/04/act-to-assess-college-readiness-for-3rd-10th-grades/

Valerie Strauss reports in the Washington Post article, 50 ways adults in schools ‘cheat’ on standardized tests:

Pre-Testing
Fail to store test materials securely
Encourage teachers to view test forms before they are administered
Teach to the test by ignoring subjects not on exam
Drill students on actual test items
Share test items on Internet before administration
Practice on copies of previously administered “secure” tests
Exclude likely low-scorers from enrolling in school
Hold back low scorers from tested grade
“Leap-frog” promote some students over tested grade
Transfer likely low-scoring students to charter schools with no required tests
Push likely low scorers out of school or enroll them in GED programs
Falsify student identification numbers so low scorers are not assigned to correct demographic group
Urge low-scoring students to be absent on test day
Leave test materials out so students can see them before exam

During Testing
Let high-scorers take tests for others
Overlook “cheat sheets” students bring into classroom
Post hints (e.g. formulas, lists, etc) on walls or whiteboard
Write answers on black/white board, then erase before supervisor arrives
Allow students to look up information on web with electronic devices
Allow calculator use where prohibited
Ignore test-takers copying or sharing answers with each other
Permit students to go to restroom in groups
Shout out correct answers
Use thumbs up/thumbs down signals to indicate right and wrong responses
Tell students to “double check” erroneous responses
Give students notes with correct answers
Read “silent reading” passages out loud
Encourage students who have completed sections to work on others
Allow extra time to complete test
Leave classroom unattended during test
Warn staff if test security monitors are in school
Refuse to allow test security personnel access to testing rooms
Cover doors and windows of testing rooms to prevent monitoring
Give accommodations to students who didn’t officially request them

Post-Testing
Allow students to “make up” portions of the exam they failed to complete
Invite staff to “clean up” answer sheets before transmittal to scoring company
Permit teachers to score own students’ tests
Fill in answers on items left blank
Re-score borderline exams to “find points” on constructed response items
Erase erroneous responses and insert correct ones
Provide false demographic information for test takers to assign them to wrong categories
Fail to store completed answer sheets securely
Destroy answer sheets from low-scoring students
Report low scorers as having been absent on testing day
Share content with educators/students who have not yet taken the test
Fail to perform data forensics on unusual score gains
Ignore “flagged” results from erasure analysis
Refuse to interview personnel with potential knowledge of improper practices
Threaten discipline against testing impropriety whistle blowers
Fire staff who persist in raising questions
Fabricate test security documentation for state education department investigators
Lie to law enforcement personnel http://www.washingtonpost.com/blogs/answer-sheet/wp/2013/03/31/50-ways-adults-in-schools-cheat-on-standardized-tests/

Here is the press release from Fair Test:

FairTest Press Release: Standardized Exam Cheating In 37 States And D.C.; New Report Shows Widespread Test Score Corruption

Submitted by fairtest on March 27, 2013 – 11:32pm

for further information:
Bob Schaeffer (239) 395-6773
cell  (239) 699-0468

for immediate release, Thursday, March 28, 2013

STANDARDIZED EXAM CHEATING CONFIRMED IN 37 STATES AND D.C.;
NEW REPORT SHOWS WIDESPREAD TEST SCORE CORRUPTION

As an Atlanta grand jury considers indictments against former top school officials in a test cheating scandal and the annual wave of high-stakes standardized exams begins across the nation, a new survey reports confirmed cases of test score manipulation in at least 37 states and Washington, D.C. in the past four academic years. The analysis by the National Center for Fair & Open Testing (FairTest) documents more than 50 ways schools improperly inflated their scores during that period.

“Across the U.S., strategies that boost scores without improving learning — including outright cheating, narrow teaching to the test and pushing out low-scoring students — are widespread,” said FairTest Public Education Director Bob Schaeffer. “These corrupt practices are inevitable consequences of the politically mandated overuse and misuse of high-stakes exams.”

Among the ways FairTest found test scores have been manipulated in communities such as Atlanta, Baltimore, Cincinnati, Detroit, El Paso, Houston, Los Angeles, Newark, New York City, Philadelphia and the District of Columbia:

  • Encourage teachers to view upcoming test forms before they are administered.
  • Exclude likely low-scorers from enrolling in school.
  • Drill students on actual upcoming test items.
  • Use thumbs-up/thumbs-down signals to indicate right and wrong responses.
  • Erase erroneous responses and insert correct ones.
  • Report low-scorers as having been absent on testing day.

Schaeffer continued, “The solution to the school test cheating problem is not simply stepped up enforcement. Instead, testing misuses must end because they cheat the public out of accurate data about public school quality at the same time they cheat many students out of a high-quality education.”

“The cheating explosion is one of the many reasons resistance to high-stakes testing is sweeping the nation,” Schaeffer concluded.

– – 3 0 – –

Attached:

CheatingReportsList.pdf (113.99 KB)
Cheating-50WaysSchoolsManipulateTestScores.pdf (171.74 KB)

Moi wrote in The military mirrors society:

Here’s today’s COMMENT FROM AN OLD FART: Despite the fact that those in high places are routinely outed for lapses in judgment and behavior unbecoming the office or position they have been entrusted with, many continue to feign surprise at the lapse. Really, many are feigning surprise at the stupidity of the seemingly bright and often brilliant folk who now have to explain to those close to them, and to the public, the stupidity which brought their lives to ruin. Somehow, “the devil made me do it” does not quite fully explain the hubris. The hubris comes from a society and culture where ME is all that counts and there are no eternals. There is only what exists in this moment. http://drwildaoldfart.wordpress.com/2012/11/13/the-military-mirrors-society/

Related:

Cheating in schools goes high-tech https://drwilda.com/2011/12/21/cheating-in-schools-goes-high-tech/

What, if anything, do education tests mean? https://drwilda.wordpress.com/2011/11/27/what-if-anything-do-education-tests-mean/

Suing to get a better high school transcript after cheating incident

https://drwilda.com/tag/parents-who-sued-school-over-sons-punishment-for-cheating-receive-hate-messages/

Where information leads to Hope. © Dr. Wilda.com

Dr. Wilda says this about that ©

Blogs by Dr. Wilda:

COMMENTS FROM AN OLD FART © http://drwildaoldfart.wordpress.com/

Dr. Wilda Reviews © http://drwildareviews.wordpress.com/

Dr. Wilda © https://drwilda.com/

College Board to redesign SAT test

3 Mar

Moi wrote in College readiness: What are ‘soft skills’:

Whether or not students choose college or vocational training at the end of their high school career, our goal as a society should be that children should be “college ready.” David T. Conley writes in the ASCD article, What Makes a Student College Ready? http://www.ascd.org/publications/educational-leadership/oct08/vol66/num02/What-Makes-a-Student-College-Ready%C2%A2.aspx https://drwilda.com/2012/10/06/many-not-ready-for-higher-education/

https://drwilda.com/2012/11/14/college-readiness-what-are-soft-skills/

There are two primary tests which assess student preparedness for college: the ACT and the SAT. The SAT is owned by the College Board, which has announced it will redesign the test. The ACT has overtaken the SAT as the most widely used college admission assessment.

Valerie Strauss reports in the Washington Post article, SAT exam to be redesigned:

The College Board, the nonprofit organization that owns the SAT, late last year appointed a new president, David Coleman, who was a co-writer of the Common Core State Standards. In a recent speech at the Brookings Institution, Coleman said he has a number of problems with the SAT as now written, including with its essay and vocabulary words. (You can read about that here.)

College Board Vice President Peter Kauffmann said the following e-mail was sent to all members of the College Board:

In the months ahead, the College Board will begin an effort in collaboration with its membership to redesign the SAT® so that it better meets the needs of students, schools, and colleges at all levels. We will develop an assessment that mirrors the work that students will do in college so that they will practice the work they need to do to complete college. An improved SAT will strongly focus on the core knowledge and skills that evidence shows are most important to prepare students for the rigors of college and career. This is an ambitious endeavor, and one that will only succeed with the leadership of our Board of Trustees, the strong coordination of our councils and committees, and the full engagement of our membership.

First administered in 1926, the SAT was created to democratize access to higher education for all students. Today the SAT serves as both a measure of students’ college and career readiness and a predictor of college outcomes. In its current form, the SAT is aligned to the Common Core as well as or better than any assessment that has been developed for college admission and placement, and serves as a valuable tool for educators and policymakers. While the SAT is the best standardized measure of college and career readiness currently available, the College Board has a responsibility to the millions of students we serve each year to ensure that our programs are continuously evaluated and enhanced, and most importantly respond to the emerging needs of those we serve.

As we begin the redesign process, there are three broad objectives that will drive our work:

Increase the value of the SAT to students by focusing on a core set of knowledge and skills that are essential to college and career success; reinforcing the practice of enriching and valuable schoolwork; fostering greater opportunities for students to make successful transitions into postsecondary education; and ensuring equity and fairness.

Increase the value of the SAT to higher education professionals by ensuring that the SAT meets the evolving needs of admission officers, faculty, and other administrators, and that the SAT remains a valid and reliable predictor of college success.

Increase the value of the SAT to K–12 educators, administrators and counselors by strengthening the alignment of the SAT to college and career readiness; ensuring that the content reflects excellence in classroom instruction; and developing companion tools that allow educators to use SAT results to improve curriculum and instruction.

Bob Schaeffer, public education director of FairTest, a nonprofit organization dedicated to ending the misuse of standardized tests, said this about the redesign:

“The College Board’s announcement that it plans to revise its flagship exam, less than eight years after the previous “major overhaul” of the test was first administered, is an admission that the highly touted “new SAT” introduced in 2005 was a failure. The latest version of the test is, in fact, no better than its predecessor in predicting academic success in higher education or in creating a level playing field to assess an increasingly diverse student body. The only significant changes were that it was longer and cost test-takers more. As a result, more than 80 additional institutions have adopted test-optional or test-flexible policies (attached), and the ACT overtook the SAT as the nation’s most popular exam for colleges which still require a test. Those developments left the new College Board leadership with no choice but to try to “reformulate” its product in an effort to maintain market share and relevance. http://www.washingtonpost.com/blogs/answer-sheet/wp/2013/02/26/sat-exam-to-be-redesigned/

See, College Board Announces Sweeping SAT Redesign http://www.educationnews.org/higher-education/college-board-announces-sweeping-sat-redesign/

K-12 education must not only teach students basic skills; it must also prepare them for training after high school, whether college or vocational. Not only should a solid educational foundation be established in K-12, but there must also be more accurate evaluation of whether individual students are “college ready.”

Related:

What, if anything, do education tests mean? https://drwilda.wordpress.com/2011/11/27/what-if-anything-do-education-tests-mean/

Complete College America report: The failure of remediation https://drwilda.wordpress.com/2012/06/21/complete-college-america-report-the-failure-of-remediation/

What the ACT college readiness assessment means https://drwilda.com/2012/08/25/what-the-act-college-readiness-assessment-means/

The importance of the National Assessment of Educational Progress https://drwilda.com/2012/09/12/the-importance-of-the-national-assessment-of-educational-progress/


Research paper: Interpreting international test scores in light of social class differences

15 Jan

Moi wrote about international student rankings in Important Harvard report about U.S. student achievement ranking:

More and more, individuals with gravitas are opining about the American education system for reasons ranging from national security to economic competitiveness. In Condoleezza Rice and Joel Klein report about American Education, moi wrote:

The Council on Foreign Relations has issued the report, U.S. Education Reform and National Security. The chairs for the report are Joel I. Klein, News Corporation, and Condoleezza Rice, Stanford University. Moi opined about the state of education in U.S. education failure: Running out of excuses https://drwilda.wordpress.com/2011/12/13/u-s-education-failure-running-out-of-excuses/

Education tends to be populated by idealists and dreamers who are true believers and who think of what is possible. Otherwise, why would one look at children in second grade and think one of those children could win the Nobel Prize or be president? Maybe that is why education as a discipline is so prone to fads and the constant quest for the “Holy Grail” or the next, next magic bullet. There is no one answer; there is what works for a particular population of kids. https://drwilda.wordpress.com/2012/03/19/condoleezza-rice-and-joel-klein-report-about-american-education/

Joy Resmovits reports at Huffington Post that international test comparisons do not provide an accurate picture.

In International Test Scores Often Misinterpreted To Detriment Of U.S. Students, Argues New EPI Study, Resmovits reports:

Lawmakers should be more careful when using international test scores to drive education policy, argues a pair of researchers in a new paper for the left-leaning think tank Economic Policy Institute — because the results aren’t always what they appear to be.

According to a new paper released Wednesday, the average scores on international tests — the numbers over which advocates and politicians do much public hand-wringing — don’t tell the whole story of America’s academic performance, and inferences based on those averages can be misleading, Stanford education professor Martin Carnoy and researcher Richard Rothstein argue. They found that contrary to popular belief, international testing information shows that America’s low-income students have been improving over time…

Rothstein found that the U.S. is more unequal in social background, so he wondered whether differences between the average U.S. scores and those of its competitors were driven by that inequality. Rothstein said he was not surprised by his findings, given that the achievement gap between rich and poor U.S. students has always been large. “Higher social class students have higher average scores than lower social class students,” he said. http://www.huffingtonpost.com/2013/01/15/international-test-scores_n_2479994.html?utm_hp_ref=education

Here is a portion of the executive summary:

What do international tests really show about U.S. student performance?

By Martin Carnoy, Stanford Graduate School of Education and EPI
and Richard Rothstein, EPI


This report, however, shows that such inferences are too glib. Comparative student performance on international tests should be interpreted with much greater care than policymakers typically give it. This care is essential for three reasons:

  • First, because academic performance differences are produced by home and community as well as school influences, there is an achievement gap between the relative average performance of students from higher and lower social classes in every industrialized nation. Thus, for a valid assessment of how well American schools perform, policymakers should compare the performance of U.S. students with that of students in other countries who have been and are being shaped by approximately similar home and community environments….

We have shown that U.S. student performance, in real terms and relative to other countries, improves considerably when we estimate average U.S. scores after adjusting for U.S. social class composition and for a lack of care in sampling disadvantaged students in particular. With these adjustments, U.S. scores would rank higher among OECD countries than commonly reported in reading—fourth best instead of 14th—and in mathematics—10th best instead of 25th.

  • Second, to be useful for policy purposes, information about student performance should include how this performance is changing over time. It is not evident what lessons policymakers should draw from a country whose student performance is higher than that in the United States, if that country’s student performance has been declining while U.S. student performance has been improving…. performance of all students in such countries obscures the performance of disadvantaged students.

This caution especially pertains to conventional attention to comparisons of the United States and higher-scoring Finland. Although Finland’s average scores, and scores for the most-disadvantaged children, remain substantially higher than comparable scores in the United States, scores in the United States for disadvantaged children have been rising over time, while Finland’s scores for comparable children have been declining. American policymakers should seek to understand these trends before assuming that U.S. education practice should imitate practice in Finland.

As well, U.S. trends for disadvantaged children’s PISA achievement are much more favorable than U.S. trends for advantaged children. In both reading and math, disadvantaged children’s scores have been improving while advantaged students’ scores have been stagnant. U.S. policy discussion assumes that most of the problems of the U.S. education system are concentrated in schools serving disadvantaged children. Trends in PISA scores suggest that the opposite may be the case.

  • Third, different international and domestic tests sometimes seem to show similar trends, but sometimes seem quite inconsistent. These inconsistencies call into question conclusions drawn from any single assessment, and policymakers should attempt to understand the complex causes of these inconsistencies….

In our comparisons of U.S. student performance on the PISA test with student performance in six other countries—three similar post-industrial economies (France, Germany, and the United Kingdom) and three countries whose students are “top scoring” (Canada, Finland, and Korea)—we conclude that, in reading:

  • Higher social class (Group 5) U.S. students now perform as well as comparable social class students in all six comparison countries.
  • Disadvantaged students perform better (in some cases, substantially better) than disadvantaged students in the three similar post-industrial countries, but substantially less well than disadvantaged students in the three top-scoring countries.
  • The reading achievement gap between advantaged and disadvantaged students in the United States is smaller than the gap in the three similar post-industrial countries, but larger than the gap in the top-scoring countries….

These comparisons suggest that much of the discussion in the United States that points to international test comparisons to contend that U.S. schools are “failing” should be more nuanced. Although claims about relative U.S. school failure often focus on disadvantaged students’ performance, international data show that U.S. disadvantaged student performance has improved over the past decade in both mathematics and reading compared to similar social class students in all our comparison countries except Germany. TIMSS and NAEP data also show improvement for all social class groups in mathematics during the last decade. Should we consider these improvements a failure, particularly when the scores of disadvantaged students in all comparison countries but Germany have declined in this same period? http://www.epi.org/publication/us-student-performance-testing/

The increased rate of poverty has profound implications if this society believes that ALL children have the right to a good basic education. Moi blogs about education issues, so the reader could be perplexed sometimes because moi often writes about other things like nutrition, families, and personal responsibility issues. Why, the reader might ask? Because children will have the most success in school if they are ready to learn. Ready to learn includes proper nutrition for a healthy body, and the optimum situation for children is a healthy family. Many of society’s problems would be lessened if the goal was a healthy child in a healthy family. There is a lot of economic stress in the country now because of unemployment and underemployment. Children feel the stress of their parents, and they worry about how stable their family and living situation is.

Teachers and schools have been made TOTALLY responsible for the education outcome of the children, many of whom come to school not ready to learn and who reside in families that for a variety of reasons cannot support their education. All children are capable of learning, but a one-size-fits-all approach does not serve all children well. Different populations of children will require different strategies and some children will require remedial help, early intervention, and family support to achieve their education goals.

Related:

Center for American Progress report: Kids say school is too easy https://drwilda.wordpress.com/2012/07/10/report-from-center-for-american-progress-report-kids-say-school-is-too-easy/

Complete College America report: The failure of remediation https://drwilda.wordpress.com/2012/06/21/complete-college-america-report-the-failure-of-remediation/

Book: Inequality in America affects education outcome https://drwilda.wordpress.com/2012/06/10/book-inequality-in-america-affects-education-outcome/

What exactly are the education practices of top-performing nations? http://drwilda.wordpress.com/2012/05/28/what-exactly-are-the-education-practices-of-top-performing-nations/


Studies: Current testing may not adequately assess student abilities

22 Nov

Moi wrote about testing in More are questioning the value of one-size-fits-all testing:

Joy Resmovits has an excellent post at Huffington Post. In Standardized Tests’ Measures of Student Performance Vary Widely: Study Resmovits reports:

The report, written by the Education Department’s National Center for Education Statistics, found that the definition of proficiency on standardized tests varies widely among states, making it difficult to assess and compare student performance. The report looked at states’ standards on exams and found that some states set much higher bars for student proficiency in particular subjects.

The term “proficiency” is key because the federal No Child Left Behind law mandates that 100 percent of students must be “proficient” under state standards by 2014 — a goal that has been universally described as impossible to reach….

They found many states deemed students “proficient” by their own standards, but those same students would have been ranked as only “basic” — defined as “partial mastery of knowledge and skills fundamental for proficient work at each grade” — under NAEP.

“The implication is that students of similar academic skills but residing in different states are being evaluated against different standards for proficiency in reading and mathematics,” the report concludes….

Here is the report citation:

Mapping State Proficiency Standards Onto NAEP Scales: Variation and Change in State Standards for Reading and Mathematics, 2005-2009

August 10, 2011

Author: Victor Bandeira de Mello

Download the complete report in a PDF file for viewing and printing (1959K PDF).
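The report’s central exercise can be illustrated with a minimal sketch: place each state’s “proficient” cut score on the NAEP scale and see which NAEP achievement level it falls in. The NAEP grade 4 reading cuts below are the published achievement-level values; the state cut scores are hypothetical NAEP-scale equivalents invented for illustration.

```python
# Sketch of the NCES mapping idea: compare state "proficient" cut scores
# (hypothetical, already mapped onto the 0-500 NAEP reading scale)
# against NAEP's own grade 4 reading achievement-level cuts.

NAEP_CUTS = {"Basic": 208, "Proficient": 238, "Advanced": 268}

STATE_PROFICIENT_CUTS = {  # hypothetical NAEP-scale equivalents
    "State A": 215,
    "State B": 232,
    "State C": 245,
}

def naep_level(score):
    """Return the NAEP achievement level for a 0-500 scale score."""
    level = "Below Basic"
    for name, cut in sorted(NAEP_CUTS.items(), key=lambda kv: kv[1]):
        if score >= cut:
            level = name
    return level

for state, cut in STATE_PROFICIENT_CUTS.items():
    print(f"{state}: state 'proficient' cut {cut} -> NAEP {naep_level(cut)}")
```

In this sketch, States A and B would report students at their cut score as “proficient,” while NAEP would place the same score in the Basic range, which is exactly the discrepancy the report documents.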

W.M. Chambers cautioned about testing in a 1964 Journal of General Education article, Testing And Its Relationship To Educational Objectives. He questioned whether testing supported the objectives of education or instead directed them.

Here is the complete citation:

Testing And Its Relationship To Educational Objectives

W. M. Chambers

The Journal of General Education
Vol. 16, No. 3 (October 1964), pp. 246-249
(article consists of 4 pages)

Published by: Penn State University Press

Stable URL: http://www.jstor.org/stable/27795936

Sarah D. Sparks writes in the Education Week article, Today’s Tests Seen as Bar to Better Assessment:

The use of testing in school accountability systems may hamstring the development of tests that can actually transform teaching and learning, experts from a national assessment commission warn.

Members of the Gordon Commission on the Future of Assessment in Education, speaking at the annual meeting of the National Academy of Education here Nov. 1-3, said that technological innovations may soon allow much more in-depth data collection on students, but that current testing policy calls for the same test to fill too many different and often contradictory roles.

The nation’s drive to develop standards-based accountability for schools has led to tests that, “with only few exceptions, systematically overrepresent basic skills and knowledge and omit the complex knowledge and reasoning we are seeking for college and career readiness,” the commission writes in one of several interim reports discussed at the Academy of Education meeting.

“We strongly believe that assessment is a primary component of education, … [part of] the trifecta of teaching, learning, and testing,” said Edmund W. Gordon, the chairman of the commission and a professor emeritus of psychology at Yale University and Teachers College, Columbia University.

The two-year study group launched in 2011 with initial funding from the Princeton, N. J.-based Educational Testing Service and a membership that reads like a who’s who of education research and policy. Its 32 members include: author and education historian Diane Ravitch of New York University, former West Virginia Gov. Bob Wise of the Washington-based Alliance for Excellent Education, and cognitive psychologist Lauren Resnick of the University of Pittsburgh, among others.

The panel is developing recommendations for both research on new assessments—for the Common Core State Standards and others—and policy for educators on how to use tests appropriately. The final recommendations, expected at the end of the year, will be based on two dozen studies and analyses from experts in testing on issues of methods, student privacy, and other topics….

http://www.edweek.org/ew/articles/2012/11/14/12tests.h32.html

There are education scholars on all sides of the testing issue.

Moi wrote in What, if anything, do education tests mean?

Every population of kids is different and they arrive at school at various points on the ready-to-learn continuum. Schools and teachers must be accountable, but there should be various measures of judging teacher effectiveness for a particular population of children. Perhaps more time and effort should be spent in developing a strong principal corps and giving principals training and assistance in evaluation and mentoring techniques. There should be evaluation measures which look at where children are on the learning continuum and design a program to address each child’s needs. https://drwilda.com/2011/11/27/what-if-anything-do-education-tests-mean/

Dr. Wilda says this about that ©


The importance of the National Assessment of Educational Progress

12 Sep

Moi wrote in What, if anything, do education tests mean?

Moi received a review copy from Princeton University Press of Howard Wainer’s Uneducated Guesses. The publication date was September 14, 2011. In the preface Wainer states the goal of the book: “It deals with education in general and the use of tests and test scores in support of educational goals in particular.” Wainer tries to avoid not only the policy analysis but also the ethical analysis of the improper use of tests and test results by tightly defining the objective of the book at page four. The policy implications of using tests and test results not only to decide the direction of education, but to decide what happens to the participants in education, are huge. Moi wonders if Wainer was really trying to avoid the unavoidable.

For moi, the real meat of the book comes in chapter 4. Wainer says:

In chapter 3 we learned that the PSAT, the shorter and easier version of the SAT, can be used effectively as one part of the selection decision for scholarships. In this chapter we expand on this discussion to illustrate that the PSAT also provides evidence that can help us allocate scarce educational resources…. [Emphasis Added]

Wainer examines the connection by analyzing and comparing test results from three settings: Garfield High School in L.A., the site of the movie “Stand and Deliver”; La Canada High School, in an upscale L.A. suburb; and Detroit, a very poor inner-city school district. The really scary policy implication of Wainer’s very thorough analysis is found at page 44: “Limited resources mean that choices must be made.” Table 4-4 illustrates that real-life choices are being made by districts like Detroit. What is really scary is that these choices affect the lives of real human beings. Of course, Wainer is simply the messenger and can’t be faulted for his analysis. According to Wainer, it is very tricky to use test results in predicting school performance, and his discussion at page 53 summarizes his conclusions.

Perhaps the most chilling part of Wainer’s book is chapter 8 which deals with how testing and test results can adversely impact the career of a teacher when so-called “experts” incorrectly analyze test data. It should be required reading for those who want to evaluate teacher performance based upon test results.

Overall, Uneducated Guesses is a good, solid, and surprisingly readable book about test design, test results, and the use of test results. The truly scary part of the book describes how the uninformed, unknowing, and possibly venal can use what they perceive to be the correct interpretation to make policy judgments which result in horrific societal consequences.

Wainer makes statistics as readable as possible, because really folks, it is still statistics.

Here is the full citation for the book:

Uneducated Guesses: Using Evidence to Uncover Misguided Education Policies

Howard Wainer

Cloth: $24.95 ISBN: 9780691149288

200pp.

https://drwilda.com/2011/11/27/what-if-anything-do-education-tests-mean/

Many do not know about the National Assessment of Educational Progress (NAEP). Here is a description of the test:

NAEP Overview

http://nces.ed.gov/nationsreportcard/about/

Here are some FAQs:

Frequently Asked Questions

The National Assessment of Educational Progress (NAEP) is a program with many components—from developing subject-area questions, to selecting schools to participate, to reporting the results. Given its complexity, NAEP receives a variety of questions from visitors to the website; these special pages have been developed to provide answers to some of the most common questions.

If you can’t find the answer to your question on any of our FAQ pages, please click Contact NAEP on the left.

General Questions 

What is NAEP?

NAEP, or the National Assessment of Educational Progress, produces the Nation’s Report Card, to inform the public about the academic achievement of elementary and secondary students in the United States. Sponsored by the U.S. Department of Education, NAEP assessments have been conducted periodically in reading, mathematics, science, writing, U.S. history, civics, geography, and other subjects, beginning in 1969. NAEP collects and reports academic achievement at the national level, and for certain assessments, at the state and district levels. The results are widely reported by the national and local media, and are an integral part of our nation’s evaluation of the condition and progress of education.

For more general information about NAEP, read the NAEP Overview.
For technical information about NAEP, consult the NAEP Technical Documentation.

What is the difference between state NAEP and national NAEP?

The NAEP sample in each state is designed to be representative of the students in that state. At the state level, results are currently reported for public school students only and are broken down by several demographic groupings of students. When NAEP is conducted at the state level (i.e., in mathematics, reading, science, and writing), results are also reported for the nation. The national NAEP sample is then composed of all the state samples of public school students, as well as a national sample of nonpublic school students. If there are states that do not participate, a certain number of schools and students are selected to complete the national-level sample.

For assessments conducted at the national level only, samples are designed to be representative of the nation as a whole. Data are reported for public and nonpublic school students as well as for several major demographic groups of students.

Read technical information about the differences in the sample selection for state and national assessments in NAEP Assessment Sample Design

What are the goals of the NAEP program?

NAEP has two major goals: to compare student achievement in states and other jurisdictions and to track changes in achievement of fourth-, eighth-, and twelfth-graders over time in mathematics, reading, writing, science, and other content domains. To meet these dual goals, NAEP selects nationally representative samples of students who participate in either the main NAEP assessments or the long-term trend NAEP assessments.

For technical aspects of reporting student achievement, see Analysis and Scaling for NAEP.

Is participation in NAEP voluntary?

Federal law specifies that NAEP is voluntary for every student, school, school district, and state. However, federal law also requires all states that receive Title I funds to participate in NAEP reading and mathematics assessments at fourth and eighth grades. Similarly, school districts that receive Title I funds and are selected for the NAEP sample are also required to participate in NAEP reading and mathematics assessments at fourth and eighth grades. All other NAEP assessments are voluntary. Learn more about NAEP and why participation is important.

Are the data confidential?

Federal law dictates complete privacy for all test takers and their families. Under the National Assessment of Educational Progress Authorization Act (Public Law 107-279 III, section 303), the Commissioner of the National Center for Education Statistics (NCES) is charged with ensuring that NAEP tests do not question test-takers about personal or family beliefs or make information about their personal identity publicly available.

After publishing NAEP reports, NCES makes data available to researchers but withholds students’ names and other identifying information. The names of all participating students are not allowed to leave the schools after NAEP assessments are administered. Because it might be possible to deduce from data the identities of some NAEP schools, researchers must promise, under penalty of fines and jail terms, to keep these identities confidential.

For technical details, read about Questionnaires and Tracking Forms and Non-Cognitive Items in Student Booklets.

Who are the students assessed by NAEP?

The national results are based on a representative sample of students in public schools, private schools, Bureau of Indian Education schools, and Department of Defense schools. Private schools include Catholic, Conservative Christian, Lutheran, and other private schools. The state results are based on public school students only. The main NAEP assessment is usually administered at grades 4 and 8 (at the state level) plus grade 12 at the national level. The long-term trend assessments report national results (in mathematics and reading only) for age samples 9, 13, and 17 in public and nonpublic schools.

For technical details, read about the NAEP Assessment Sample Design.

Who evaluates NAEP?

Because NAEP findings have an impact on the public’s understanding of student academic achievement, precautions are taken to ensure the reliability of these findings. In its current legislation, as in previous legislative mandates, Congress has called for an ongoing evaluation of the assessment as a whole. In response to these legislative mandates, the National Center for Education Statistics (NCES) has established various panels of technical experts to study NAEP, and panels are formed periodically by NCES or external organizations, such as the National Academy of Sciences, to conduct evaluations. The Buros Center for Testing, in collaboration with the University of Massachusetts/Center for Educational Assessment and the University of Georgia, more recently conducted an external evaluation of NAEP.

For technical aspects of reporting student achievement, see Analysis and Scaling for NAEP.

How do I know what publications are available from NAEP and how do I get them?

The NAEP Publications page is accessible via the Publications link at the top of every screen.

Printed copies of NAEP publications can be ordered by contacting:
http://edpubs.ed.gov
Phone: (877) 4-ED-PUBS (433-7827)
TDD/TTY: (877) 576-7734
Mail: Ed Pubs, U.S. Department of Education, P.O. Box 22207, Alexandria, VA 22304
Para español, llame al (877) 433-7827

It is important to understand what the NAEP is because there are attempts to use the test as a predictive tool.

Sarah D. Sparks reports in the Education Week article, Can NAEP Predict College Readiness?

College Indicators

For college, at least, there are signs NAEP performance may be linked to how well a student will do in initial coursework. Researchers from WestEd, a San Francisco-based research group working under contract to the governing board, found that the 12th grade reading and math tests cover content very similar to that of the SAT.

Moreover, a 2009 study of more than 15,000 12th graders who took both the national assessment and the SAT showed that performing at the proficient level on the math NAEP was associated with an 80 percent chance of earning 500 points out of a possible 800 on the math portion of the SAT, and that the proficient level in reading was associated with a 50-50 chance of scoring 500 on the SAT verbal test.

The SAT has internally pegged a score of 500 to earning at least a B-minus in freshman-level college courses.

NAEP 12th grade content less closely mirrored that used in the ACT, the nation’s other major college-entrance exam; in particular, some arithmetic and applied-math items on the ACT would be covered in more depth on the 8th grade than the 12th grade NAEP in math. NAEP has not been able to compare its performance levels to those in the ACT, though Ms. Orr said the board plans to do so during the 2013 studies, which will also include more state-specific analyses.

Individual states’ data are likely to be critical, North Carolina’s Mr. Fabrizio said, because course requirements vary widely from state to state and even between college systems within the same state.

Hazy Work Picture

The connection between NAEP and preparation for careers that don’t require a four-year college degree is much more tenuous.

The governing board found less overlap between NAEP 12th grade content and that covered on the career-related WorkKeys test, also by ACT Inc. Last spring, panels of professional trainers in five careers—computer-support specialists, automotive master technicians, licensed practical nurses, pharmacy staff, and heating, ventilation, and air conditioning technicians—could not agree on what proficiency level on NAEP would indicate a student was ready for his or her field.

They did agree, however, that most of the content on the test wouldn’t say much about students’ potential in those fields.

For example, “there are hardly any test items in the pool at 12th grade that are applied, based on some use of mathematics rather than theoretical stuff,” said Jeremy Kilpatrick, a co-author of the study and a mathematics education professor at the University of Georgia in Athens. “Where there were such items, the career and technical people were really happy to see that—but most times they looked at the questions and said, ‘This is not relevant to what we want.’”

The assessment governing board will try to bring more clarity around job skills next year, with an analysis that compares the skills and knowledge covered in job-training programs in the five career areas with the math and reading content in the 12th grade NAEP tests.

Still, Ms. Orr was less hopeful about whether NAEP will be useful for gauging career readiness. http://www.edweek.org/ew/articles/2012/09/12/03nagb.h32.html

Moi wrote about testing in More are questioning the value of one-size-fits-all testing:

The goal of education is, of course, to educate students. Purdue University has a concise synopsis of Bloom’s Taxonomy, which is one attempt at describing educational objectives:

Bloom’s (1956) Taxonomy of Educational Objectives is the most renowned description of the levels of cognitive performance. The levels of the Taxonomy and examples of activities at each level are given in Table 3.3. The levels of this taxonomy are considered to be hierarchical. That is, learners must master lower level objectives first before they can build on them to reach higher level objectives. http://education.calumet.purdue.edu/vockell/edPsybook/Edpsy3/edpsy3_bloom.htm

See, Bloom’s Taxonomy http://en.wikipedia.org/wiki/Bloom%27s_Taxonomy More and more people are asking whether testing really advances the goals of education or instead directs education toward testing’s own objectives, which may or may not be the same as the goals of education. https://drwilda.com/2012/02/20/more-are-questioning-the-value-of-one-size-fits-all-testing/

Related:

What, if anything, do education tests mean? https://drwilda.wordpress.com/2011/11/27/what-if-anything-do-education-tests-mean/

Complete College America report: The failure of remediation https://drwilda.wordpress.com/2012/06/21/complete-college-america-report-the-failure-of-remediation/

What the ACT college readiness assessment means https://drwilda.com/2012/08/25/what-the-act-college-readiness-assessment-means/

Dr. Wilda says this about that ©

ACT to assess college readiness for 3rd-10th Grades

4 Jul

There have been a number of cheating scandals over the past couple of years. Benjamin Herold has a riveting blog post at The Notebook which describes itself as “An independent voice for parents, educators, students, and friends of Philadelphia Public Schools.” In the post, Confession of A Cheating Teacher Herold reports:

She said she knows she’s a good teacher.

But she still helped her students cheat.

“What I did was wrong, but I don’t feel guilty about it,” said a veteran Philadelphia English teacher who shared her story with the Notebook/NewsWorks.

During a series of recent interviews, the teacher said she regularly provided prohibited assistance on the Pennsylvania System of School Assessment (PSSA) exams to 11th graders at a city neighborhood high school. At various times, she said, she gave the students definitions for unfamiliar words, discussed with students reading passages they didn’t understand, and commented on their writing samples.

On a few occasions, she said, she even pointed them to the correct answers on difficult questions.

“They’d have a hard time, and I’d break it down for them,” said the teacher matter-of-factly.

Such actions are possible grounds for termination. As a result, the Notebook/NewsWorks agreed to protect her identity.

The teacher came forward following the recent publication of a 2009 report that identified dozens of schools across Pennsylvania and Philadelphia that had statistically suspicious test results. Though her school was not among those flagged, she claims that adult cheating there was “rampant.”

The Notebook/NewsWorks is also withholding the name of her former school because the details of her account have been only partially corroborated.

But her story seems worth telling.

During multiple conversations with the Notebook/NewsWorks, both on the phone and in person, the teacher provided a detailed, consistent account of her own actions to abet cheating. Her compelling personal testimonial highlighted frequently shared concerns about the conditions that high-stakes testing have created in urban public schools. The Notebook and NewsWorks believe that her confession sheds important light on the recent spate of cheating scandals across the country….

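Analyses like the 2009 report mentioned above typically flag schools whose year-over-year score gains are statistical outliers. A minimal sketch of that idea follows, using entirely made-up data; real forensic audits also examine wrong-to-right erasure rates and answer-pattern similarity, which this sketch omits.

```python
# Sketch: flag schools whose year-over-year gain is an extreme outlier.
# A robust z-score (median and MAD) is used so that one suspicious
# school does not inflate the spread and hide itself.  Data are
# hypothetical.
import statistics

# (school, last year's mean score, this year's mean score)
scores = [
    ("School A", 1210, 1225),
    ("School B", 1180, 1195),
    ("School C", 1150, 1160),
    ("School D", 1100, 1290),  # implausibly large jump
    ("School E", 1230, 1240),
]

gains = [after - before for _, before, after in scores]
med = statistics.median(gains)
mad = statistics.median([abs(g - med) for g in gains])

# 1.4826 * MAD approximates the standard deviation for normal data;
# a robust z-score above 3 marks a gain as statistically suspicious.
flagged = [name for (name, _, _), g in zip(scores, gains)
           if (g - med) / (1.4826 * mad) > 3]

print(flagged)
```

Here only School D is flagged; its 190-point jump is far outside the 10-15 point gains of its peers. A flag like this is a starting point for investigation, not proof of cheating.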

One might ask what the confessions of a cheating teacher have to do with the announcement by ACT that it will begin offering a series of assessments to measure skills needed in high school and college. Although the assessment is in the early stages of development, one could question whether it will turn into a high-stakes test that puts pressure on students, teachers, and schools. Admittedly, it is early.

Caralee Adams writes in the Education Week article, ACT to Roll Out Career and College Readiness Tests for 3rd-10th Grades:

ACT Inc. announced today that it is developing a new series of assessments for every grade level, from 3rd through 10th, to measure skills needed in college and careers.

The tests, which would be administered digitally and provide instant feedback to teachers, will be piloted in states this fall and scheduled to be launched in 2014, says Jon Erickson, the president of education for ACT, the Iowa City, Iowa-based nonprofit testing company.

The “next generation” assessment will be pegged to the Common Core State Standards and cover the four areas now on the ACT: English, reading, math, and science.

“It connects all the grades—elementary school through high school—to measure growth and development,” says Erickson. “It informs teaching, as students progress, to intervene at early ages.”

The assessment would look beyond academics to get a complete picture of the whole student, he says. There would be interest inventories for students, as well as assessment of behavioral skills for students and teachers to evaluate.

It will fill a niche as the first digital, longitudinal assessment to connect student performance across grades, both in and out of the classroom, according to the ACT. The hope is to get information on students’ weaknesses and strengths earlier so teachers can make adjustments to improve their chances of success.

ACT has not arrived at a cost for the assessment system, but it intends to offer it in modules for states, districts, or schools to buy to administer to all students. As a nonprofit organization, Erickson says ACT wants to keep pricing affordable and at the lowest price acceptable to states. Teachers could choose to use all or part of the assessment, likely in the classroom during the typical school day. ACT is still field-testing the system so the length of the assessment is not set.

With digital delivery of the test, students would have automatic scoring and real-time assessments, says Erickson. (There would be pencil-and-paper testing to accommodate schools that would not be equipped with computers.) The assessment would include a combination of multiple-choice, open-response, and interactive items that would incorporate some creativity into testing, he adds. It would be both formative and summative for accountability purposes….

Just how states might use the new assessment is uncertain. It could replace the current state test, be given as a lead-up to the test, or used as a supplement for it, he says.

ACT developed the test in response to needs expressed by states to improve college and career readiness, says Erickson. Providing integrated testing from elementary to high school, with the ACT as the capstone in 11th grade, “will be a game changer,” he adds. http://blogs.edweek.org/edweek/college_bound/2012/07/act_plans_to_roll_out_career_and_college_readiness_tests_for_3rd-10th_grades.html

There is no magic bullet or “Holy Grail” in education. There is only what works to produce academic achievement in each population of children. That is why school choice is so important.

There must be a way to introduce variation into the education system. The testing straitjacket is strangling innovation and corrupting the system. Yes, there should be a way to measure results, and people must be held accountable, but relying solely on tests, especially without taking into consideration where different populations of children are when they arrive at school, is lunacy.

Related:

Early learning standards and the K-12 continuum https://drwilda.wordpress.com/2012/01/03/early-learning-standards-and-the-k-12-contiuum/

Jonathan Cohn’s ‘The Two Year Window’ https://drwilda.wordpress.com/2011/12/18/jonathan-cohns-the-two-year-window/

What, if anything, do education tests mean? https://drwilda.wordpress.com/2011/11/27/what-if-anything-do-education-tests-mean/

Dr. Wilda says this about that ©