Post by doctorwho on Jun 14, 2010 9:39:05 GMT -6
Because they ranked the schools on ONE factor only: how many AP tests were taken. Forget ACT, SAT, or any other factor. All magnet schools (except select inner-city schools) were excluded, as were private schools.
Could the headline be more misleading?
America's Best High Schools 2010
You want to call them progressive, or most challenging for average and below-average students, etc.? Fine. But this headline is bogus.
And I thought 204 was great at spinning stuff; this is amazing...
www.newsweek.com/feature/2010/americas-best-high-schools.html
Here is the explanation of the criteria:
3. Why do you count only the number of tests given, and not how well the students do on the tests?
In the past, schools have often bragged of their high passing rates on AP or IB as a sign of how well their programs were doing. When I say passing rate, I mean the percentage of students who scored 3, 4, or 5 on the five-point AP test or 4, 5, 6, or 7 on the seven-point IB test. (The Cambridge tests, although similar to AP and IB, are used in very few schools, and rarely appear in school assessments.) Some IB exams are composed of several separate sections, called "papers," but we only count one exam per IB course. Passing AP or IB scores are the rough equivalent of a C or C-plus in a college course and make the student eligible for credit at many colleges.
I decided not to count passing rates in the way schools had done in the past because I found that most American high schools kept those rates artificially high by allowing only top students to take the courses. In other instances, they opened the courses to all but encouraged only the best students to take the tests.
AP, IB, and Cambridge are important because they give average students a chance to experience the trauma of heavy college reading lists and long, analytical college examinations. Studies by U.S. Department of Education senior researcher Clifford Adelman in 1999 and 2005 showed that the best predictors of college graduation were not good high-school grades or test scores but whether or not a student had an intense academic experience in high school. Such experiences were produced by taking higher-level math and English courses and struggling with the demands of college-level courses like AP or IB. Several other studies looked at hundreds of thousands of students in California and Texas and found that if they had passing scores on AP exams, they were more likely to do well academically in college. In the latest Texas study, even low-performing, low-income students who got only a 2 on an AP test did significantly better in college than similar students who did not take AP in high school.
To send a student off to college without having had an AP, IB, or Cambridge course and test is like insisting that a child learn to ride a bike without ever taking off the training wheels. It is dumb, and in my view a form of educational malpractice. But most American high schools still do it. I don't think such schools should be rewarded because they have artificially high AP or IB passing rates achieved by making certain just their best students take the tests.
NEWSWEEK and The Washington Post, however, have added a new statistic developed by the College Board that indicates how well students are doing on the exams at each school while still recognizing the importance of increasing student participation. It is the Equity and Excellence rate, the percentage of ALL graduating seniors, including those who never got near an AP course, who had at least one score of 3 or above on at least one AP test sometime in high school. That is the "E&E" on our list. "Subs. Lunch" on the list stands for the percentage of students who qualify for federally subsidized lunches, the best measure of the percentage of low-income students at each school.
The average Equity and Excellence rate in 2009 was 15.9 percent. In the 2010 NEWSWEEK list, we give the Equity and Excellence percentage for those schools that have the necessary data. We ask IB schools to calculate their IB, or combined AP-IB, Equity and Excellence rate, using a 4 on the seven-point IB test as the equivalent of a 3 on the AP.
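The E&E arithmetic described above can be sketched as follows; the 15.9 percent average is Newsweek's figure, but the function name and the sample school's student counts here are hypothetical, for illustration only:

```python
# Equity and Excellence (E&E) rate, as defined above: the percentage of
# ALL graduating seniors (not just those who took AP courses) who earned
# at least one AP score of 3+ (or IB score of 4+) sometime in high school.
def equity_excellence_rate(seniors_with_passing_score, graduating_seniors):
    """Return the E&E rate as a percentage of the graduating class."""
    return 100.0 * seniors_with_passing_score / graduating_seniors

# Hypothetical school: 48 of 300 graduating seniors had a 3+ on some AP test.
print(equity_excellence_rate(48, 300))  # → 16.0, just above the 2009 average
```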
4. Why do you divide by the number of graduating seniors, and does that mean you only count tests taken by seniors? Don't you know that juniors, and sometimes even sophomores and freshmen, take AP tests?
We divide by May or June graduates as a convenient measure of the relative size of each school. That way a small school like Peak to Peak Charter in Lafayette, Colo., which gave 537 AP tests and graduated only 95 seniors in 2009 for a rating of 5.653 this year, will not be at a disadvantage when compared to a big school like Highland Park in Dallas which gave 2,767 AP tests and graduated 493 seniors for a rating of 5.613. On the 2010 NEWSWEEK list they are right next to each other at numbers 32 and 33, respectively.
We count all tests given at the school, not just those taken by seniors.
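The Challenge Index arithmetic is exactly the division described above; a minimal sketch using the two schools' figures quoted in the text (the function name is illustrative):

```python
# Challenge Index, as described above: total AP/IB/Cambridge tests given
# at the school in a year, divided by the number of May/June graduates,
# rounded to three decimal places as on the published list.
def challenge_index(tests_given, graduating_seniors):
    return round(tests_given / graduating_seniors, 3)

# Figures quoted in the text for 2009:
print(challenge_index(537, 95))    # Peak to Peak Charter → 5.653
print(challenge_index(2767, 493))  # Highland Park, Dallas → 5.613
```

Dividing by graduates rather than summing raw test counts is what lets a 95-senior school and a 493-senior school land side by side at numbers 32 and 33.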
5. How can you call these the best schools or the top schools if you are using just one narrow measure? High school is more than just AP or IB tests.
Indeed it is, and if I could quantify all those other things in a meaningful way, I would give it a try. But teacher quality, extracurricular activities, and other important factors are too subjective for a ranked list. Participation in challenging courses and tests, on the other hand, can be counted, and the results expose a significant failing in most high schools--fewer than 6 percent of the public high schools in the United States qualify for the NEWSWEEK list. I think that this is the most useful quantitative measure of a high school. One of its strengths is the narrowness of the criteria. Everyone can understand the simple arithmetic that produces a school's Challenge Index rating and discuss it intelligently, as opposed to ranked lists like U.S. News & World Report's "America's Best Colleges," which uses too many factors for me to judge the quality of its analysis for myself.
As for the words "top" and "best," they are always based on criteria chosen by the list maker. My list of best film directors may depend on Academy Award nominations. Yours may be based on ticket sales. I have been clear about what I am measuring in these schools. You may not like my criteria, but I have not found anyone who understands how high schools work and does not think AP, IB, or Cambridge test participation is important. I often ask people to tell me what quantitative measure of high schools they think is more important than this one. Such discussions can be interesting and productive.
Some critics say that some of the schools on the NEWSWEEK list have low average test scores and high dropout rates and thus do not belong on any list of best high schools. My response is that these are all schools with lots of low-income students and great teachers who have found ways to get them involved in college-level courses. We have as yet no proven way for educators in low-income schools to improve significantly their average test scores or graduation rates. Until we do, I don't see any point in making them play a game that, no matter how energetic or smart they are, they can't win.
6. Why don't I see on the NEWSWEEK list famous public high schools like Stuyvesant in New York City or Thomas Jefferson in Fairfax County, Va., or the Illinois Mathematics and Science Academy in Aurora, Ill., or Whitney High in Cerritos, Calif.?
We do not include any magnet or charter high school that draws such a high concentration of top students that its average SAT or ACT score significantly exceeds the highest average for any normal-enrollment school in the country. This year that meant such schools had to have an average SAT score below 1,975 or an average ACT score below 29 to be included on the list.
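The exclusion rule above is a simple threshold check; a sketch using the cutoffs quoted in the text (the function name and sample scores are hypothetical):

```python
# Eligibility rule described above for the 2010 list: a magnet or charter
# school is excluded from the main list (and moved to the Public Elites
# list) if its average SAT is 1,975 or higher or its average ACT is 29
# or higher. Cutoffs are the ones quoted; the example scores are made up.
def eligible_for_main_list(avg_sat=None, avg_act=None):
    if avg_sat is not None and avg_sat >= 1975:
        return False
    if avg_act is not None and avg_act >= 29:
        return False
    return True

print(eligible_for_main_list(avg_sat=1850, avg_act=27))  # → True
print(eligible_for_main_list(avg_sat=2100))              # → False
```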
The schools you name are terrific places with some of the highest average test scores in the country, but it would be deceptive for us to put them on this list. The Challenge Index is designed to honor schools that have done the best job in persuading average students to take college-level courses and tests. It does not work with schools that have no, or almost no, average students. The idea is to create a list that measures how good schools are in challenging all students and not just how high their students' test scores are. The high-performing schools we have excluded from the list all have great teachers, but research indicates that high SAT and ACT averages are much more an indication of the affluence of the students' parents than of the quality of the teaching.
Using average SAT or ACT scores is a change from the previous system we used, which excluded schools that admitted more than half of their students based on grades and test scores. That system penalized some inner-city magnet schools that had high Challenge Index ratings but whose average SAT or ACT scores were below those of some normal-enrollment suburban schools, so we switched to a system that we consider fairer and clearer.
We do, however, acknowledge on our Public Elites list the schools that did not make the list because their average SAT or ACT scores were too high. This year there are 21 of them.