Report Cards are in...
acclaro
Prominent MUSA Citizen Joined: Jul 01 2009 Status: Offline Points: 1878 |
As a golfer, the best way I can think of to describe the system/district as Ms. Andrew sees it is "grinding" the round out.
'An appeaser is one who feeds a crocodile, hoping it will eat him last.' - Winston Churchill
VietVet
MUSA Council Joined: May 15 2008 Status: Offline Points: 7008 |
Ms. Andrew, with all due respect, I believe the statements "Rosa Parks Elementary was the only Middletown elementary school not to receive an “F” for meeting the state standard in test scores" and "The school has only met one other indicator in the past — fourth-grade writing in 2004" trump your "Value Added" card. Ms. Andrew, the students "learning more than a year's worth in a year" means nothing if they can't retain the information to pass the tests. THE TESTS DETERMINE HOW EFFECTIVELY THEY RETAINED THE KNOWLEDGE THEY LEARNED. IF THEY CAN'T RETAIN THE KNOWLEDGE TO DEMONSTRATE A PASSING GRADE ON RECALL, WHAT GOOD DID "LEARNING MORE THAN A YEAR'S WORTH IN A YEAR" DO? IT SERVES NO PURPOSE IF IT CAN'T BE RECALLED.
I'm so proud of my hometown and what it has become. Recall 'em all. Let's start over.
Marcia Andrew
MUSA Citizen Joined: Jan 09 2010 Status: Offline Points: 365 |
Vet, your answer to my question surprises me. I had assumed that you deleted from the article the statement that 8 of 9 Middletown city schools received an A for overall Value Added because you are too stubborn to credit any success that doesn't fit your narrative. We all agree that F is a bad grade, but you can't bring yourself to agree that an A is a good grade.
But your answer leads me to think that the reason you left out the good results is that you do not understand what the Value Added score is measuring. You said, "THE TESTS DETERMINE HOW EFFECTIVELY THEY RETAINED THE KNOWLEDGE THEY LEARNED. IF THEY CAN'T RETAIN THE KNOWLEDGE TO DEMONSTRATE A PASSING GRADE ON RECALL, WHAT GOOD DID 'LEARNING MORE THAN A YEAR'S WORTH IN A YEAR' DO?" The Value Added score is derived from the same tests as the score for the number of Indicators Met and the score for the Performance Index. The difference is, Indicators Met and Performance Index only look at a snapshot of how the students scored on one day in April 2013. The Value Added score compares how the students did on one day in April 2013 to how the same students did on one day in April 2012. Put another way, in response to your comment: Indicators Met and Performance Index measure how effectively the students retained the knowledge learned to demonstrate recall on the tests, whereas Value Added asks, did the student retain (and recall for the test) more knowledge this year than last year?
To think about how schools can earn an A on Value Added while getting an F for Indicators Met and a C for Performance Index, let me give a few hypothetical examples.
Example 1: In 2012, Student 1 had the correct answer to 20% of the questions on the 3rd grade reading test. That student was not "proficient" in reading, so the district does not have 75% or more of all students proficient on the 3rd grade reading test and does not meet that Indicator. A 20% score is the lowest of the 5 levels, Limited, earning the fewest points toward the district's Performance Index score. In 2013, Student 1's reading ability improved quite a bit, and he answered 55% of the questions right on the 4th grade reading test. This raised Student 1 to the next level, Basic, but he is still not Proficient, so he does not add to the percentage of students district-wide who scored Proficient or higher on 4th grade reading. So, despite the improvement of imaginary Student 1 (multiplied by other students), the number of tests where the district has 75% or more students pass does not change, and the number of Indicators Met does not change. In this hypothetical, the district's Performance Index would go up a little, because a Basic score earns more points than a Limited score, although both still earn fewer points than Proficient, Accelerated or Advanced.
Example 2: In 2012, Students 2 through 11 answer 60% of the questions right on 3rd grade reading and do not pass; they score Basic, not Proficient. The district does not earn the Indicator Met for 3rd grade reading because fewer than 75% of the students are Proficient or higher. In 2013, Students 2 through 11 answer 70% of the questions right on 4th grade reading, so their scores improve to Proficient. These 10 students are 10% of the 4th graders, so (assuming for the example that every other 4th grader scored on 4th grade reading at the same level they scored on 3rd grade reading the prior year) the percentage of students district-wide who are proficient increases to 72% on 4th grade reading. However, the magic cutoff of 75% is not reached, so the district does not earn that indicator, and its score on Indicators Met stays at F. The district's Performance Index would go up a little bit, because these Proficient scores earn more points than Basic scores.
So, while I have over-simplified the examples, this is what has been happening in the Middletown schools. For at least the last 3 years, Middletown has earned Above Value Added growth. Some students are increasing their individual scores on the state tests but are still below proficient, while others are achieving proficient for the first time, yet the district-wide results for that grade level have not crossed the 75% threshold to earn the Indicator Met. (Or, in some grade levels/subjects, the percentage is hovering right around 75%, and small changes in the number of students proficient each year move the number just below or just above 75% versus the prior year, with the resulting swing from 5 Indicators Met in 2010, to 10 in 2011, to 6 in 2012, to 8 in 2013.)
I would also like to point out that the percentage of students scoring Accelerated or Advanced has steadily increased each of the last 5 years as well. This does not impact the number of Indicators Met, but it does increase the Performance Index score, because Accelerated and Advanced scores earn more points. So, over the last 5 years, Middletown's Performance Index score has improved from 81.7 to 88.5, which reflects these various types of growth.
By the way, the answer to my question about Hamilton City Schools, which you ignored, is that 7 of 10 school buildings received an F for Value Added, meaning their students on average learned less than a year's worth in a year. This backwards progress is reflected in the district's Performance Index score, which went down from 93.7 in 2012 to 91.5 in 2013.
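[Editor's note: the mechanics described above can be sketched in a few lines of code. This is a rough illustration, not the Ohio Department of Education's actual formula; the score-to-level cutoffs and the point weights below are made-up assumptions, and only the 75% proficiency threshold for an Indicator comes from the discussion.]

```python
# Rough sketch of the report-card mechanics described above.
# NOT the Ohio Dept. of Education's actual formula: the level cutoffs and
# point weights below are assumptions for illustration; only the 75%
# proficiency threshold for an "Indicator Met" comes from the discussion.

POINTS = {"Limited": 0.3, "Basic": 0.6, "Proficient": 1.0,
          "Accelerated": 1.1, "Advanced": 1.2}

def level(pct_correct):
    """Map a raw test percentage to a hypothetical achievement level."""
    if pct_correct >= 90: return "Advanced"
    if pct_correct >= 80: return "Accelerated"
    if pct_correct >= 70: return "Proficient"
    if pct_correct >= 50: return "Basic"
    return "Limited"

def indicator_met(scores, threshold=75.0):
    """The indicator is met only if >= threshold% of students are Proficient+."""
    passing = sum(level(s) in ("Proficient", "Accelerated", "Advanced")
                  for s in scores)
    return 100.0 * passing / len(scores) >= threshold

def performance_index(scores):
    """Average point value per student, scaled so all-Proficient = 100."""
    return 100.0 * sum(POINTS[level(s)] for s in scores) / len(scores)

# Example 2 from the post: ten students rise from 60% to 70% correct,
# in a hypothetical cohort of 100 where everyone else stays flat.
last_year = [60] * 10 + [72] * 60 + [40] * 30
this_year = [70] * 10 + [72] * 60 + [40] * 30

print(indicator_met(last_year), indicator_met(this_year))   # False False
print(round(performance_index(last_year), 1))               # 75.0
print(round(performance_index(this_year), 1))               # 79.0
```

Both years miss the 75% cutoff, so Indicators Met shows no change, yet the Performance Index ticks up, and because the same ten students improved year over year, a value-added style measure would register growth.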
Marcia Andrew
To respond to several posters who referenced Franklin's results (at least one person erroneously called them an "improvement") and asked why, given what they say are similar demographics, Middletown can't do what Franklin is doing:
First, Franklin did not improve. Franklin's Performance Index went down from 98.5 in 2012 to 97.9 in 2013, and Franklin scored a C on overall Value Added as a district. I agree that Franklin's B for Performance Index is better than Middletown's C, and that Franklin's B for Indicators Met is better than Middletown's F. Second, Franklin's demographics are not similar to Middletown's. Middletown had 71% economically disadvantaged students, compared to 50% in Franklin; Middletown had 35% non-white students, compared to 5% in Franklin; Middletown had 5% students for whom English is their second language, compared to fewer than 10 such students in the entire Franklin district. These are not excuses; they are just the facts that the schools have to deal with.
Marcia Andrew
Acclaro, I am not a golfer, so I cannot respond directly to your comment about "grinding it out." However, to borrow your metaphor, if your average golf score for a round of 18 holes changed from 100 to 90, wouldn't you say your golf game had improved, even while admitting you weren't ready to go pro?
As a golfer, do you think it would be fair to judge your ability level as a golfer by your score on one particular day, regardless of how you were feeling that day? Don't golfers keep a lifetime handicap based on the average of their scores over dozens, if not hundreds, of rounds? Do you think it would be fair to judge how your ability to play golf has improved or declined year to year by looking only at your score on that one particular day each spring, but making you play a different course each year while giving you only some vague information about which course you would play on the day that counted for your annual score?
Let's say you take lessons from a pro. Do you think it would be fair to judge his ability as a pro by whether and how much your handicap improves? Do you think it would be fair to judge his ability as a pro by whether and how much the handicap of all of his clients improves, even if he can't choose whom to take on as a client, and his client base includes golfers with physical and mental disabilities and others who have no hand/eye coordination and only show up at the golf course under threat of punishment?
My questions may sound ridiculous, but this is how public education is judged in the U.S.
Neil Barille
MUSA Resident Joined: Jul 07 2010 Status: Offline Points: 238 |
Thank you for your explanations, Ms. Andrew. You are an asset to the board. As for the results, I think it should be pretty obvious to most people that the primary problem with the performance numbers of the MCSD has more to do with the student body and their situations than with the staff, the administrators, the teaching methods, etc. Only so much can be done.
VietVet
MS. ANDREW:
"Vet, your answer to my question surprises me. I had assumed that you deleted from the article the statement that 8 of 9 Middletown city schools received an A for overall Value Added because you are too stubborn to give credit for any success because it doesn't fit your narrative. We all agree that F is a bad grade, but you can't bring yourself to agree that an A is a good grade."
YOU ARE 100% CORRECT, MS. ANDREW. I AM VERY STUBBORN. ALWAYS HAVE BEEN. YOU CAN ADD ARGUMENTATIVE TO THAT TOO. THERE YOU ARE. I'M HUMAN AND HAVE FLAWS. MOST DO. NOW, IF WE CAN JUST GET PEOPLE IN AUTHORITY TO ADMIT THE SAME THING, WE COULD ACCOMPLISH SOMETHING. YES, THE "A" IS A GOOD GRADE. USUALLY IS. BUT THE COLUMN THE A'S ARE IN, I.E. VALUE ADDED, ONLY TELLS US WHAT THE KID LEARNED, OR DIDN'T LEARN, DURING A DESIGNATED TIME FRAME. WHAT DOES THAT MEAN AS TO CONTRIBUTION TOWARD THE TESTING ASPECT OF EDUCATION? APPARENTLY NOTHING. LOOK AT THE RESULTS IN THE OTHER TWO COLUMNS. MUST BE LITTLE RETENTION AT TEST TIME OF WHAT WAS LEARNED IN "VALUE ADDED." I.E., THE SUCCESS YOU HAD IN VALUE ADDED DID NOT MAKE THE TRANSITION, NOR HELP THE OVERALL PERFORMANCE.
MS. ANDREW: "Put another way, in response to your comment, Indicators Met and Performance Index measure how effectively the students retained the knowledge learned to demonstrate recall on the tests, whereas Value Added asks, did the student retain (and recall for the test) more knowledge this year than last year?"
OK, "INDICATORS MET" AND "PERFORMANCE INDEX" HAVE VERY POOR RESULTS FOR THIS DISTRICT BY THE DATA WE SEE. "VALUE ADDED" HAS RESPECTABLE SCORES ACCORDING TO THE DATA. SO, THE STUDENTS WERE POOR AT RETAINING THE KNOWLEDGE LEARNED TO DEMONSTRATE RECALL, AND YET DID BETTER THIS YEAR THAN LAST ON RECALLING TEST INFO? WHAT? CONFUSING TO SAY THE LEAST. THROW OUT THE VALUE ADDED AND JUST GIVE US THE TEST DATA RESULTS FROM YEAR TO YEAR. VALUE ADDED IS THERE AS A RESULT OF OVERANALYZING THE DATA. INDICATORS MET AND PERFORMANCE NUMBERS ARE REALLY ALL WE NEED, AREN'T THEY? AREN'T WE GRASPING AT STRAWS HERE, LOOKING FOR ANY RAY OF SUNSHINE WE CAN FIND WITH THIS CURRENT METHOD OF EVALUATION? WE BOTH HAVE NARRATIVES TO THE INFORMATION AT HAND. WE ARE BOTH LOOKING AT THE SAME DATA BUT HAVE ARRIVED AT ENTIRELY DIFFERENT CONCLUSIONS. YOU ARE HOLDING UP THE "VALUE ADDED" COLUMN ON THE CHART AS THE SOLE DETERMINING FACTOR OF SUCCESS, AND I AM LOOKING AT ALL THOSE PESKY LITTLE C'S, D'S AND F'S IN THE OTHER TWO COLUMNS. THAT BEING SAID, WHICH COLUMN IS THE MOST IMPORTANT? THAT IS LEFT TO THE PURVEYOR OF THE DATA.
AND FINALLY: "By the way, the answer to my question about Hamilton City Schools, which you ignored, is that 7 of 10 school buildings received an F for Value Added, meaning their students on average learned less than a year's worth in a year."
"THEIR STUDENTS ON AVERAGE LEARNED LESS THAN A YEAR'S WORTH IN A YEAR" ..... THAT SAYS IT ALL, MS. ANDREW. NONSENSICAL PROGRAM USING SILLY TERMS AND CRITERIA. WHY THIS KIND OF REASONING TO EVALUATE KIDS? WHOEVER CREATED THIS JUMBLED MESS NEEDS TO GO BACK TO SCHOOL THEMSELVES FOR A LARGE DOSE OF COMMON SENSE. AND JUST HOW WOULD ONE MAKE AN ADJUSTMENT IF THEY KNEW THE KIDS WERE LEARNING "LESS THAN A YEAR'S WORTH IN A YEAR"? HOW WOULD YOU BE ABLE TO SELF-EVALUATE IN TIME TO CORRECT?
Marcia Andrew
Vet, I think you are still not understanding that the Value Added score comes from the same standardized tests as the Indicators Met and the Performance Index. There is only one set of tests; the Ohio Department of Education analyzes the data from those tests in many different ways to come up with these different scores/grades. The growth that is recognized as Value Added is reflected in the "testing aspect of education."
It is not my view that Value Added is the only measure that matters. Each of the grades tells us something. I will readily admit to being stubborn and argumentative as well. However, I try to keep an open mind and not jump to conclusions until I have all the information. I did not come up with this state grading system and don't defend it. However, as to why it is helpful to look at progress of the same student from one year to the next, let me throw out another analogy. Say you were going to buy stock. Company A and B both have earnings per share of $40. But Company A's earnings per share have risen steadily from $25 to $40 over the last 3 years, while Company B's earnings per share have dropped over the same time period from $55 to $40. Trend data tells you a different story about those two companies that you won't see if you only look at a snapshot of how they did in the most recent year.
VietVet
Fair enough Ms. Andrew. To avoid a subject that has been beaten to pieces here, and to assume that we will agree to disagree on what is important, I will not offer any comments toward your reply.
I will finish by saying that I find the educational criteria for grading students nowadays to be a conglomeration of confused diatribe that can offer anything the interpreter of the data wants it to offer. Selections of what is important are muddled at best, and the designers didn't even attempt to make it user friendly. It was put together by a group of people (be it political or academic) who don't have a clue how to simplify systems and who apparently feel that useless data in certain areas matters. This grading criteria just flat out overworks the data into fine little bits of numbers and letters.
ohiostorm
MUSA Immigrant Joined: Feb 12 2010 Location: Middletown, Ohi Status: Offline Points: 20 |
I guess the frustration is: hooray, we improved a bit last year in a one-year snapshot. However, we are still failing in most aspects of providing an education for the children of the district. In the interest of beating a dead horse, what about the performance of Central Academy? While I realize that this school has a different approach, it obviously is not meeting the standards of education that have been set. How do we justify the money spent to have this additional school? Can we maybe absorb the students into their neighborhood elementaries and spend the money saved on more education options to bring up the education level in the district as a whole?
Bill
MUSA Citizen Joined: Nov 04 2009 Status: Offline Points: 710 |
ohiostorm, so you want to eliminate one educational option, Central Academy, in order to have more money for educational options. Huh?
I agree with others who say a large part of the reason for our standardized test results is our kids, not the teachers, the teaching methods, or the options. I suppose SOME of those things play a role, and improvements should always be sought, but to pretend that the test data is some sort of indictment of the district's staff is not realistic.
ohiostorm
I am saying that the Central Academy approach needs to be evaluated for its effectiveness. I would like some answers as to why we would continue an experimental approach to education like Central's when it continues to score lower than the standard approaches that Ms. Andrew has highlighted as a huge success.
Marcia Andrew
OhioStorm wrote, "we are still failing in most aspects of providing an education for the children of the district." I cannot agree with that interpretation of the state report card results. I agree that for those children who did not score proficient or higher, many of them are capable of proficiency and I would agree that we have failed those children to some degree. However, some of the students who scored below proficient are children who have severe learning impairments/mental retardation, and it is not realistic to expect that they will ever pass a grade level test, even though they are continuing to learn each year. I don't think we have failed those students. And we certainly haven't failed the vast majority of students, who score proficient or higher on the state tests. Just as an example, 69.5% of third graders passed the math test and 73.6% of third graders passed the reading test. That is not "failing in most aspects."
OhioStorm also wrote, "What about the performance of Central Academy. . . it obviously is not meeting the standards of education that have been set." Central Academy's grade of F for Value Added is definitely a concern, and it is being addressed. It is not consistent with past results; Central has generally been one of the highest-scoring elementaries in Middletown. This year's results varied greatly depending on the grade level. Central's 3rd grade and 7th grade both out-performed the district average by large margins in both reading and math. (83.3% of Central 3rd graders scored proficient or higher on reading, compared to 73.6% district-wide; 91.7% of Central 3rd graders scored proficient or higher on math, compared to 69.5% district-wide; 90.3% of Central 7th graders scored proficient or higher on reading, compared to 67.2% district-wide; 74.2% of Central 7th graders scored proficient or higher on math, compared to 63.8% district-wide. All 4 of those results also exceeded state-wide averages, by the way.) In 4th grade, however, the opposite happened (72.9% of Central 4th graders scored proficient or higher in reading, compared to 82.3% district-wide; just 54.2% of Central 4th graders scored proficient or higher on math, compared to 70.6% district-wide). Central's results for grades 5, 6 and 8 compared to the rest of the district were more mixed. So, there were definitely some issues last year that the district has made moves to fix, but where some of these results exceed 90% passing, the data does not support your conclusion that the school "is not meeting the standards of education." I encourage anyone who is interested in more detailed breakdowns of scores by school or by grade level to go to the Ohio Department of Education web site; there are links on the home page to search for state report card results that include lots of charts and graphs.
As to the suggestion that money could be saved by closing Central, I don't think that we could close the building. The elementary buildings are full as we already reduced the number from 8 to 7 a couple of years ago, when we closed Verity and re-purposed Highview as the 6th Grade Center.
Pacman
Prominent MUSA Citizen Joined: Jun 02 2007 Status: Offline Points: 2612 |
John Stossel - Stupid in America
Copyright ©2024 MiddletownUSA.com