The Prevailing Delusion About Online College Degrees

Education is one of the most important factors in determining a child’s future. A delusion is defined by Webster’s Dictionary as a false belief regarding the self, persons, or objects that persists despite the facts. One of the most prevalent and hard-hitting delusions of the late 20th and early 21st Centuries is the extremely fallacious belief, held by millions of rank-and-file human beings around the world, and especially in the USA, that Internet educational pursuits produce as much academic learning for a person as traditional classroom instruction does. As has been true for decades, there are currently many recalcitrant adolescent public school students who greatly dislike the free, structured schooling that they are required to attend in classrooms for twelve years in order to attain basic academic skills and a high school diploma. These misguided young men and women account for approximately 67 percent of all public school students; in most cases they merely occupy classroom seats, their minds absently elsewhere, during their elementary, middle school, and high school years, and end up barely attaining the minimum grades necessary for high school graduation. The sad fact is that, for the American public schools to retain some delusive credibility in properly educating the bulk of America’s youth, around 70 percent of that 67 percent have their grades pragmatically padded with hugely disproportionate academic curves in order to make it seem that most American youth leaving high school at eighteen years of age are basically educated and ready either to enter the workforce or to attend college. Yet these basically uneducated, barely literate men and women leave public high school and, within three or more years, end up enlisting in the military, attending junior college or trade school, apprenticing for a trade, continuing to live at home off their parents, or becoming mendicants on the streets. Every year thousands of these young people, fifteen to eighteen years of age, run away from home and end up spending five to ten years on the streets, many of them turning to crime, before they realize the time and the precious free resources that they have wasted through contrariness and indolence.

Since around 1995, a great many of these millions of poorly educated young adults, eighteen to thirty years of age, have sought to bypass the need for hard work and have embraced the grand delusion that they can accomplish with a personal computer, alone at home and for thousands of dollars, what they refused to accomplish during the twelve years of free public education they were offered as teenagers. What do I mean by this? Seventy years ago, most graduates of public high schools actually graduated on a real eleventh-to-twelfth-grade level and were prepared either to enter a college or university and perform real college-level work or to enter a salable trade. As proper child-rearing in American homes (parents helping and encouraging their children to succeed in the public schools) became, over the decades after 1950, more of a burden than a privilege and responsibility for husbands and wives who were more goal seekers than they were fathers and mothers, the children of these very egoistic men and women were essentially left alone in the home to struggle academically by themselves during their formative and adolescent years. As a result, what used to be real high school diplomas conferred upon most eighteen-year-old graduates of public schools became no better than certificates showing merely twelve years of attendance, while junior college degrees (A.A.s and A.S.s) became certifications of remediation for high school deficiency. This process of remediation merely indicates that the students spent two years at community and junior colleges compensating for the academic attainment they lacked during their high school years. Hence, as logically follows, the traditional baccalaureate degrees now conferred upon senior college graduates who transfer from community and junior colleges are hardly equivalent to any degree conferred upon university graduates during the 1950s, ’60s, and ’70s.

Now we arrive at the crux of the issue at hand: the attainment of B.A., B.S., M.A., M.S., and even Ph.D. degrees by these under-educated students from colleges and universities offering complete online curriculum programs leading to the conferral of those degrees. What happens when under-educated men and women, who graduated from high school on probably a ninth-to-tenth-grade level, attempt to do real university-level academic work five to ten years after they leave the public schools? Remember that a high percentage of these individuals have spent time in the U.S. military taking enlisted courses taught on an eighth-grade level, and are told by these universities that, if they enroll in particular online degree programs and pay the required tuition, they will be given college credit for military courses and for “life experience” (whatever that means) counting toward the 120 hours of college credit necessary for a baccalaureate degree. Moreover, a great many of these under-educated adults, 25 to 35 years of age, begin their so-called college educations online without any previous remedial study at a junior college.

So, have you yet figured out the dismal result of the grand delusion? These millions of under-educated students, who have eagerly embraced the computer age, are actually made to believe that they can use the Internet, at home alone, to study the books and course materials provided by online universities and colleges, without the presence of an instructor or professor, in order to learn the equivalent of what is taught during four years of classroom instruction at traditional brick-and-mortar universities. In the 1950s, ’60s, and ’70s this was called correspondence, or distance, learning, and it was not approved as equivalent to college classroom instruction by the regional accrediting commissions. Presently, 98 percent of all online college degree programs offered by accredited universities and colleges are not interactive; that is, they do not provide video-teleconferencing for designated weekly lesson periods in which the individual students are connected together, so that every student enrolled in a course can see his or her classmates, and the instructor or professor, on a computer screen during the lesson period and interact with them during the class. While the tuition for a three-unit undergraduate classroom course in American history at the University of Maryland is around $500, the cost of an interactive online course is about $700, and, invariably, the Socratic method cannot be effectively utilized by the instructor during this very expensive electronic interaction.

Most online undergraduate and graduate courses, however, are not interactive to any degree, and the only means for a student to communicate with an instructor or classmates during the semester or quarter is email, which most rational people regard as an extremely impersonal and disadvantageous means of communication. Let’s say an under-educated undergraduate student lives in South Carolina and is enrolled in an online undergraduate degree program at the University of Maryland. The student has all of the course textbooks and study materials for a semester mailed to his or her residence and is allowed to complete the prescribed lesson assignments whenever convenient. There are no verbal lectures unless the instructor records them and allows the students to access them, along with the other course materials, using “Blackboard” software; if that is the case, the tuition for the course is substantially increased. Now, believe it or not, the instructor may actually live in another distant state, such as Minnesota, and a student may be unable to reach the instructor by email for extended periods of time. Hence, the under-educated undergraduate student is essentially left alone for most of the semester or quarter to study the course materials and to take un-proctored, open-book, multiple-choice tests for grades, with the student’s academic honesty never even questioned.

During the 19th and 20th Centuries, this type of learning was called the Lincoln effect, named for the way Abraham Lincoln supposedly taught himself to be a lawyer, and it was regarded then by most colleges and universities as a poor way for the average student to learn. Lincoln learned on his own by reading and studying what he needed to know in order to succeed in his legal endeavors, and his success was attributable to the fact that he was an extremely intelligent and intuitive person, capable of learning on his own, which the great majority of public high school graduates are unable to do. Even today, a college or university will not give a person credit for learning independently and actually mastering college-level course material before enrolling in the university and paying for the course. Then, even after a very smart person pays the costly tuition for the course and the professor allows the aspiring individual to take the course’s comprehensive final examination, the examination is, in most cases, not the regular final examination taken by classroom students, but one that has been made far more difficult for the express purpose of ensuring that the very smart person does not earn a passing grade. Does this sound unfair and sorely inequitable? Yes it does, because it is! The current academic system is staggeringly unfair to both the very intelligent and the very under-educated. The startling reality is that nearly all of the colleges and universities in the USA are much more concerned with advanced learning as a profitable money-making business than with what it should be: the sacred responsibility of helping intuitive and intellectually capable men and women, who are prepared for college-level work, to attain the learning and research skills that they need to succeed in opening new frontiers in the natural and physical sciences, mathematics, the humanities, and literature. The sad fact is that baccalaureate and graduate degrees are being awarded every year to under-educated men and women who have completed undergraduate and graduate online degree programs that are in no way equivalent to the degrees attained through classroom work under the close supervision of professors and instructors.

The grave and degrading effect of this particular grand delusion, which I have endeavored to explicate in this essay, is simply that the men and women who have attained these online pseudo-degrees actually believe that they are as educated, intuitive, and intelligent as other men and women who have attended traditional colleges and universities to attain their undergraduate and graduate degrees. It is like comparing an online University of Phoenix baccalaureate degree in economics to a B.A. degree in economics obtained through continuous classroom study at the University of Texas at Austin, or at any other traditionally accredited brick-and-mortar institution of higher learning. The two degrees are basically incomparable. Yet the majority of American people of the 21st Century, 25 to 40 years of age, who have been conditioned to believe that obtaining college degrees quickly through superficial and watered-down online study is entirely equal to the painstaking process of obtaining a four-year baccalaureate degree through continual classroom attendance, have contributed greatly, by their participation, to the educational diminution of the American republic and to its relegation toward the status of a third-world nation. America now ranks 38th in the world in educational achievement. Can you imagine that, when, from 1945 until around 1970, the USA ranked first among all nations in population literacy, educational superiority, and scientific achievement?

As to the origin and advancement of this grand delusion, the reader is owed an explanation. How could this progressive and aberrant mind-set about the fundamentals of advanced learning have become so destructively prevalent in the latter part of the 20th Century by sheer accident, or how could it have been widely accepted by the people as a standard model of educational endeavor through the visible efforts of one great man or woman? These two accepted explanations for the cause of historical events, accident and “the great man,” hardly explain the subtle, publicly unnoticed events that occurred from the late 19th Century through the mid-to-late 20th Century and that, working collectively, caused deliberate, systematic change in the way Americans are educated. Nor do the “accident” and “great man” explanations account for the sad and miserable events that have plagued human beings from the outset of recorded history. The third explanation accepted by contemporary historians for sad history, conspiracy, is the most reasonable and plausible explanation for the occurrence of subtle incremental events that have combined over the decades to produce an effect such as the grand delusion about the proper methodology for American learning. When a thorough investigation of the facts reveals the motives of men and women conspiring over an extended period of time to cause a major shift in the presiding philosophy underlying the essential rudiments of public education, those facts can either be closely examined by the existing traditional and electronic media and accepted by the American public, or capriciously discounted by that same media and hidden from the public. Why would an objective and independent media hide such scurrilous facts from the public? A free and independent media would not do such a blasphemous thing, but a media bought and paid for by the powerful and wealthy men and women who have conspired to bring about such a shift in philosophy would do such a thing quite capably.

As Thomas Jefferson famously remarked, he would rather have newspapers without a government than a government without newspapers. What he meant was that he would rather have newspapers willing to publish the facts and the truth in the absence of government than a government unwilling to allow newspapers to publish the truth about what that government is doing against the interest of the governed. The American Constitutional Framers worked together to produce a state that would serve the people, not a state to be served by a people indoctrinated by government to be subservient. The latter status, a state to be served by the people, was predicated upon a political philosophy called Hegelian “statism.” A free-thinking people, such as the original American population that ratified the U.S. Constitution, are very concerned about individual liberty. As Henry Ward Beecher succinctly stated, “Liberty is the soul’s right to breathe.” This goes along quite well with what Thomas Jefferson wrote in 1800: “I have sworn upon the altar of God eternal hostility against every form of tyranny over the mind of man.” These immortal words, among the many others he wrote, today grace the Jefferson Memorial in Washington, D.C. The Constitutional Framers, several of whom had also signed the Declaration of Independence, realized that as a man or woman thinks, so he or she is, and that perception of reality is the means whereby the American people choose who, and what, they are. This is why the Framers wrote the Preamble of the U.S. Constitution to express its explicit purpose: “to secure the blessings of liberty to ourselves and our posterity.” The Preamble didn’t say that the purpose of the U.S. Constitution was to “establish justice, provide for the common defense, and promote the general welfare.” No, those particular things were a means for implementing the ultimate purpose, which was, and is, to secure the blessings of liberty. Some might argue that the constitution of the Soviet Union established a form of justice, provided for a common defense of the Soviet people, and promoted a form of general welfare for them. But there was no liberty for the Soviet people to determine their own destinies through their independent pursuits of happiness. No, a communist dictatorship does not secure the blessings of personal liberty to a governed people, but, rather, just the opposite, which is control over the minds and bodies of the people. It is certainly strange that most federal and State politicians today don’t consider the Preamble to the U.S. Constitution an essential part of the Constitution; but it really is.

“Statism,” the socialist-fascist philosophy that the people of a nation-state are to be conditioned to serve the state, began in the new USA as a pragmatic sociopolitical ideology embraced by wealthy ideologues in several of the New England States in the latter part of the 19th Century. I know that that is a long way to look backward in American history to collect the relevant and pertinent facts about what really happened, but those facts were duly recorded by historians, journalists, and ordinary Americans in the form of journals, diaries, books written by writers who actually witnessed those facts being established, and newspaper articles documenting them. The five “Ws” and one “H” of historical research are the questions and inquiries that lead to a cogent explication of the issues: Who, What, Where, When, and Why, and, of course, How, constitute the basis for historical research and for answering how, and why, sad events occurred. There have been wealthy, powerful, aristocratic people in the USA who, from the outset of the republic, did not at all like the idea of a common rabble of human beings, the rank-and-file American People, being allowed to choose democratically, by the vote, who would represent them in a bicameral Congress and legislate laws that would diminish the power and wealth of those aristocrats. In effect, these ideological oligarchies, shadow governments within the State and federal governments, were composed of super-wealthy people who feared freedom and liberty as political means of making them less powerful and less wealthy. Hence came the collective, surreptitious efforts of these shadowy oligarchies to systematically control the minds of the population in order to secure their wealth and power. These wealthy, powerful, and pragmatic people, though actually very few in number, knew quite well that the proper education and intuitiveness of that common rabble, the great majority of the U.S. population, would cause that great cross-section of Americans to insightfully seek the passage of laws that would enhance the ability of the common People, through industry and entrepreneurship, to compete with, and eventually overshadow, the controlling aristocratic power-brokers, as did such self-educated and brightly intuitive individuals as Cyrus McCormick, Eli Whitney, Elias Howe, Thomas Edison, and Philo T. Farnsworth, the poor Idaho farm boy who invented television.

To cut to the chase, the decades that have elapsed since the 19th Century have brought to pass subtle and extremely detrimental increments of change to public education in the American republic. For example, the ability to read and understand the published written word was regarded by the honored Founders as the keystone to public awareness and understanding of current events in State and federal government, in order to assure an intelligent and informed electorate. The basic methodology for teaching America’s youth to read began as phonics, which was considered by such Founders as Benjamin Franklin, Thomas Jefferson, and John Adams to be the proper methodology for teaching children, and illiterate adults, how to read. That was the way they had learned to read; Franklin, Jefferson, and Adams had used traditional phonics to teach their own children to read effectively, and the methodology was used effectively in the first public schools established in America before and after the American Revolution. The first public schools established in the new United States of America were locally controlled and had nothing at all to do with the federal, or State, governments. The parents of the children hired the teachers who taught reading skills; these original one-room schools served children of all ages, and phonics, learning to identify words by their vowel and consonant sounds, was used to teach reading.

Yet another methodology for reading was created around the year 1813 by a man named Thomas H. Gallaudet. Gallaudet created the “see-say” method of teaching deaf-mutes how to read, since deaf students could not hear word-letter sounds and learn through the normal use of phonics. Then, in 1835, Horace Mann, a college-educated intellectual who had himself learned to read phonetically, was instrumental in getting the “see-say” reading primer, “Mother’s Primer,” established for use by all primary schools in the State of Massachusetts; but by 1843 the very normal and reasonable parents of Massachusetts had rejected the “see-say” method, and phonics was restored as the standard method for teaching primary-age children in the State of Massachusetts how to read. Yet Thomas Gallaudet, his children, and his grandchildren were all graduates of Yale University, as was Horace Mann, and they were also members of a secret order that existed then at Yale and still exists and flourishes in the 21st Century. This was, and is, the Secret Order of the Skull and Bones. In fact, Horace Mann was a co-founder of Skull and Bones, and it is much more than a passing thought to ask why Mann, who had learned to read using phonics, would have pushed and shoved to get the “see-say” reading methodology, originally designed for deaf-mutes, accepted as a reading methodology for normal primary-age children. Furthermore, the false propaganda disseminated from around 1853 to 1900 about the supposedly successful use of the “see-say” methodology resulted in the adoption of “see-say” by the influential Columbia Teachers College and the Lincoln School, which propelled the thrust of the speciously new, John Dewey-inspired system of education that was geared away from the fundamentals of learning and toward preparing primary-age children to be subservient units in an organic society instead of intelligent and intuitive individuals who could read comprehensively and effectively. “See-say” was ideal for the proponents of Dewey. Since learning to read effectively was the primary key to unlocking a child’s ability to read to learn, the Dewey system deliberately eviscerated the one essential step in the learning process that would ultimately culminate in producing an informed electorate. “See-say” also appeared to be an easy way to learn to read, despite the recognized fact that learning to read well requires personal discipline and hard work.

Hence, I sincerely believe that the rational and reasonable American reading this essay will be able to cogently extrapolate the inexorable and egregious results of adopting a reading methodology, “see-say,” created for deaf-mutes, for systematic use by all of the public school districts in all of the States by 1920 in order to teach normal elementary-school-age children how to read. It wasn’t adopted by accident or as a result of a grand gesture by a wise man or woman, but, rather, by conspiratorial means over a long period of time. The leading educational “authorities” from 1900 to 1920, acclaimed by newspapers, magazines, and radio as “progressives” in the likeness of Theodore Roosevelt and Woodrow Wilson, constituted a select group of, mostly, men who had been educated at Yale University and were members of Skull and Bones. For ultimately conspiratorial reasons, the fundamental wisdom of the Founders regarding the adoption and preservation of phonics was devalued during this time, and most of the national electorate, that is, over 70 percent, were made to believe that what these supposedly learned 20th Century men were spouting about educational standards for children was based upon truth. Therefore, what is extant today, a nation of dumbed-down adults, is the sad result of a conspiracy that has worked its evil in increments over 150 years to the present day. “See-say” is still the predominant methodology for teaching reading in the federally approved Common Core system of public education. Though there are many private and parochial schools that have continued to teach phonics in the 20th and 21st Centuries, the graduates of these schools make up a very small portion, less than 10 percent, of all the schoolchildren in the USA. Most of America’s children, more than 90 percent, are, and will continue to be, products of the public schools.

In conclusion, the reasonable person can clearly see the progression of ineffective educational standards in the current process of educating most of America’s children. The elementary schools don’t teach the basic reading, writing, and arithmetic needed to properly prepare children for their middle-school learning experience, and the dumbed-down children who enter middle school from elementary school aren’t properly prepared for the last three or four years of high school. Consequently, middle school is really a remediation of elementary school, and high school is, in most cases, a remediation of what should have been learned in middle school. Therefore, 98 percent of the 17-to-18-year-old adolescents who receive high school diplomas aren’t really receiving graduation certificates for the proper completion of twelve years of education but, rather, for merely attending twelve years of classes, and they graduate on much less than a twelfth-grade level. Most seniors in public high schools are actually working on a ninth-to-tenth-grade level when they walk across the stage to graduate. So here we are, back at the beginning of the time frame in which men and women 25 to 35 years of age, graduates of the public schools, begin to think that online college and university degrees are “really” equivalent to degrees earned by classroom attendance at brick-and-mortar universities, and that what they couldn’t achieve in a classroom with their level of academic preparation can be achieved outside a classroom, at home, before a computer screen. This is, and will remain, the mass grand delusion that is the nemesis of American educational superiority.

Article Source: http://EzineArticles.com/9681951