#Statistics #math #rstats Please read this if you are a California professor in a STEM field. 1/
The California Dept. of Education (CDE) is considering a radical redesign of K-12 math education, in terms of both curricula and methods of instruction. 2/
The new system is intended to benefit underrepresented minority (URM) kids, but many of us believe it would harm those children. It would also harm non-URMs who stay, and we believe many would leave for private schools, tutoring academies, etc. 3/
In terms of curriculum, the proposal would eliminate 8th grade algebra courses. HS calculus courses would be discouraged. 4/
It would also add a data science course for 11th grade, which sounds like a good idea but which IMO would be a disaster, as very few HS teachers would be qualified to offer it. 5/
My biggest concern, though, is the proposal's methods of instruction. Teachers would de-emphasize getting the right answer and would not require students to show their work. Emphasis would be placed on collaborative group work rather than individual effort. 6/
In order to motivate URMs, math problems would be "applied" in an ideological sense, say "Mr. Salazar just lost his job, on which he had been paid $X..." I am all for motivation, but I believe there would be less emphasis on the math itself, harming the kids. 7/
Ever since high school, a passion for me has been helping improve life for URMs tinyurl.com/y7ouh7o6. WE SHOULD NOT BE EXPERIMENTING WITH THESE KIDS' LIVES. There are better ways, proven successes: Hire super teachers, at top dollar; work closely with the parents; etc. 8/
Even China, one of the most ideological countries in the world, does not insert ideology into math classes. That should tell us something. 9/
I urge CA STEM professors, regardless of where you are on the political spectrum, to sign a letter that will be delivered to Gov. Newsom: tinyurl.com/zdv3ww. Only an organized effort can have an impact. 10/10
Also, the CDE proposal, and its already-implemented form in SF, are viewed by some as fueled by anti-Asian sentiment (alluded to in one of the replies to my thread here). I explain this in part of tinyurl.com/3j77zcb2
#rstats for YOU: Ever encountered an R error message like "Couldn't create memory segment of size 3.2G"? More subtly, ever had code run slowly for no apparent reason? And ever wonder why data.table is so blindingly fast? This post will be on MEMORY. 1/n
So here goes Memory 101A. Memory (meaning RAM) is broken down into "words," typically 8 bytes long. One R numeric quantity will occupy one word. So, e.g. 1 gig of memory will hold a numeric vector of length only 125 million. 2/n
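You can sanity-check the 8-bytes-per-number figure yourself with object.size(). (A quick sketch; the reported total will be slightly more than 8 * length, since R attaches a small fixed-size header to every vector.)

> x <- numeric(125e6)  # 125 million doubles
> object.size(x)       # roughly 1e9 bytes, i.e. about 1 gig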
Each word has an "address," an ID number. If your installation of R includes the tracemem() function, you can use it to determine where in memory your R object is, e.g.
> x <- 1:500000
> tracemem(x)
[1] "<0x7fb1eab6e010>"
3/n
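tracemem() is also handy for spotting hidden copies, a common cause of mysteriously slow R code: once an object is traced, R prints a message each time that object is duplicated. A sketch (the address strings will differ on your machine):

> x <- runif(500000)
> tracemem(x)
> y <- x       # no copy yet; y shares x's memory
> y[1] <- 0    # copy-on-modify kicks in; R prints a tracemem[... -> ...] line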