
How to Unplug Your Kids Despite Schools Pushing Tech with Common Core

Before my son started kindergarten in a public school in Boulder, Colo., in August, his teacher asked me to bring him in for an assessment. I expected this to be similar to what my daughter experienced when she started kindergarten three years ago — he’d meet his teacher, see his classroom, and then his teacher would ask him a few questions. She’d ask if anybody read to him at home, and see if he knew how to turn the pages of a book and hold it right side up.

On the day we were assigned to come to school, when the teachers separated the kids from the parents, handed us a stack of paperwork to fill out, and ushered the kids into a separate room, I thought that’s what was going on. But about 20 minutes later, the teacher came out and asked me, “Has Theo ever used a computer before?” She explained that the kids were in the computer lab, completing an assessment on the machines.

I told her Theo hadn’t used a mouse much, except for a few times at the library. In fact, he hadn’t used any kind of computer much, except for limited sessions playing games on the iPad. He’d certainly never sat for a full hour at a computer, as he would during this assessment. “You should know,” I said, “Theo is left-handed.”

When I said this, two other mothers looked up from the papers they were filling out, alarmed, and said that their kids were left-handed too.

“Oh, well that explains it!” the teacher said, and went back into the computer lab to switch the kids’ mice to the left-hand side.

The American Academy of Pediatrics’ most recent guidelines for media use among children note that lots of kids are spending seven hours or more a day looking at a screen. They advise the creation of screen-free zones at home, media curfews at meals and before bedtime, and no screen time for kids under 2 years old (an admonition I dutifully followed — it was a happy day when I finally turned on “Sesame Street” for my kids). Kids over age 2 should have no more than one to two hours of screen time a day.

Meanwhile, public schools are facing pressure to prepare kids for nationwide tests of the Common Core standards, which begin in the 2014-2015 school year. Most tests for fourth graders and up will be computer-based and require facility with a computer mouse, and the literacy tests will include essays that the kids must type directly into the computer.

Schools are understandably nervous, as in some districts these test scores will determine whether a teacher gets a raise or keeps a job, or whether a school stays open. So in my kids’ school and others, they’re ramping up computer lab time and encouraging kids to use literacy and math programs at home, despite the fact that some families lack home Internet access. According to a 2013 U.S. Census report, 25.6 percent of Americans don’t have home Internet access.

What is a mom to do when the test-focused technology policies of her child’s school conflict with health guidelines and with her own instincts and beliefs about the right way to raise her children?

The Wrong-Handed-Mouse Report

A few weeks after Theo’s wrong-handed-mouse kindergarten assessment, I met with his teacher for our first parent-teacher conference. Thankfully, she’d done old-fashioned one-on-one evaluations of the kids because this was the first year for the computer assessments and she wasn’t sure how accurate they would be. Still, she presented me with a 17-page printout of the results by a company called i-Ready. The printout includes four pages of advertisements for “Recommended Products from Curriculum Associates” — computer applications parents can buy to drill their kids on these tests, called Ready Common Core Reading Instruction. (The teacher apologized for the ads and suggested I ignore them.)

The wrong-handed-mouse assessment included all kinds of software-generated tips about how to improve skills my son knows perfectly well how to do — when interacting with a person rather than a screen. One of the i-Ready instructions: “Model how to blend onset and rime [sic] of spoken one-syllable words and finish by saying the whole word.”

If a human had assessed my child rather than a machine, I would have gleefully pointed out that they’d misspelled the word rhyme several times in this report. (I can’t resist being a pedant when it’s so much fun.) Or maybe they actually wanted to encourage me to engage my kindergartner in a discussion of hoarfrost? Or perhaps their archaic spelling is in homage to Coleridge? If the creators of this software can’t spell the word rhyme, how can I trust that their academic assessments are valid?

Maybe I should have begun to drill my son on these skills, hoarfrost identification aside, but instead I let the wrong-handed-mouse report drift under other papers, and continued to read to him before bed as I always have, rather than prop him before a computer to practice literacy and mouse-handling skills.

Briggs Gamblin, the director of Communications for Boulder Valley School District (BVSD), noted that this i-Ready assessment for kindergartners is recommended by the Colorado Department of Education. “For some students,” Gamblin wrote in an email, “the computer-based assessment was a new experience. Staff recognize this could impact individual scores.”

A Little of This, A Little of That

Most of what my son does at school seems like joyful, hands-on kindergarten as usual, but at home we are encouraged to have the kids log in to a program called Raz-Kids, through which they listen to books and record themselves reading the stories. Every once in a while we do this for a few days in a row, then the recording mechanism on the program malfunctions, and we forget about it for months.

Raz-Kids homepage.

Meanwhile, in third grade, as the Common Core standards test looms, my daughter has been drilling on typing and test-taking skills. The kids spend time in the computer lab or on computers in the library every day, working on programs such as one called Typing Pal that’s supposed to teach them to touch type fluently (as far as I can tell, painstaking hunting and pecking is still the norm, even after years with this program) and an online reading program called Reading Plus. Teachers can monitor how often students are logging in at home and how many lessons they’re completing.

I think the reading selections in Reading Plus are generally engaging and high-quality. I also think the Common Core test the kids will be taking, called PARCC, is fair and will do a good job of measuring critical thinking skills, judging from the samples I’ve reviewed. But what I don’t feel comfortable with is all the multiple-choice-question drilling the kids are doing to prepare for it.

Briggs Gamblin said, “BVSD does not have a policy restricting or requiring the amount of time children use computers and other electronic devices. However, BVSD practice is that computers and other electronic devices should be a natural part of instruction and used intentionally to enhance and aid instructional practices.”

My daughter does well in school, but at every parent-teacher conference I’ve attended so far, the teacher gently suggests that she make more use of the online resources the school provides. I do, for a few weeks after the conference. Then I forget about it, or can’t make the time.

Never once have my kids asked if they can please, please sit at the computer and use Reading Plus or Raz-Kids. Instead, they ask me to please, please read them a book or play with them or take them to the park. Besides, with all the screen time they have at school, when they get home they’ve just about exhausted the American Academy of Pediatrics’ recommended media limits for the day.

Looking for the Quick Fix: Job Skills for Kindergartners

I started learning to type in a summer class during middle school, but the lessons didn’t really take until high school, when I took a school typing course in which all of the students sat at typewriters, the instructor up front calling out, “A space. J space,” as we clattered along, pressing the corresponding keys. Typing fast and accurately is a skill I used in college and at every job I’ve held since then. But do we really need to make 6-year-olds focus on future job skills?

Some experts see no developmental problem with kids typing young, comparing it to piano playing, while others say kids’ hands are too small to span the keys, or worry about repetitive stress injuries. As of yet, there’s no scientific consensus on the right age to begin to learn to type.

Dr. Brian Volck, assistant professor of pediatrics in the Division of Hospital Medicine at the Cincinnati Children’s Hospital Medical Center, said, “There’s insufficient data to make a definite recommendation. Some children may be ready at an earlier age than others, depending on such factors as attention and manual dexterity.”

When I mentioned some of the uses of computers at my kids’ school, including the wrong-handed-mouse test, to Nicholas Carr, technology writer and author of the Pulitzer Prize finalist The Shallows: What the Internet Is Doing to Our Brains and the new The Glass Cage: Automation and Us, about the misguided rise of automation in education and other sectors, he responded in an email, “This is an entirely wrong-headed approach and runs counter to pretty much everything we know about child development. But technology promises a quick fix, and at the moment, in education and elsewhere, that seems to be what we want.”

I’m all for teaching kids about technology, which will be a part of their personal and work lives forever. But shouldn’t they learn how to write software programs rather than how to scan a text and answer multiple-choice questions on a screen? Shouldn’t they learn about how to assemble computer hardware, build an object with a 3-D printer, or shoot and edit digital video footage rather than passively watch as a computer reads them a book? Many studies suggest that when people read on a screen rather than paper, they read less attentively and retain less. So why aren’t schools using computers for what these machines are actually good at instead?

My daughter tells me that when she and her friends finish their required computer lab activities, they explore. One kid figures out a new trick and teaches the others — how to modify the background of their desktop or play music while they work on Typing Pal, for example. My daughter enjoys learning from her friends and teaching them how to program cartoon cat videos in Scratch. This is the kind of learning about computers I feel is more valuable than screen-reading and quizzes — and the kids are teaching themselves.
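For a sense of what that kind of self-directed tinkering looks like outside of Scratch, here is a minimal sketch in Python that uses only the standard turtle module to draw a colorful spiral. It is purely an illustrative example of beginner-friendly creative coding, not anything assigned by the school.

```python
# A tiny, hypothetical example of the kind of playful program a beginner
# might write outside Scratch: Python's built-in turtle module draws a
# colorful spiral on screen -- no extra installs required.
import turtle

pen = turtle.Turtle()
pen.speed(0)  # draw as fast as possible
colors = ["red", "orange", "yellow", "green", "blue", "purple"]

for step in range(60):
    pen.color(colors[step % len(colors)])  # cycle through the colors
    pen.forward(step * 5)                  # each segment a little longer
    pen.left(60)                           # turn to form a spiral

turtle.done()  # keep the window open until it is closed
```

A few lines like these give a kid the same payoff as the cat animations: something they made, on a screen, that they can change and show off.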

When I asked Briggs Gamblin why the Boulder Valley School District doesn’t emphasize programming more, he said, “BVSD staff are in the early study of enhancing computer science instruction and activities in elementary and middle school curricular offerings. BVSD staff recognize there is student interest in programming and developing computer/technology skills as part of 21st Century learning.”

At Home, Parents Get the Final Say on Screen Time

When my kids get home from school, they have about five hours before bedtime. Once they eat, do offline homework, practice sports, piano, or Lego building, eat again, and take a bath, that leaves us with about one precious hour. On most nights I choose to spend that hour reading stories to my kids and talking with them and having them read to me, rather than setting them up on the computer to practice literacy skills.

I don’t blame the teachers for having the kids practice test taking and typing — teachers are under a lot of pressure with the Common Core tests, and they are trying to make sure every kid is comfortable with computers. But at home, I can choose to unplug my kids.

I expect that, for as long as my kids are in school, teachers will continue to urge me to have them spend more time on the computer, and maybe my kids won’t do as well on the computer tests as their offline intelligence suggests they should. But somehow, I think the kids will be all right.

 


8 Technologies That Will Shape Future Classrooms


What does the future of learning hold? What will classrooms of the future be like? Emerging technologies such as cloud computing, augmented reality (AR) and 3D printing are paving the way for the future of education in ways we have yet to see. At the very least, we can extrapolate from these promising technologies and predict how schools will adopt them in time to come.

However, just as the original intentions for new technology often give way to innovative and unpredictable uses, we can never be sure what twists await these rising stars. For now, let us observe their progress and speculate on how these 8 up-and-coming technologies could change education for the better.

Recommended Reading: Major Tech In Education Trends In 2013 [Infographic]

1. Augmented Reality (AR)

We’re still waiting for Augmented Reality to take the world by storm by way of Google Glass, gaming and awesome apps for astronomy.

Google Glass is expected to wow audiences with its AR capabilities, which let users see additional information layered over what they see through the lens. Currently, however, access to AR technology for educational purposes is mostly limited to smartphone apps.

Read Also: 5 Top Augmented Reality Apps For Education

Apps like Sky Map let you scout the night sky for constellations, but they are not yet fully integrated into education because they have not reached the stage of seamlessness: the AR experience must be immersive enough to blend information readily with reality.

With Google Glass and the AR-enabled wearable devices that will soon follow, students will be able to explore the world without having to hold up a device that distracts from the experience. A simpler version of Google Glass, created by Will Powell, an AR developer for Oxford, shows how effortless this can be. Check out this video to enter a world with seamlessly integrated augmented reality.

Read Also: How Augmented Reality Is The Next Big Social Experience

A New Way To Teach

Virtual field trips are also possible with AR. Physics teacher Andrew Vanden Heuvel taught from inside the Large Hadron Collider in Switzerland, streaming what he saw through a beta version of Google Glass to his students thousands of miles away. They could see him, and he could see them; it was as if they were in the same classroom. The “Hangout” feature in use here is particularly promising for team collaboration on projects and assignments.

In other cases, students may be able to see supplementary, interactive information appear on historical artifacts, helping them learn more about the artifacts’ history, just as this AR advertising app can recognize images in the real world and interact with them.

Read Also: A Geek’s Wishlist – 10 Things We Want To Do With Google Glass

2. 3D Printing

What’s a better present for your 10-year-old than a LEGO set? How about a 3D printer, one specifically for children? The 3D printer should really be a must-have in classrooms. Instead of being restricted to what they can play with, pupils in the classroom of the future can print out 3D models for various purposes, including show-and-tell.

Engineering students and teachers are prime examples of those who stand to benefit directly from 3D printing technology. At Benilde-St. Margaret’s School in Minneapolis, a Dimension BST 3D printer lets students create design prototypes.

The 3D printer produces working mini-models for testing engineering design principles, so students can perfect a design before building an actual prototype. Together with CAD (computer-aided design) modeling software, 3D printing lets these students experiment freely with their designs without spending considerable time or money.

Abstract Thought, Real-Life Models

As the cost of 3D printers falls, more teachers will be able to build physical models of complex concepts, in this and any other subject that benefits from visualization. Molecular structures and configurations, for instance, can be hard to grasp in the abstract, but printing physical versions of them helps students give form to abstract ideas and aids understanding.
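To make this concrete, here is a hypothetical sketch of how a short classroom script could produce a printable model file. It writes a single tetrahedron in the plain-text STL format that most slicing programs accept; a real molecular model would, of course, be exported from CAD software as described above.

```python
# Hypothetical sketch: write a tetrahedron as an ASCII STL file, the
# plain-text format most 3D-printer slicers can open. A real classroom
# model would normally be exported from CAD software instead.

# Four corner points of a simple tetrahedron (units are millimeters).
points = [(0, 0, 0), (20, 0, 0), (10, 17, 0), (10, 6, 16)]

# Each face of the solid is a triangle joining three of those points.
faces = [(0, 1, 2), (0, 1, 3), (1, 2, 3), (0, 2, 3)]

with open("tetrahedron.stl", "w") as stl:
    stl.write("solid tetrahedron\n")
    for a, b, c in faces:
        stl.write("  facet normal 0 0 0\n")  # most slicers recompute normals
        stl.write("    outer loop\n")
        for index in (a, b, c):
            x, y, z = points[index]
            stl.write(f"      vertex {x} {y} {z}\n")
        stl.write("    endloop\n")
        stl.write("  endfacet\n")
    stl.write("endsolid tetrahedron\n")
```

Opening the resulting tetrahedron.stl in a slicer turns an abstract list of points into an object a student can actually hold.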

Read Also: 20 Amazing Creations You Can Make With 3D Printing

3. Cloud Computing

“My dog ate my homework” just won’t cut it with teachers in the near future. Cloud computing is buzzing these days and will most likely continue to change many aspects of our society, particularly education. In a bid to modernize education in China, the city of Zhuji in Zhejiang has installed more than 6,000 cloud computing terminal devices in 118 schools.

Read Also: 9 High-Tech Toys & Gadgets Designed For Kids

In the future classroom, students may need only an electronic device to access their homework and every other learning resource in the cloud. That means no more lugging heavy textbooks to school, and constant access to reading materials wherever there is an Internet connection.

Such convenience will give students the freedom to work on their projects or homework anytime, anywhere. The digital library is accessible even when the campus library is not, so you can skip hitching a ride there, or to the bookstore, or even to class (though being sick may no longer be an acceptable excuse, since you could “attend” class from your bedroom).

An Online Learning Opportunity

Cloud computing seeks to virtualize the classroom. Schools can now leverage cloud technology to set up online learning platforms where students log on and attend classes in a virtual environment.

Take, for example, a cloud-based virtual learning environment (VLE), which lets students access learning content and participate in discussions in forums. Assignments and even tests can be easily disseminated to the class, minimizing the need for students to be physically present; to encourage interaction and discussion, however, educators need another channel.
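Before turning to that channel, here is a minimal, hypothetical sketch of what one small piece of such a cloud-based VLE could look like: a tiny web service, written here with the Flask microframework purely for illustration, that hands the current assignment to any logged-in student, on campus or at home. The route, course name and data are invented.

```python
# Hypothetical sketch of one piece of a cloud-based VLE: a web service that
# serves the current assignment for a course to any student's device.
# Built with the Flask microframework purely for illustration.
from flask import Flask, jsonify

app = Flask(__name__)

# In a real system this would live in a shared database in the cloud.
ASSIGNMENTS = {
    "grade-3-reading": {
        "title": "Chapter 4 comprehension questions",
        "due": "Friday",
        "resources": ["https://example.com/reader/chapter-4"],
    }
}

@app.route("/assignments/<course_id>")
def get_assignment(course_id):
    """Return the current assignment for a course, or a 404 if none exists."""
    assignment = ASSIGNMENTS.get(course_id)
    if assignment is None:
        return jsonify({"error": "no such course"}), 404
    return jsonify(assignment)

if __name__ == "__main__":
    app.run()  # reachable from any device with an Internet connection
```

Because the assignment lives on a server rather than in a backpack, the same request works from the school library, the bus or a bedroom.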

4. Online Social Networking

Numerous universities have already registered with the online virtual world Second Life to provide students with an online platform for socializing with each other. As a big part of the cloud platform, such social networks allow students to share their ideas freely while teachers moderate.

This is an empowering notion because it gives learners a new perception: that learning is a personal responsibility, not solely the teacher’s.

For Homework… Discuss

Furthermore, this many-to-many interactive learning, where ideas flow freely, aligns more closely with real-world scenarios, where collaboration is usually the norm. Social networking tools can be incorporated to enhance collaboration and team-building initiatives.

Still, when needed, teachers, lecturers and professors can lend guidance by responding to forum queries or by instantly uploading useful information to the cloud community. It also serves as a great feedback tool for improving the courseware. A social approach to education will seem more than relevant to the students of the future.

5. Flexible Displays

Note-taking on memo pads is still very much alive during lectures, although there may be a shift from paper to laptops, netbooks or tablets. As educational settings become more digitized, how will the future classroom reconcile pen and paper with keyboard and screen?

The answer might just be flexible OLED-based displays. Just like regular paper, these displays will be lightweight, flexible and extremely thin. This means we can roll them up into tubes or fold them like newspapers.

Read Also: Are Flexible Display Smartphones Here To Stay?

Paper-Thin Smartphones

Unlike regular paper, however, these plastic e-papers are not only durable (“unbreakable” is the more accurate term) but also interactive. With swipes, taps and (maybe) pinches, these flexible, paper-thin displays could take over paper-centric industries.

Feast your eyes on this paper-thin, A4-sized digital paper prototype by Sony, which weighs a mere 63 g. Laptops and even smartphones can’t hold a candle to that kind of portability.

6. Biometrics: Eye Tracking

One technology that’s been gaining recognition is biometrics. Conventionally, biometrics is associated with the security industry, since it uses what is unique to each of us to authenticate our identity: fingerprints, facial features, iris patterns, voice. In education, some schools so far use fingerprinting only to prevent truancy and to let students borrow books from the school library.

Read Also: A Look Into: Biometric Technology

Eye tracking, however, could be especially helpful in education, providing invaluable feedback that helps teachers understand how students absorb and understand learning content. Advertising researchers already use eye-tracking technology to see how consumers respond to ads and to determine what captures their attention.

Similarly, the same form of analysis can be conducted to ascertain course effectiveness or individual learning styles. Mirametrix is using its S2 Eye Tracker to assess how students learn by getting details of where they look during online learning sessions.

Cheaper alternatives are turning up, such as the Eye Tribe tracker for Windows and Android, so it’s only a matter of time before this data becomes available to educators.

Read Also: 9 Minority-Report-Inspired Touchless Technology

The data may then be fed into interactive adaptive learning systems that adjust content to suit each student’s learning style. Alternatively, eye movement patterns may guide the delivery of the content itself, flagging concepts a student is struggling with, as evidenced by the longer time spent gazing at a particular section.
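As a rough illustration of what that kind of analysis involves, the short Python sketch below tallies how long a student’s gaze rests on each section of a page. The gaze samples and screen regions are invented; real trackers such as the S2 Eye Tracker or the Eye Tribe device expose their data through their own SDKs.

```python
# Rough illustration of dwell-time analysis on invented gaze data; real eye
# trackers have their own data formats and SDKs.

# Each gaze sample: (seconds since start, x, y) in screen coordinates.
gaze_samples = [
    (0.0, 120, 80), (0.1, 125, 85), (0.2, 130, 90),   # near the diagram
    (0.3, 400, 500), (0.4, 410, 505),                  # down in the exercise
    (0.5, 128, 88), (0.6, 131, 92),                    # back to the diagram
]

# Screen regions for each section of the lesson: (left, top, right, bottom).
regions = {
    "diagram": (0, 0, 300, 200),
    "exercise": (300, 400, 800, 700),
}

SAMPLE_INTERVAL = 0.1  # seconds between gaze samples

def dwell_times(samples, regions):
    """Return total seconds the gaze spent inside each named region."""
    totals = {name: 0.0 for name in regions}
    for _, x, y in samples:
        for name, (left, top, right, bottom) in regions.items():
            if left <= x <= right and top <= y <= bottom:
                totals[name] += SAMPLE_INTERVAL
    return totals

print(dwell_times(gaze_samples, regions))
# A section with an unusually long dwell time may be one the student is
# struggling to understand -- a cue for adaptive content delivery.
```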

7. Multi-Touch LCD Screens

Over the past few decades, we’ve seen schools move from blackboard to whiteboard, then to the overhead projector and on to the video projector hooked up to a computer. If you’re guessing that the next step will be something akin to our smartphones and tablets, you may be right: the next “board” is likely to be a giant multi-touch LCD display that allows a far greater degree of interactivity.

After all, we’re talking about a screen attached to a computer capable of generating infinite combinations of images, sounds and videos, just like our smartphones. The major difference between this new “board” and our smart devices is that it will be able to detect touch input from many students simultaneously.

LCD Touch boards

Instead of the traditional big board at the front of the classroom, it will probably look like the Samsung SUR40 for Microsoft Surface: a giant tablet with its LCD screen lying flat atop a table-like structure. Students will sit around the tabletop tablet, swiping the surface to manipulate and drag images around the screen or typing notes on its onscreen keyboard.

Think of the possibilities if every pupil got one of these desks. Combined with social networking features, these multi-touch surfaces would also let students collaborate live with peers around the world, manipulating virtual objects in real time. The SynergyNet multi-touch project at Durham University is a great existing example of how such technology can be used by schoolchildren.
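Under the hood, the defining trick of such a surface is simple bookkeeping: every fingertip gets its own identifier, and the software tracks all active touches at once. The Python sketch below illustrates that idea with invented class and event names; it is not the API of any particular touch table.

```python
# Hypothetical sketch of the bookkeeping behind a multi-touch surface:
# every finger gets its own id, and all active touches are tracked at once.

class MultiTouchBoard:
    def __init__(self):
        self.active_touches = {}  # touch_id -> (x, y) of the fingertip

    def touch_down(self, touch_id, x, y):
        """A new finger (possibly a different student's) lands on the screen."""
        self.active_touches[touch_id] = (x, y)

    def touch_move(self, touch_id, x, y):
        """An existing finger drags an object across the surface."""
        if touch_id in self.active_touches:
            self.active_touches[touch_id] = (x, y)

    def touch_up(self, touch_id):
        """A finger lifts off the screen."""
        self.active_touches.pop(touch_id, None)

board = MultiTouchBoard()
board.touch_down(1, 100, 200)   # one student drags an image...
board.touch_down(2, 640, 480)   # ...while another uses the onscreen keyboard
board.touch_move(1, 150, 210)
print(len(board.active_touches), "simultaneous touches")  # -> 2
```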

8. Game-Based Learning

Growing up at a time when the world is connected by the Internet, kids today seem to have very short attention spans. This is unsurprising, since their childhood revolves around YouTube, Facebook and smartphones that provide round-the-clock updates on the go and answers to all their queries through Google and Wikipedia.

To cater to such a fast-paced generation, schools will eventually abandon the traditional teaching method of rote learning to align themselves with the times. One great way to do that is to use what has always been considered a major distraction from learning: video games.

Gaming For Grades

KinectEDucation is a one-stop online community for educators and students who want to use the Microsoft Kinect for learning. As its video shows, some of the best suggestions for how educators and students can benefit from motion-sensing technology include helping students learn sign language or the guitar by detecting their hand movements.

In another example, a professor at the University of Washington Bothell teaches mathematics to her class by giving students the first-hand experience of learning through their own motions, captured by Kinect. Along with successful devices like the Wii Remote and PlayStation Move, motion-sensing technology is believed to provide the level of interactivity students need to feel engaged with learning.

Learning To Design Games

Another concept adopted by educators focuses not on gameplay or interactivity but on how learning the game design process can educate students. The idea behind Gamestar Mechanic is to teach students basic game design skills (without the complexity of programming) so they can create their own games, and in doing so develop broad skill sets such as language, systems thinking, problem-solving (through simulation, trial and error, and so on), storytelling and art.

Schoolchildren from fourth to ninth grade learn game design by playing a game themselves: they assume the role of a young aspiring game designer who completes quests and missions to earn various Sprites for their Toolbox, the area in which they design their own games. This is not unlike the role-playing video games on the market today.

This illustrates how educators are moving away from traditional classroom teaching to that of letting students have fun and learn while they play interactive games. It’s inevitable that students in the future who grow up with such technology will require much higher levels of fun and excitement before they see education as appealing and captivating.
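The quest-and-reward structure described above boils down to a simple loop: complete a quest, earn a Sprite for the Toolbox, repeat. The toy Python sketch below illustrates that loop; every name in it is invented and none of it is Gamestar Mechanic’s actual code.

```python
# Toy sketch of the quest-and-reward loop described above: completing quests
# earns Sprites for a Toolbox. All names are invented for illustration.

quests = [
    {"name": "Fix the broken level", "reward_sprite": "bouncing avatar"},
    {"name": "Balance the enemy count", "reward_sprite": "patrolling robot"},
    {"name": "Design a win condition", "reward_sprite": "goal flag"},
]

toolbox = []  # Sprites the young designer has earned so far

for quest in quests:
    print(f"Quest: {quest['name']}")
    # In the real game the student plays through the mission here;
    # in this sketch every quest simply succeeds.
    toolbox.append(quest["reward_sprite"])
    print(f"  Reward earned: {quest['reward_sprite']}")

print("Toolbox now contains:", ", ".join(toolbox))
```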

Education Beyond the Classroom

In the future, education will no longer be restricted to formal institutions like schools and classes. With AR, cloud computing, online social networking and adaptive learning systems that use eye tracking, learning can take place outside the traditional classroom.

Experimentation and mistakes will also be encouraged, since 3D printing and game-based learning make simulations possible without real-world consequences or costs. Above all, students will learn to see education not as a chore but as a critical and gratifying part of their lives that requires their proactive involvement.