2018 NAEP Technology and Engineering Literacy Results
Good afternoon, and thank you all for being here in North Carolina today. It wasn't planned, it wasn't my intention; they'd already picked the site before I became chair. This is a world-class Museum of Natural Sciences, right here in the capital of North Carolina, and I think it's a great location for our discussion today, given the museum's focus on advancing science in an innovative world. For those who don't know me, and all of you out there in technology land watching this live stream, I'm Bev Perdue. I have the honor of chairing the National Assessment Governing Board, and I also have the honor of having served this great state as the first woman governor of North Carolina. This has been my home for almost all my life, so I have seen it really change and grow with all of the technology now around us. As you likely know, the Governing Board works with the National Center for Education Statistics to produce NAEP, the National Assessment of Educational Progress. In political circles we call it the Nation's Report Card, and you really look forward to those scores when they're released, providing data that informs decisions for every state in the country about how to improve the educational systems. We're here today to release and to discuss the results of the 2018 National Assessment of Educational Progress in technology and engineering, which is a relatively new part of the assessment process. It's really important, and a lot of you know why, but let me reiterate it: we're at a crucial, crucial time in the transformation and rebranding of our educational system across America. We have a duty to ensure that our nation's educational assessments reflect the knowledge and the skills that are necessary to pave the road for our kids' success, our students' success, as citizens and workers in the 21st century. Technology
and engineering skills are absolutely essential to those students as they continue their education. So it's exciting to be here in the capital for this important event, not because we all love Raleigh, but because this area has become one of the most important places in the world for technology and engineering companies and startups. It has a vibrant startup scene. Some of our staff found themselves downtown last night, I think they might have had dinner there, and I asked them if there were a lot of people out, and they were blown away at the number of folks on the streets in Raleigh. So it is really a 21st-century town: a robust community of companies, educators, incubators, advisers, and talent. Our K-12 schools, we believe, are leading the way in teaching kids when they're young how to enjoy as well as produce in these skill sets, and some of those kids and schools will be featured in today's program. I'm really pleased that Mark Johnson, who is State Superintendent of Public Instruction, slipped away from a state board meeting to come over here, and I'm equally pleased that my friend Alan Duncan is here too; he's the vice chair of the State Board, and for both to come is really important, so thank you.
This afternoon the plan is simple. After you hear the data, we're going to provide opportunities for questions from our live audience and from those of you who are out there in ether land. During the Q&A period, if you're in this room, just raise your hand and there will be a roving microphone going around. If you're at home or in the office, or wherever you might be watching via computer, click the button on the page and submit your question; we'll have someone here to ask it for you. And remember, this is Q and A: the Q stands for question, and without the questions we are dead cold in the water, so help us by participating. We'll continue the conversation after we finish, around four o'clock, with a reception on the fourth floor, to which you're all invited. Also, for those in the room, will you please silence your cellphones if you haven't done that already. Don't hesitate to use them, though, with the hashtag #TELrelease, so you can tweet about what you're doing today. To get us started, let's watch a video. The video takes us to McClintock Middle School in Charlotte; we're going to hear from teachers and students about what being literate in technology and engineering really means. After the video, Lynn Woodworth, who is the Commissioner of the National Center for Education Statistics, will introduce another North Carolinian, Peggy Carr, the Associate Commissioner of the Assessment Division at the National Center for Education Statistics. They make up the test; it doesn't do any good to be nice to her because of the test, just be nice to her because she's a bright, wonderful woman and a North Carolinian. They will present the 2018 TEL results. Thank you.

There is a new type of literacy coming: it's technology, and being able to function in this technological world. It's
going to be as important as knowing how to write your name. With the way that technology is growing and becoming more relevant in our everyday lives, I feel like the more important it is, the more we're going to need people who can fix it. I can't think of a job nowadays that technology is not a part of, and so in middle school I think it's important for them to have exposure, even if it's just an elective, to different types of technology, so they can kind of get used to using it to do things. Ten80 is basically a class about NASCAR cars and how they perform. It's like we're using RC cars, and I'm working on them, how to make them better, like adding a fin to the back to make it more aerodynamic. Technology literacy for me is knowing what's out there, knowing how to manipulate whatever program it might be, and then also how to troubleshoot. A lot of engineering literacy has to do with understanding how things work: what's the problem, how can I fix it? That could be a human problem, that could be a technology problem, that could be your car, anything. We originally wanted our app to do something different, but we didn't know how to do the code for it. We were like, okay, since we haven't worked on the code yet, is there a way that we could change our app and make it do something different? I thought Ten80 was just a whole bunch of driving, and it's taught me, no, math formulas.
It also taught me how to speak in front of people. It taught me how to market myself. It taught me how to be more efficient and work as a team, because that's one thing that we say here: it's not really what do you want to do for a living, it's what kind of problems do you want to solve. Because if we train kids like that, wherever they go they're going to have that skill set of problem-solving, which is going to be really essential for the future. You hear all kinds of things about where we're behind, in science and math and just different types of literacy, math literacy, problem-solving literacy. So how do we know? Where is the measurement of that? You have to assess that to find out.

Good afternoon, I'm Lynn Woodworth, the Commissioner of the National Center for Education Statistics, known as NCES. It's a privilege to be with you here today to release the results from the NAEP assessment in technology and engineering literacy, known as TEL. As the education statistics center for the U.S. government, NCES is the nation's source for data on education in America. You may not know NCES by name, but you certainly are familiar with our work. In addition to producing large reports on the state of education, like the annual report to Congress, the Condition of Education, a report on indicators of progress in the United States, and the compendium the Digest of Education Statistics, we produce many focused reports, such as reports on high school and college enrollment, dropout and graduation rates, and reports on school crime and safety. And we collect and produce longitudinal data sets that are used by researchers to show how students learn and develop through schooling, from preschool all the way through postsecondary training. We have also developed several online tools, including our very popular College Navigator website.
It's one which helps students and parents look at different characteristics of colleges to find the ones right for them. We also have a website called Build a Graph, which is used by classrooms all across the country to learn how to do mathematics and graphing. NAEP is one of the largest and most important programs at NCES, and it's central to our congressional mandate to measure education progress. NCES is very proud to be responsible for designing and administering the assessments and for reporting the results to the American public. We are particularly proud of how NCES, and NAEP in particular, have pioneered many assessment innovations that have improved our understanding of what our students know and can do. Based on the experience gained from NAEP, NCES has led collaboration around the world, and we've worked with different organizations to develop the Trends in International Mathematics and Science Study, the Program for International Student Assessment, and other international assessments that allow countries to benchmark their progress against each other. And, excuse me, it is particularly appropriate that the new TEL results would be released this year. It was fifty years ago, nearly to the day, that the first NAEP assessment was given, and it was what we called an assessment moonshot. In many ways NAEP was the very first large-scale assessment of what American students know and can do, and the NAEP TEL assessment continues in the NCES tradition of groundbreaking assessment
work. A little later you'll have the opportunity to see some of the eighth graders over there demonstrate some of the tasks from the assessment. But now it is my pleasure to introduce Dr. Peggy Carr, who will be presenting the full results from the assessment. Peggy is the Associate Commissioner for Assessment at NCES, a role she has held for nearly two decades. She's responsible for NCES's portfolio of national and international large-scale assessments, including NAEP, the Program for International Student Assessment, the Trends in International Mathematics and Science Study, the Progress in International Reading Literacy Study, and the Program for the International Assessment of Adult Competencies. She is also, notably, as mentioned before, a native North Carolinian, so we are welcoming her home. Thank you very much, and now Dr. Peggy Carr.

Well, thank you, Commissioner, that was really a great introduction. I'm actually very excited to be here in Raleigh today, at this Museum of Natural Sciences, to release the results of the National Assessment of Educational Progress in technology and engineering literacy. We also call it TEL. When I was driving down yesterday, I thought to myself, hmm, there is perhaps no place better than North Carolina to release these results, because as you all know, we really are First in Flight. As the Commissioner pointed out, though, I am a North Carolinian, so maybe I'm just a little biased. But I would imagine that we've all had similar moments when we've reflected upon the various roles of technology and how it has affected our lives. Flight. Electricity. Even something like antibiotics. The smartphones that we all carry around every day in our pockets, which are now millions of times more powerful than the computers NASA used to put humanity on the moon, in the same year, I might note, as the first NAEP assessment. Technology drives society. Even
as I walked to the office, the cars, the stoplights, even the cell phones, yes, those cell phones in our hands every day and in our pockets, illustrate how pervasive technology has become. It shapes how we communicate, travel, learn, meet needs, fulfill desires, provide services, and yes, build economies. As society grows, technology becomes the root of critical challenges that we must solve. How do we rebuild after natural disasters? We need architects and engineers and city planners working and communicating together. What's the best way to provide clean water to citizens, and yes, cute little kids like this? How can I share this moment with my family? We all do this all the time. When we see science, social studies, reading, and mathematics before us, we see them as foundational skills for students that help us thrive personally and professionally. But we have just seen and heard why TEL skills are just as important, understanding how critical these skills are to succeeding in the 21st century. The National Assessment Governing Board initiated the TEL enterprise, and NCES designed, built, and administered an assessment to measure students' technology and engineering literacy. Technology is any modification of the natural world done to fulfill human needs and wants, and it can include anything from hammering a nail to, yes, those cell phones, talking and taking selfies. Engineering is a systematic and often iterative approach to designing objects, processes, and systems to meet human needs, and can range from designing and programming robots to, actually, conducting surgery. And literacy is the application of all of this knowledge and these skills to the everyday world. These three concepts collectively combine to define what we call TEL. Above all, TEL is a literacy assessment, and while it is not a measure of proficiency in a single subject like math or reading, the
concepts and skills assessed in TEL are covered across subjects. It is also not a test of specialized technology or engineering skills. This assessment comes from practical everyday life, and students must use their technological skills and communication skills to troubleshoot problems, understand how systems interact, and think critically about, as well as explain, the consequences of their decisions and their actions. As I have mentioned, TEL is cross-curricular. Students learn about technology as they interact with it inside and outside of the classroom. So let's briefly discuss the framework on which the TEL assessment was built. The framework identifies the understanding and application of technology principles that are important to all students, and it establishes three content areas and three practices as the underpinning of TEL. So let's focus first on the content areas. Technology and society deals with the effects that technology has on society and the environment, as well as the ethical questions that are raised by these effects. Congress has asked many times how we can solve problems, and this Chicago task that you see here is a good example of one of those problems. It requires students to develop an online exhibit about the building of a canal to solve Chicago's water pollution problems in the 1800s. Students consider the needs of different stakeholders and must understand both the economic and environmental impact of a technological change. Design and systems, this particular area, focuses on the nature of technology and the processes used to develop technologies. In that famous iguana home task, for example, students evaluate how to fix the habitat of a classroom iguana who seems stressed. By engineering a better habitat, students demonstrate that their skills in troubleshooting and design can make a difference. And
information and communication technology, the third content area, covers software and systems used for accessing, creating, and communicating information, for instance, combining audio and video to make effective presentations. The Andromeda task being released today measures this particular area. Students have to select and correctly credit the use of a copyrighted image of the galaxy on a website. TEL also has three practices, which thread throughout all three of these content areas. They measure ways of thinking and reasoning about approaching a problem. They include understanding technological
principles, developing solutions and achieving goals, and communicating and collaborating. Now let's move on to the results for the 2018 assessment. From January through March 2018, about 15,000 eighth-grade students in about 600 public and private schools across the nation took this assessment. TEL is scored on a 300-point scale, with higher scores indicating greater proficiency. We report average scale scores for the overall assessment and for each of the three content areas and each of the three practices. We also report these scores for selected student groups. Additionally, NAEP reports results for achievement levels developed by the National Assessment Governing Board; they set standards for what students should know and be able to do. The TEL assessment was administered on laptops in both 2014 and 2018. The assessment includes multimedia scenario-based tasks, and sometimes we'll slip and say SBTs, and interactive stand-alone questions. And finally, we survey students and school administrators about learning opportunities in technology and engineering. We also survey students about their activities inside and outside of school. Now let's take a look at some of the 2018 highlights from TEL. Our key finding: on average, students scored higher in TEL than they did in 2014. First, let me explain what an asterisk means: it indicates a statistically significant difference from 2018. Here you see the average overall TEL score for eighth graders was 152 in 2018, compared to 150 in 2014. The 152 is just below the NAEP Proficient cut point of 158, which is where a student demonstrates solid mastery of challenging subject matter, in this case TEL subject matter. In 2014 the average score for each of the three content areas was 150. Consistent with the overall TEL increase, eighth-grade
students scored higher in all three TEL content areas in 2018. For technology and society, which is shown here in dark blue, the average score increased by two points, and scores for design and systems, in orange, and for information and communication technology, in green, each increased by three points. Our students scored higher in all three TEL practices as well. Average scores for the practices increased by two points for understanding technological principles and for developing solutions and achieving goals, and by three points for communicating and collaborating. Now let's dig a little further into the results for our student groups. Several student groups scored higher in TEL overall. The first column of this table shows the 2014 score for each student group, while the second column shows the 2018 score. The score changes between the two years are in the last column. The blue circle indicates significant score increases, while the grey diamond indicates no significant difference in average scores. We see score increases for white, black, and Asian students, but no other racial or ethnic groups show changes since 2014; despite the differences in scores that you may see, those are not statistically significant. Note that both white students and Asian students scored 160 in 2014, but in 2018 Asian students became the highest-performing racial or ethnic group and surpassed their white counterparts by quite a bit. Now moving on to gender: the average scores for boys did not change, whereas the scores for girls increased from 151 to 155. As in 2014, girls outperformed their male counterparts. One of the big takeaways in the data is this gender difference, and we're going to revisit that in just a moment. Next, let's look at the differences by student
disability status. About 12% of students in the nation are categorized as students with disabilities. In 2018, scores for these students did not change, but average scores for those students not identified as students with disabilities increased by 3 points. Next we see TEL overall scores by English language learner status, which
includes about 6% of students in the nation. The average scores for English language learners did not change, but scores for students not identified as English language learners increased by 3 points. We will look next at two proxies for socioeconomic status. First we see results for students eligible for the National School Lunch Program, or NSLP. About 46% of students in this country were eligible for NSLP in 2018, and their average score increased by 3 points. Scores for students who were not eligible showed no change. Next we see TEL scores by parents' highest level of education, at various levels here, and we see that the average score for students whose parents did not finish high school increased by six points, and the average score for students whose parents graduated from college increased by three points. There were no significant score changes by either school location or region of the country. And so, to conclude this section: although we see no differences over time for these important variables, we did find differences among several of the groups that I've described thus far. Another key finding from this assessment is that our middle- and higher-performing students made gains in TEL. Let me explain to you what I mean by that. NAEP reports student performance across the distribution using percentiles, with the 90th and the 75th indicating higher performers and the 25th and the 10th indicating lower performers. You will recall that the average score overall went up. This increase clearly was driven by changes at the top of the distribution, by students in the higher and middle performing percentiles. Students who scored at the 10th and the 25th percentiles did not improve over time. This pattern was the same for all three TEL content areas; shown here, you see the middle and higher performers making gains. This
pattern is repeated here for the practices. Improving scores for higher performers translated into larger percentages of students at or above the NAEP Proficient level and at the Advanced level. There was a significant increase of three percentage points for students at the NAEP Proficient level and one percentage point at Advanced, but no change for students at NAEP Basic. I promised I would return to the gender story, and I'll do that now. Earlier I noted how female students made gains while boys did not, and we will now focus on the gaps between boys and girls. At the left side of this chart you'll find the 2014 results, and on the right side the results for 2018. The gaps between males and females are shown in the middle column. In 2014 we found gender gaps overall, meaning for the entire assessment, and in one of the three content areas, the information and communication technology area. These gaps carried over into 2018, as shown here, but in addition, note that there was an additional gender gap in technology and society. The large gaps that you see here for information and communication technology are pervasive. When you examine the distribution from the top to the bottom, from the 10th through the 90th percentiles, girls outperformed boys. The TEL practices showed a similar pattern to the content areas. In 2014 there were gender gaps in two of the three practices; in 2018
there were gender gaps in all three. Note that the communicating and collaborating practice had large gender gaps, as shown here, and again, the pattern of girls outperforming boys could be found across the entire distribution, from the 10th through the 90th percentile. In addition to test questions, students and school administrators were surveyed about learning experiences in and out of school. Compared to 2014, more students report taking at least one TEL-related course: 52 percent then versus 57 percent in 2018. Students who reported taking one or more TEL-related courses scored higher than those who did not. So let's see what kind of courses these students are taking. This chart shows the percentages of students who took various types of technology and engineering classes. Here we see increases in engineering and science, computer science I should say, and in other technology-related classes, like electronics. Let's turn our attention yet again to the gender story. A larger percentage of boys took at least one class related to technology and engineering in 2018, compared to their female counterparts, and male students were more likely to take courses in three of the four categories, as shown here. So, for example, 30% of male students and 21% of female students took courses in engineering. But note, there's something important here: nearly half of the girls reported not taking any TEL-related course in 2018. So let's sum up to see where we are with this girl and boy story. For boys, they did not improve their performance overall. There were some exceptions at the Advanced level, where boys did improve, but for the most part they did not improve over time. When it comes to course taking, more boys are reporting taking at least one TEL-related course in 2018, but they are still falling behind girls in their performance. However,
the data suggest that there is a possible path forward for our boys. They need to significantly improve their communication and collaboration knowledge and skills. What about the girls? The results show improvement across the board. They outperform boys in TEL overall, in all three practices, and in two of the three content areas, and regardless of their course taking, they outperformed boys. However, if you look closer, the data show that there are still missed opportunities for our girls. Almost half of the eighth-grade girls did not take a TEL-related course, although these courses are actually tied to higher scores. So the data are rich. The results can be found online at nationsreportcard.gov. You can also explore the scenario-based tasks, remember we call them SBTs, on the site, and they are fascinating. Follow NAEP on Facebook at the National Assessment of Educational Progress and on Twitter at @NAEP_NCES. In conclusion, as always, I have to stop and thank the students and the teachers and the schools that participated in the 2018 TEL assessment. And I would be remiss if I did not acknowledge the authors of the report in the audience, from ETS, Matt Berger, just raise your hand, and the NCES staff, Ebony Walton and Grady Wilburn of my staff, who made this event and the report that you see today happen. Thank you.

So now, I think my sign says Q and A, and remember what the governor said. We have one question here.

I'm Laura Bottomley from North Carolina State University, The Engineering Place, for K-12 outreach. Are the results going to be broken down by state, or is that too granular?

Well, we didn't do a state assessment. I can tell you the Governing Board is looking very hard at adding a state assessment for TEL in the future, but right now this is just the national results. We have region of the country, but for the future I would say, stay tuned. Next
question, here in the back. Oh, hi, I'm Sarah Hannah Wald from the One-to-One Institute. What was the gender breakdown when you looked at boys who took classes and girls who took classes? Did that gender gap still hold true, and what happened with boys who did take classes and girls who didn't?

Yeah, good question. So I'll say this, because I think you understand this vernacular: there was a main effect for girls. Regardless of what happened, girls did better than boys, whether they took courses or didn't take courses. Girls improved when they took these types of courses, but so did boys. Generally speaking, girls just came out ahead everywhere. That's the answer to your question, I think.

Good afternoon, Jason Painter, director of The Science House at NC State University. I'm also interested in the gender breakdown. My question: you said that collaboration and communication is what boys needed to improve their test scores, and that came from the data. Can you talk more about why improvement in communication
and collaboration will help improve test scores for boys in TEL?

About a third of the assessment, of how we score the assessment, was in these particular areas of collaborating and communicating, both in the content and the practices, so it's an important part of how we put together a total score. Boys did not do well in any of these areas, whether we're talking about the content or how you apply reasoning and thinking and solving problems, and when we look at the entire distribution for these two areas, both the content and the practices, it is pervasive. As I indicated before, girls, whether you're talking about those at the 10th percentile or the 90th percentile, scored higher than boys in this communicating and collaborating domain. So if we can just get boys to improve in this area, they're going to do better on this assessment. Sorry.

I am Thomas Pheidon, CEO of Faton, and also on the BEST NC board. The question I have is probably not exactly for you two, but in general: would you also, in the future, be analyzing how many of these girls actually go into engineering careers? Which I think we know the answer to, given that companies like Google and Microsoft, probably represented here as well, are desperate to hire women engineers and there's a big shortage. So one is to answer that question, and the second question would then be to Mark Johnson: how could we translate this information into actionable items? And would you also be able to study if there are any biases which prevent them from continuing into those kinds of courses?

Three good questions; let's see if I can remember them all. So, no, we can't answer the question about what happens to these students longitudinally. It's a cross-sectional study, one shot in time, and so we take a snapshot of where students are in 2014
and 2018, and so we're not following them. But I can tell you, the Commissioner is probably thinking that we have data all over the place at NCES that gives us a hint about STEM and course taking, as well as professions, whether students, girls in particular, are going more into these subject areas. So I would say, take a look at our Condition of Education, take a look at our Digest; there's got to be a wealth of information there. And I'm not going to remember the third question; remind me again, I don't have a pen up here.
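The item-bias check discussed in the next answer, differential item functioning, asks whether boys and girls of equal overall ability succeed on an individual item at different rates. As a minimal illustration only, here is a sketch of one common DIF statistic, the Mantel-Haenszel odds ratio; this is not NCES's actual procedure, and the function name and data are invented for the example:

```python
# Minimal Mantel-Haenszel DIF sketch. Examinees are stratified by total
# score (a proxy for ability); within each stratum we compare how often
# the reference and focal groups answer a single item correctly.
from collections import defaultdict

def mantel_haenszel_odds_ratio(responses):
    """responses: iterable of (group, total_score, item_correct), where
    group is "ref" or "focal". Returns the MH common odds ratio; a value
    near 1.0 means the item shows no DIF between the two groups."""
    strata = defaultdict(lambda: {"ref": [0, 0], "focal": [0, 0]})
    for group, score, correct in responses:
        strata[score][group][0 if correct else 1] += 1
    num = den = 0.0
    for cells in strata.values():
        a, b = cells["ref"]      # reference group: correct, incorrect
        c, d = cells["focal"]    # focal group: correct, incorrect
        t = a + b + c + d
        if t:
            num += a * d / t     # weight of "reference did better" pairs
            den += b * c / t     # weight of "focal did better" pairs
    return num / den if den else float("nan")

# Made-up data: within the score-5 stratum, both groups answer the item
# correctly 60% of the time, so the item shows no DIF (ratio 1.0).
data = ([("ref", 5, True)] * 6 + [("ref", 5, False)] * 4
        + [("focal", 5, True)] * 3 + [("focal", 5, False)] * 2)
print(round(mantel_haenszel_odds_ratio(data), 2))
```

Operational DIF screening adds a chi-square significance test and effect-size classification on top of this ratio, but the core idea, comparing groups only within matched ability strata, is what the answer below describes.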
Was there any existing bias among... oh yeah, yeah. So, a good question, and I knew this was going to be a concern, and of course our statisticians and psychometricians did too, so we did our due diligence. We did what we call differential item functioning analysis, which we do for all groups to make sure the items aren't biased. When you hold constant the ability of the students, whether they're boys or girls, or Black or white students, do they perform differently on a particular item? They don't. We really aren't finding those types of concerns, so we're real comfortable that there's no bias in what we're seeing here today. And there are other, more advanced psychometric analyses that are done, but we go through a lot of pains to make sure that there's no bias in these assessment items. But thank you for your question.

Frank McKay with RTI International. I was curious about the significant drop in scores for Native Americans and Alaska Natives. Any insight into possible reasons for that?

Yes, I know exactly what happened there. In 2014 we had a sampling procedure different from what we did in 2018. Actually, the 2018 sample was an improvement in how we sample Native American and Alaska Native students; in fact, we oversampled them where we found them. In doing so, students who are in high-density Native American schools, on reservations and so forth, those types of high-density schools, found their way, statistically and appropriately, into the sample. And so the standard errors are very, very large; it's a different sample altogether. I think we have just time for a couple more questions, but let me answer that: the sample for 2018 is an improvement over the sample for 2014, so when we do that again, let's see what we get.

Perhaps another question? Hi, I'm Lea Bug. I'm the executive
director of the North Carolina Science Fair Foundation, and I was curious if there were questions on the survey regarding out-of-school learning opportunities that the students may have participated in.

Yes, I have a whole sheet, and I'm so happy to share that. So we asked students a lot of interesting questions. For example, we asked them: do you build or test a model to see if it solves problems? Boys said yes 21 percent of the time, and girls said yes 14 percent of the time. We asked them: do you figure out how something works, do you tinker with things, take them apart, put them back together? Boys said yes, I do that, 40 percent of the time, and girls said yes 32 percent of the time. And lastly, again, do you build models and test them and that sort of thing? Boys, 35 percent, and girls, 23 percent. So let me tell you what this means in terms of the findings that we're seeing. When students do these things, they do better on this assessment than if they don't do these things. That's true of boys and girls, but girls aren't doing it as much. That's the missed-opportunity point that I talked about earlier. So, for the young man in the back who asked about how we apply this: this is what you need to think about, these missed opportunities for our girls. Thank you for that question.

I have a two-minute sign, so that means I can answer one more rich question for you. The question back here.

[Name inaudible]: Why do you think there's a gender gap in TEL in favor of girls, when it's the reverse in math and science at grade eight?

Hmm. Yeah, it is, and it's also the reverse in math, so we can add to the context of your question. Well, the TEL assessment is not an achievement assessment. This is the one literacy assessment that we have in NAEP, and it's about the application of knowledge and skills to a real-world setting. That's different from
taking a snapshot of what students have learned in school. They are different constructs, so that's the first thing you need to think about.
The second thing is that what is proficient and what is advanced is decided by a group of experts. So you can't directly compare how many students are proficient in math and science, or how they score in math and science, to exactly how they do in this type of setting, because the experts have a different context for deciding what is proficient or what is advanced.

I think there was one other question: the man with the red tie.

Thank you. Tom Miller, NC State University. First of all, I'm really excited to see so many increases and improvements in the results from 2014 to 2018, but I'm actually wondering if there were any specific programs or initiatives that could be pointed to that might account for that significant increase.

You know, I don't know that, but I do know that I got a call from the governor of, I want to say it was Minnesota, who wanted to use this as a way to evaluate a program they had going on up there. So there's probably a lot of data that we are not privy to in terms of applied use in the states; I just simply don't have it here. We should probably ask the Council of the Great City Schools and others who track these data by state to see what we can find. But it's a good question.

That said, I enjoyed my time here in the museum. It is absolutely beautiful, and I could look over there and see the beautiful fish as I gave my talk today. Thank you for having me.

I have somebody's notes here; let's see. Okay, all right. Thank you, everyone. Tech support... and that happened. So thank you, everyone, for joining us, and of course much thanks to Peggy Carr for an amazing introduction to the experience we had. I will acknowledge that while Peggy is looking at the fish, I am looking at the North Carolina 64-carat emerald, and that has me inspired for this afternoon.
My name is Tanya Matthews. I am the vice chair of the National Assessment Governing Board. I myself am an engineer, scientist, and poet, as well as a passionate supporter of science, technology, engineering, and math for everyone, including the non-engineers in the room. We are very proud of the Technology and Engineering Literacy assessment, one of our newest and most innovative assessments, as you've heard, because of the style and the approach and the type of test it is. But what we also want to know is what teachers and students actually think about this assessment as compared to other assessments and tests that they've seen, that they take, and that they use, and so we are always very conscious of that and asking for that kind of information. So, to find out, we asked teachers and students here in Guilford and Wake Counties if they would work with us, work through sample TEL tasks, and come talk about their experiences with us here today. And we are very, very grateful that they agreed to take the test. Now, you know, when students agree to take an additional assessment after school hours on their own time, we are not talking about your average young people, and so we are very grateful to them, as well as to their teachers, who also put in the extra time to treat themselves to being a student once again.

First, let me give you a little bit of background about the tasks; we'll talk specifically about the tests first. This is a digitally based assessment, so our students are taking the tasks on laptops. Each task presents
a real-world scenario, so we use the phrase "scenario-based testing." Students use an assortment of tools and resources to apply their knowledge and skills in technology and engineering. And what you're going to hear from our students is that this assessment is particularly designed not around memorization; it's not designed around what you know, but around how you use what you know. And part of the scenario-based testing protocol is to give you the information that you need, and then challenge you to use it to solve a problem.

One of our tasks was to help a fictional Smith Museum, which is apropos of our presentation space today, develop an online exhibit about Chicago's water pollution problem in the early 1800s. In a series of steps, students learn about the problem, how it was solved by reversing the flow of the water through canals, and then how this solution had unintended consequences centuries later. The students then figure out how to present this, how to sort through relevant information and relevant technology, and how to think about the interaction of technology and society.

Another task is to fix the habitat of a classroom iguana named Iggy. So Iggy is not happy, clearly not thriving in his cage, and students, who were told about how the iguana survives in the wild, are asked to think through the design of the cage, propose solutions that would make it more comfortable, consider how to modify his environment, and decide how they would know when they were successful.

So, with that introduction, joining us today we have four very generous panelists. We have Mr. Luke Wisecheck. Mr. Wisecheck is a mechatronics teacher at Reedy Creek Magnet Middle School in Wake County. And then we have one of his students with us, Nya, who is an 8th grader at Reedy Creek. Nya, this is when you wave. And then we also have Michael Rene, the technology
teacher at Mendenhall Middle School in Guilford County, and Sydney Bowen, an eighth grader at Mendenhall. Thank you all so much for joining us today. And, as good panelists, they have brought a fan club and supporters: some of their fellow students, Julian, Isaiah, Dylan, and Elijah, have also joined us today in our audience, as well as a couple of the rock stars from the video that we played earlier today. So we want to thank everyone for participating with us.

So I'm going to start off with Sydney. Are you ready, Sydney? So, Sydney, you took the iguana task, and we were talking before and you said you didn't know anything about iguanas. So how could you take a task, a problem, about iguanas?

Well, earlier, at the beginning of the task actually, they showed us a video. It was really short, about how iguanas live in the wild. It said, like, they get sun from, like, the top of their back or whatever; it just gave, like, a little information about iguanas in their wild habitats. So, from that information, there were, like, little notes that they kept for us that were, like, a summary of the entire video, the facts they showed us. You were able to keep that and use those facts to figure out the problem that was happening with the iguana and why it was so unhappy in its cage.

And so what was different about the way this test went, compared to any other assessment you've taken?
Well, instead of having, like, one thing that we'd already known and learned in class, they actually gave us a problem, and they made us, like, make a solution for it.

And did you enjoy that?

Yes.

Thank you. And so, Nya, I'm actually going to give you the same question, because you had the Chicago water task, right, and designing an exhibit for it. And you also didn't know anything about Chicago and what they did, and then you learned about this, and you learned about some of the consequences of the decision. How is this different from other tests you've taken?

It was different because, like, it gave us, like, a little paragraph with a question, so we could read that and answer the question. And it had audio and video, so it was a lot more interactive, which I actually prefer, when it's more interactive instead of me just sitting there reading it by myself.

Now, we mentioned this is a technology and engineering literacy exam, but I noticed that your task had a diary in it, and letters. So how do a diary and letters help you solve anything about the task around reversing the river?

It shows some background knowledge about how the polluted river affected the people and how they got sick from drinking the polluted water.

And did you find that helpful, or different?

I found it helpful because, like, it showed background knowledge and stuff, so, like, you got a better understanding of how polluted and how, basically, dead the water was.

Now, sort of a truth-telling kind of question to both of you. You both noted that the test is more interactive, had audio and video. Does that make a difference? Do you think that if a test is more interactive, you're engaging with the test in a different way?

Yeah, I think it does, because instead of it kind of, um, just being there, it's kind of like whenever I take, like, a standardized test, like the EOG or the benchmarks that I have to take,
it was, like, a little bit more interesting, and I was able to, like, keep up. Because sometimes, like, I'll space out during a test and, like, I'm, like, bored and I'll rush the test, whereas during that test it was actually kind of interesting, so I was also learning new things while doing the test. And so that kept me interactive and active during it.

What about you, Nya?

It kept me interested, like Sydney said. But, like, when I'm taking, like, the EOG, I'll blank out during the question and I'll forget what I read, so I'll have to go back and reread it. But instead of having to go back and reread it, it had the paragraph right above the question, so it was easier to just read it.

So part of it was the interaction that kept you engaged and focused, and part of it was also the way it was organized, where the information you needed was right there for you.

So now I'm going to switch over to both of my educators. You of course also got to take the task, and of course you've been talking to your students about it. Mr. Rene, I'm going to start with you. What are your overall reactions to the TEL as an assessment? How is it different from the things we're normally using in our classes these days?

Sure. I think the biggest thing that I saw that was really interesting, from the student perspective, is that we did very little prep for this, intentionally. And so I had the computers set up, and when they walked around, they couldn't see the screens while I was setting up. And once I got them set up and they sat down and saw what they had to do, I could just see it in their faces, in their eyes: they thought they were going to be taking a multiple-choice test, filling in A, B, C, or D. And so I think for some of them, they were actually a little like, whoa, this is not what I'm used to. So that was very, right
there. And I'll be honest, I kind of knew that was going to happen, but it was great to be validated, to see that they were excited about it. You know, again, like you said, they came after school to take a test, right? But they were genuinely interested, like, oh, this is not an assessment like I'm used to. And that right there is getting us halfway there, where there's some buy-in. Now, clearly, if you did that over and over and over, you might lose that enthusiasm, but that was the thing that stood out to me the most. Additionally, just how performance-based it was. So many of our assessments, even in tech ed, are simple recall, simple standardized tests where they just have to recall a definition and regurgitate it, and obviously, as an educator, I want to try to move away from that as much as I can, as much as I'm allowed to. And so this type of assessment is where I would like to see my students as well.

Thank you. Mr. Wisecheck, what about your reactions and thoughts?

Yeah, I would echo a lot of what he said; I feel a lot of the same sentiment about how it went. A lot of the conversations that we have when we're developing curriculum, and when we're working with business partners and trying to give our students a real-world experience of how things work: I feel like this performance task kind of embodies that, you know, in a way. They have to get information from multiple sources, they have to interpret it, they have to put it together, solve a problem. That's how the real world works. That's how people that work in any business work; that's how they work together with a team. It's not reading a big long paragraph and then spitting out everything that you just read. It's being able to solve a problem with multiple pieces. And I would say our students as well said,
oh, this was a test? They were like, I'd rather take five hours of this than the two hours we spend on the reading EOG; that's the worst, and this is actually interesting. They even said how it looked: just aesthetically, it was easier to look at, a little more pleasant, not just black and white. So, yeah, I would share a lot of the same sentiments.

Well, now, in your early feedback, both of you mentioned critical thinking and using data. And we have employers and workforce development folks in the room with us, and we talk about that a lot, a lot. And I'm picking up, from also talking with you, that you are training and doing some of that in your classrooms, and also in some of the clubs and activities that our students are participating in. But we struggle to figure out how to assess that, right, in a meaningful way that gives teachers feedback. Do you think this kind of scenario-based approach is useful in giving you that kind of information, in terms of thinking through how to evaluate your students' critical thinking skills?

One more time? Good. Okay, now I will try that; we still got a green light here. Yeah, I would say that it does challenge their critical thinking skills. It's, like, kind of what I was saying before: being able to take
information in different ways and put that together to solve a bigger problem. Because I know we were speaking before about how students think differently, and some of them look at the first problem, with the water crisis, about just reversing the flow of the river, and say, yep, it's done, that's it. But there's more to the problem than that: being able to see the big picture, persevere, see how all the different people were affected through their stories, historically, as well as seeing the scientific side. I know when we were speaking about it, the students said, well, yeah, I've used a lot of... we asked specifically, okay, what information from other classes did you use? I mean, they brought up science, they brought up language arts, they brought up some information from social studies that they knew. So it's interesting to see how they pulled all of their knowledge together to solve a task that was supposed to be about technology.

Thank you. That's actually another really good point, in terms of interdisciplinary thinking and learning and helping our students put things together. And so now we're at the point where we'd like to hear some questions from the audience. They can be general, or we can direct them towards our students; I had pre-warned them that some of these questions will begin with "I'd like to ask the young ladies," as well as our teachers. And of course, those of you who are watching us online or following us on Twitter can also send some questions in that direction as well. So now we've got a microphone that is roaming. We have a question over there, and in the back behind you. Yes, please introduce yourself with your question.

Sure. Juliet Mens from the University of North Carolina Greensboro. We heard in the last presentation that collaboration was very important. How was collaboration assessed in this, and how did the tasks involve collaboration?

So I'm going to start by actually asking
Nya to talk a little bit about her task, because in your task, part of what you had to do was think about three different student groups and how they were working together.

Like, with collaboration, since there were three different teams, you had to take what each team found and put it together to, like, figure out the problem.

And also for you, Sydney: as you were doing the problem-solving, you were given a scenario where different students were bringing different solutions to the table, and you had to choose between them. So how did you manage to do that?

So, from the information they gave us about the wild iguanas, I chose what was best. So I think one was, like, a stronger heat lamp, or a plastic cage, whatever. And so, from the information they gave us in the video, I implemented that knowledge and chose the stronger heat lamp, and that, like, worked.

And so part of what they're describing as happening in the assessments is that in both of these scenarios, the students are given disagreeing, sometimes mutually exclusive, sometimes collaborative, suggestions and thoughts, and they're asked to pick, and to mitigate and balance those solutions, and given the opportunity to test. So that's one of the ways that we are thinking through the collaboration skills, even though the tasks are individually taken in that space. That's a great question. Other questions? We have one up here in the front.

Thank you. Hi, LaTonya Potts, teacher advisor to Governor Cooper. I have a question for our teachers. First of all, thank you for what you're doing. Second of all: having been exposed to this assessment, did this change, or have you rethink, how you work in your classroom from this point forward? It sounds like you clearly have these experiences already, but did this do anything to make you rethink how
you could alter what you do, additionally, to make sure that your students would be successful in this type of assessment?
Yeah, absolutely. Like I said earlier, I definitely think this is the type of assessment, and these are the performance-based processes, that I've wanted and been trying to implement for years in my classroom, honestly. And so the more we get this from outside organizations as well, like NAEP, I think it kind of validates it. One of the things I thought about during the exam is, you know, how do you scale this? That was something that was definitely in the back of my mind: how do you scale it to, you know, hundreds of students? And, to the lady's question about collaboration, you know, I often thought, too, about what would be a way that we could, beyond the virtual collaboration like we talked about, get in-person collaboration doing the same type of assessment. But yes, to answer your question: absolutely. It's affirmed what I would have always known, quite frankly, but it's made me want to do it even more.

Yeah, and I would say a lot of the same things, because it was one of my reasons for going into the field of education. Something that I struggled with for a long time was not seeing value in what I did in the classroom for a long period of time. So part of what we do when we're writing curriculum and changing what we do is ask: how is this valuable? How is this going to give students real-world skills and experience, where they feel like they're getting something out of this, where they're treated more like adults? I mean, yeah, they might be 11, 12, 13 years old, but some of their ideas are better than those of the people that get paid to do this. So empowering them in that, really getting them to think critically: something like this task definitely embodies what we are trying to do. And I know we had a conversation about that, and I feel like this is definitely a step in the right direction of how to assess students, how to think critically, how to get
that collaboration piece. I know they aren't allowed to share answers and things like that on standardized tests, but I think we're going in that direction, where we can assess some real-world skills that we all use every day.

We have time for one more question. Yes.

Hi, I'm Carrie Brown Parker from NC State University, and I had a question for the students. I'm curious: it feels like this test assesses a lot of your ability to have
good written communication, and I'm wondering how you felt about that. Was it assessing or testing how you write as much as how you think? Or was there something to sort of help you with that? How much of it really felt like it was about your written communication skills?

Really good question.

Okay. Basically, like, after we finished the test, it actually had, like, a little place underneath where it told us, like, the actual answer to the question that we took, and, like, how they would like us to write it: that format, like, what would have been, like, partially answered and given partial credit, what would get full credit, and what would get no credit at all. It was actually looking at the written questions, like, we had to take. And so you did have to have some writing skills, like, to actually state the question and your answer. Like, I don't know, like, a good example of this, but, like, just to say the question, like, the right way. Because, like, it did say, like, partial credit would be, like, just saying, like, okay, blah blah blah, while full credit might be, like, "because blah blah," and then, like, yeah, whatever. But I think most students that might take this would have a good grasp on that, because we actually learn this in, like, ELA, or English, in class. And so I think any student could do well on this written assessment. I don't know if the punctuation was actually credited in it or not, because the punctuation might be something you might need to work on.

Like Sydney was saying, you, like, use your English skills, and, like, with your punctuation: like, if the question is "What happened in 1975?", you can say, "In 1975, blah blah blah." Like, use that format, which is, like, the written format that our English teachers want us to use anytime we answer a question in their class. And that was, like, the full credit for the test assignment.
I'm going to applaud the students on their power of recall; they took this test a month and a half ago. In terms of thinking through this: part of picking the answers and redesigning the cage is also a portion where they actually have to explain it. They actually have to write out the explanation, and both of the students reminded me what ELA stood for, because they kept referencing it in the way that they had to put things together with the test. And so I want to thank our teachers and our students for sharing their time with us, and for their bravery in getting up here with us. And so, you know, as we think about this exam and the implications of it: technology and engineering literacy is about how we use technology and engineering, and evaluate and understand it. It's not tech for tech's sake; it's tech for humanity's sake. And clearly, the young people and the teachers that we're looking at today are how that kind of society starts. And I think thinking through how to resource our teachers to assess in this way, and to have a national assessment that is just as good at measuring content as it is at measuring how our students and young people are using that content, is critical as we think about our future, and theirs as well. So thank you for sitting
through this segment, and thank you to our students.

What a fantastic panel. I'd like to take another minute to give a big round of applause to the students and teachers who gave up an afternoon after school to complete the TEL tasks. After all, without students and teachers, our purpose would be meaningless. I'm Linda Rosen, the business representative on the National Assessment Governing Board. Yes, I have a business background, largely focused on STEM learning, but I started my career as a high school math teacher, and you really can never take the teacher out of the classroom. It gives my heart really wonderful feelings to be here for today's incredible news. We are discussing results that make us proud of ourselves, and particularly of the young people and the teachers who gave us those results, in the TEL changes between 2014 and 2018. And as your closing panel, a wonderful closing panel, we've assembled a wonderful group of people from a variety of backgrounds who will share their reactions to the 2018 TEL results: what they mean, and what we can do next to continue to produce a nation of problem solvers.

Here with me today is Dr. Cory Bennett, the director of robotics outreach at North Carolina A&T State University; Katrinka McCallum, vice president of products and technologies operations at Red Hat, a multinational software company headquartered right here in Raleigh; and Javaid Siddiqi, the president and CEO of the Hunt Institute, a K-12 education policy organization founded by North Carolina Governor Jim Hunt. Also in your materials, you'll see the bio for Sandra Cauffman, the deputy director of Earth science at NASA. She's on the agenda, but, as living proof of the fact that females rock in technology and engineering, she could not be with us today; at the last moment she was called to Cape Canaveral to oversee a rocket launch. So
she sends her apologies, but she hopes you understand. There is more information about each panelist in your packet. I have the privilege of asking some questions first, but I promise you will have time to ask your own questions, those of you here in front of me and those of you watching online. But most of all, thank you for taking the time to be with us today.

So let me start. My first question is for Mr. Siddiqi. As an expert in K-12 education, what are your thoughts on how, when, and where students can engage with the kind of technology and engineering related content that TEL covers?

Linda, thank you for the question. Yeah, I want to first thank NAGB for inviting the Hunt Institute to participate, and on behalf of Governor Hunt, I know he would be remiss if I didn't thank Governor Perdue for her leadership in bringing this conference to Raleigh. Also, another one of my bosses, Sam Houston: I think about Sam's work at the Science, Math, and Technology Education Center, the work he's doing with schools; I think that's one place that districts and states can provide leadership. I'm sorry that Mark Johnson had to leave; I know we sat for a long time. I know Mark ran on this issue, and the one-to-one initiative that he ran on provided a lot of leadership at the state level, and a lot of states and a lot of districts are implementing that. I think that's one thing states have to do more of, and I think it's their responsibility to provide this leadership. In our work at the Institute, and our work with policymakers across the country, we're seeing districts and states that are really doing this work well, and it's because there's some type of dynamic leadership or bold vision. And I think that's been exemplified here in Raleigh and North Carolina, obviously, from the great leadership of the governor's office, dating
Back to Governor hunt I've obviously had to say in governor Perdue's continued, leadership and, now, with Mark Johnson really running, on something like this one. Piece of data that one of my colleagues James. Pulled out in preparation, for this conversation was. 63%. Of eighth graders are building. Building. Things outside of schools, while. Along the 13 percent of eighth graders are building things inside, of schools and. I think that's really if you think about that that's a really compelling piece of data because, it suggests that schools, can. Be a vehicle for equity, because. Who are the kids when you think about a data we just unpacked earlier today from the, release. Of the report who, are the kids that are building, who, the 63%. They're. Not the kids probably coming from low-income families, coming. From students, of families, of color so. I think we're