
Welcome New Members


As told by 2018 State Teachers of the Year


In this blog, the third in the series featuring conversations with 2018 State Teachers of the Year, we discussed whether teaching life skills had a place in the classroom. Teachers of the Year were asked to respond to this question:


Do you believe preparing students to be “life ready” is part of your teaching practice or responsibility?
100% of the teachers fervently said, "YES!"


Then we posed a second question, challenging them to think about what that looks like for today’s students:

Beyond academic subjects, what other skills will students need to be ready for life? And what needs to shift in K-12 to better prepare students for being life ready?

These infographics capture their thoughts.


Download infographic: Life-Readiness is More Than Academics


Download infographic: Prepare Students for Being Life Ready


What do you think is fundamentally key for our kids to be life ready in today’s world? And how do the school system, and you, the teacher, fit into that? We would love to hear from you in the comments section!

Over the years, many teachers have asked questions like, “Which MAP Growth report should I start with?” or “Which MAP Growth report is the most effective?” or even, “What order should I review reports in?” And like all good questions, the answer to each of them is: it depends.


The best place to start with MAP Growth reports will depend on multiple factors, ranging from the time of the school year to the unique characteristics of the students themselves. From this perspective, the question is less about which report to review first and more about what questions you’re using the data to answer. To help you navigate the different MAP Growth reports, we’ve put together a list of five steps to discovering and using them effectively.


  1. Verify that your account works and your students are rostered accurately. While this may sound like a basic first step, it’s one that’s crucial to get right—before you start looking for the right report, you need to be certain your account has access to the right data. Start by logging in with your credentials and making sure you see all of the students you expect to. (If you need a refresher on how to log in, there’s a good one here.) If anything doesn’t look right to you, or if you need help getting access, contact your local MAP Growth Administrator.

  2. Determine which reports are most useful based on the time of year. Finding the “right” MAP Growth report will have a lot to do with the time of the school year. For example, many teachers use MAP Growth assessments in September to help each student set individual growth goals, and in those cases, the Student Goal Setting Worksheet is a great place to start. Later in the year, after spring assessments, many teachers use the Class Breakdown by Projected Proficiency to see how their students have progressed as readers. For more information about how and when to use different MAP Growth reports, consult How NOT to get Overwhelmed by Data: Teacher Reports to use Throughout the Year, a year-round guide to MAP Growth reports from the 2016 Virginia State Teacher of the Year Natalie DiFusco-Funk.

  3. Get to know the 10 most popular MAP Growth teacher reports. Once you’ve spent some time determining which reports are most appropriate for your goals and the time of year, then consult our list of the Top 10 MAP Growth Reports for teachers. Where step two was about using data to answer your existing questions, this step is about exploring the other ways MAP Growth reports can help inform and augment your teaching in the classroom.

  4. Consult the NWEA Reports Finder to locate additional reports that may be helpful. Once you’re comfortable with the reports you’ve found so far, move on to the MAP Growth Reports Finder to review the complete list of MAP Growth reports—it even indicates whether a report shows data at the classroom level, the school level, or the district level. (Keep in mind that the reports you can see with your MAP Growth login are defined by your role and your students.)

    Bonus pro tip about the MAP Growth Reports Finder: if you copy and paste the table into Microsoft Excel or Google Sheets, you can sort and filter the reports, which can be helpful for seeing similar ones grouped together.
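If you prefer a programmatic route, the same sort-and-filter idea can be sketched in Python with pandas. The report names and levels below are illustrative placeholders, not the actual contents of the Reports Finder table:

```python
import pandas as pd

# Hypothetical subset of a reports table pasted out of the Reports Finder;
# the report names and levels here are placeholders for illustration only.
reports = pd.DataFrame({
    "report": ["Student Goal Setting Worksheet",
               "Class Breakdown by Projected Proficiency",
               "District Summary"],
    "level": ["Student", "Class", "District"],
})

# Sorting on level groups similar reports together...
by_level = reports.sort_values("level")

# ...and filtering narrows the list to classroom-level reports.
class_reports = by_level[by_level["level"] == "Class"]
print(class_reports["report"].tolist())
# → ['Class Breakdown by Projected Proficiency']
```

In Excel or Google Sheets, the equivalent is simply sorting on the level column and applying a filter, exactly as the tip above describes.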

  5. Share your best practices with other teachers. Part of your MAP Growth reports journey will be finding the reports that are most relevant for your students and the specific needs of your school, so it’s important to share your insights with your colleagues to help them get as much out of MAP Growth as you do. Whether you’ve found a particularly relevant report, or you want to start a discussion with other educators about how you might apply specific report data, it’s important to have those conversations—building a data-driven culture requires collaboration!

No matter where you are in the school year, you can use this five-step process as a divining rod to ensure you’re getting the most relevant, practical data from all of the available reports. Try it with your next set of MAP Growth reports—and if you ever want to discuss your data with an NWEA expert, you can schedule a Professional Learning workshop.

Stop me if you’ve heard this one: I’m filling out my son’s next Ages & Stages Questionnaire (ASQ) that’s due when he turns three, and I’m freaking out, convinced that the responses I add today will tragically cripple the rest of his academic life and chances at future success. Naturally, I’m picturing his high school graduation speech, where he’ll go into great detail about all of the ways my parenting choices held him back, and he’ll pontificate on his own achievements, proud that he overcame all the setbacks my decisions created for him.


So I call the pediatrician’s office and ask about how to best help him prepare for the ASQ; after all, I want him to do well on it. And I can sense some hesitation from the nurse as she calmly explains to me that the ASQ isn’t something you prepare for, and that the purpose of a good growth measure is to help kids get the help they need, and not to help them pass, so it needs to be an accurate reflection of their abilities. She goes on to say that we’ll continue to use the ASQ as a growth measure to compare results, and eventually he’ll transition to a growth measure for older kids.


“Ideally,” she goes on, “you could use one measurement for all of his years in school, so you’d always be able to support specific needs over time, but we’re not quite there yet.”


I can’t help but smile, because I’ve been connected to NWEA and MAP Growth in various ways since 2005—and during my time working with partner school districts, I had similar conversations, trying to help others see that you can’t really study for a MAP Growth assessment. I laugh because when she wistfully said, “We’re not quite there yet,” I wanted to correct her—“MAP Growth has an equal-interval scale, and there are assessments from K-12!”—but more to the point, I’m relieved.


Relieved, because up until that moment, I hadn’t really realized how grateful I am for MAP Growth and the educators who use it. Grateful for the people who live in the world of standardized milestones like 3rd-grade reading proficiency, and remain dedicated to not just teaching kids the same things year after year, but helping each of them where they need it. “Whatever happens,” I remind myself, “We’ll make sure to get him connected to MAP Growth, so I can help him with whatever he needs, regardless of grade level. Doesn’t matter if he struggles, or he’s a genius; the approach will be the same.”


I take a deep breath, and start answering the questions again. Is he able to put a shirt on by himself? Are you kidding me? He insists on doing it himself, and gets upset if we try to help him. Is he able to use safety scissors to cut construction paper? No, because we’re not monsters who give toddlers scissors to play with. Can he point to a picture of himself and say his own name? He’s a millennial; of course his selfie game is on point. Can he kick a ball? I wouldn’t know, because in our family, we realize that most sports are silly. Can he string beads? We’ve seen videos from daycare, so we know he can do it—but if either of us asks him to, he pretends like he’s never seen a bead before in his life.


I filled out the bubbles honestly.


That’s a big deal for me, because I’m no longer worried about what will happen as a result of my answers. He’s a kid, so that means he’s going to be ahead in some areas, behind in others, and my wife and I are going to help him grow at the pace he’s going to grow—because the timetable was never up to us in the first place.


That doesn’t mean that we’re not going to sweat the milestones, though. We’ll track all of the benchmarks, from 3rd-grade reading proficiency through the ACT or SAT (or whatever test kids are taking on SnapChat to get into college in the future). Standardized measures and benchmarks have their place, and they can be truly useful—and at the same time, the mindset has changed for both my wife and me. We’ll use the standardized measurements as signposts—but not finish lines.


Our mindset is now just to focus on helping him achieve his goals, whatever they may be. In the meantime, we’re just going to thank our stars that we live in an era where we can do that. And we’re going to thank our stars for the teachers who will use all the data they get—and I know, it’s a lot—to help our son grow.

We've worked with partner school districts from across the country with strong Science, Technology, Engineering, and Math (STEM) initiatives, and some are even using MAP Growth Science to track growth across their programs.


Whether you're using MAP Growth Science or MAP Growth Science for use with Next Generation Science Standards, we want to hear about it!


What student population are you working with, and what have been the most powerful gains they've made?


How have MAP Growth Science reports helped inform your approach to STEM?

Nothing is more fundamental than reading—and when it comes to early learners, helping them learn to read is especially critical to their long-term academic success. Research in the field of early literacy is still evolving, but it’s consistently clear about one thing: in order to thrive and become strong readers, early learners need a support system that includes multiple sources. In other words, no single approach is good enough on its own, and the most effective young readers work toward their goals both at school and at home.


MAP Reading Fluency is designed to be one of the key data points that teachers use to help their early learners thrive; it’s built as a tool to support instructional decisions without eating up classroom time. With an objective, research-driven tool like MAP Reading Fluency, you can lead powerful conversations during parent-teacher conferences as you work together to build each child’s support system. Here’s how.


1. Administer MAP Reading Fluency to your early learners.

Everything starts with using a MAP Reading Fluency assessment; your student results will be one of the key data points you’ll provide to parents, and both the student’s responses and their audio recordings will help inform the choices you make for each individual child.


While giving a computer assessment to your youngest learners may sound daunting, we’ve built MAP Reading Fluency to be as streamlined as possible—so you can focus on maximizing classroom time. With MAP Reading Fluency, you can test the entire class at once, and get objective results with actionable data in about 20 minutes. The entire process is simple enough that you can periodically assess the students to track their progress. We generally recommend having students take MAP Reading Fluency assessments once at the beginning of the school year, once before mid-year break, and once at the end of the school year.


2. Share the audio review page and the Student Progress Report.

There are two main MAP Reading Fluency resources to rely on for parent-teacher conferences: the audio review page and the Student Progress Report. The audio review page has the actual audio recordings of each individual student’s responses, and the Student Progress Report breaks down how the student performed in individual areas like word recognition, listening comprehension, and even sentence reading fluency.


The Student Progress Report is ideal for leading a discussion around trends in the data—the student’s overall strengths and challenges—and the audio recordings can be used as examples to support those trends. For example, if a child is struggling with initial sounds, you can show parents the score alongside actual examples of the student’s responses that led to the data.


3. Present assessment data alongside other data points.

While MAP Reading Fluency results are reliable and useful, they shouldn’t be used alone as the sole source of information about each student’s abilities. Instead, they should function as one star in a constellation of data points about each student.


Be sure to bring information to parent-teacher conferences that includes additional insights about their child—many teachers go so far as to create “conferring notes binders,” which help them catalog data points about each student like classroom observations, grades, or unique strengths and challenges.


4. Have a strengths-based conversation and discuss next steps with parents based on the data.

With all of your data in hand, including both MAP Reading Fluency results and your own insights and observations, it’s important to develop a narrative that identifies what each student knows currently and what they’re ready to learn next.

This is often referred to as a “strengths-based conversation,” because you’re framing the information in a way that both highlights and celebrates the student’s current achievements while also being clear about what challenges they’re ready to face. Ultimately, your work in this step will be to share your analysis of the student’s performance and potential in a way that is supported by data and easily understood.


During this step, it’s important to collaborate and get parental input on the student’s narrative—parents may be able to share their own experiences, which can further inform your perspective.


5. Provide resources to help parents get involved with student goals at home.

The most effective parent-teacher conferences are the ones where parents and teachers develop partnerships to create a consistent support system for the student. As you’re preparing for conferences, consider adding supplemental materials to help them work with their child on the goals they’ve set at school. For example, many teachers include flash cards, reading lists, or specific tablet apps that target each student’s specific needs.


As you’re working together, make sure you’re making connections between what the data reflects and the strategies you’re recommending. For example, if a student is struggling in the area of Blending & Segmenting, you could share a word list with parents that includes specific vocabulary, and help them understand how using the word list can help their child improve in the specific area of Blending & Segmenting.


6. Set a follow-up date to check in on each student’s progress.

Because parent-teacher conferences are periodic, it can sometimes be a challenge to stay in sync as a team with parents. But if you’re planning on coordinating with them to make sure they’re working with their kids toward the same goals they set in the classroom, regular check-ins are essential.


Once you’ve reviewed all of the results together and agreed upon a strategy for helping the student improve, set a date when you’ll speak again and discuss any new developments or needed changes. At a minimum, that should be at the mid-year point of the school year, but it’s also important to give the student enough time to make changes. Many parents opt to check in over email or through a phone call—check-ins don’t have to be formal, so long as you’re discussing what’s working, what can be done differently, and any new developments in the student’s life.



No matter what literacy challenges an early learner faces, supporting them effectively means making sure they have opportunities to grow both in the classroom and at home. By using these steps, you’re creating a support system that’s backed by objective research—and also takes into account insights from parents and the classroom.


Special thanks to Amy Schmidt for her contributions to this blog post.

In this post, we’re celebrating five Hispanic men who have changed history. If you’re looking for a fun way to engage students with Hispanic Heritage Month, consider exploring each of our honorees’ stories together. As a class, discuss how their achievements have changed our lives, and for each historical figure, have students consider questions like:

  • What do you admire most about each individual?
  • What challenges do you think he had to face in his career?
  • What issues or topics was each person most passionate about?


You can even add more names to the list if you want the exercise to last longer. Just don’t forget to come back and tell us how it went! Be sure to share your successes, challenges, and best practices.


Juan Felipe Herrera

Juan Felipe Herrera is an artist best known for his poetry—in fact, in 2015, he was named U.S. Poet Laureate. Herrera is often compared to the Beat Poets, due to his energetic, fluid style and his ability to cross artistic mediums: as a writer, poet, performance artist, and playwright, he’s given voice to the Hispanic community through his words. Most recently, Mr. Herrera has written a series of poems, each in response to a specific shooting or terrorist attack in the United States.


Octaviano Larrazolo

Octaviano Larrazolo became the first Mexican-American United States Senator in 1928. After starting a career as a teacher, he later became a principal, where he fought for civil rights and equality for Spanish-speaking students in education. He followed that passion for his entire career, eventually getting involved with politics: prior to becoming a Senator, he served as the Governor of New Mexico from 1919-1921. Throughout his career, he successfully advocated for Latinx rights on matters ranging from getting the New Mexico state government to recognize the Spanish language in public business to supporting the constitutional amendment for women’s suffrage.


Lin-Manuel Miranda

Lin-Manuel Miranda is a Latino composer and playwright famous for creating the Broadway musical Hamilton and co-writing the songs featured in the Disney movie Moana. A native New Yorker born to Puerto Rican parents, Miranda has delighted audiences with compelling social messages and music that draws from a wide range of influences, including hip-hop, Latin music, and musical theater. Most recently, he has been an active supporter of restoring Puerto Rico in the aftermath of Hurricane Maria.


Mario Molina

Mario Molina is a Mexican chemist who received the Nobel Prize in Chemistry in 1995 for his central role in identifying the ozone hole in the Antarctic, as well as humanity’s role in the threat of global warming. His contribution was so significant in shaping the conversation around climate change that he’s received countless other awards, including the Presidential Medal of Freedom in 2013. He currently works with teams around the world to investigate air quality issues, working to take our understanding of the environment even further.


Gabriel García Márquez

Gabriel García Márquez was an author and journalist, best known for his books Cien Años de Soledad (One Hundred Years of Solitude) and El Amor en los Tiempos del Cólera (Love in the Time of Cholera). His books helped define the genre known as “magical realism,” in which traditional stories are infused with elements of fantasy and magic. His work was so influential that in 1982, he was awarded the Nobel Prize in Literature.


Related blog post: Celebrating Hispanic Heritage Month: 5 Latina Women Who Inspire Us

October is National Literacy Awareness Month, and educators all over the country will be celebrating how critical reading is—especially in the information age. We’re eager to hear how you’ll be observing the month, so please share your experiences!


Literacy development takes many different forms. Sometimes it means guiding older students into a new novel where fascinating characters evolve and plots thicken. For other kids, such as newbie kindergarteners, it means participating in their first “picture walk” activity with the book Zoom by Istvan Banyai. However, not all students engage with text in the same way. For some, engaging is more of a tactile and sound-based experience, while for others it may need to be a more visual and animated experience.


National Literacy Awareness Month is a good opportunity to make sure you’re empowering all your students in their literacy journeys. Just as teachers engage students with various forms of text, many districts will be implementing MAP Growth in various ways to support each of their students. MAP Growth was developed with accessibility in mind. That means various forms of engagement are possible: the assessments are aligned to the rigorous WCAG 2.1 standards, which give students the ability to use their own tools, such as screen readers, refreshable braille, and magnification. NWEA also follows standards and guidelines such as the CCSSO Accessibility Manual to ensure that students with various needs are not only getting the same experience as their counterparts—they’re getting a growth measure that adapts to both their abilities and their needs.


We’re passionate about creating engagement without barriers, and we love how Dr. Sarah McManus puts it: “For our students, equity really means that they can access information just as easily as sighted students. It doesn’t mean they are doing it the exact same way, but it means it’s just as easy…They don’t have to struggle with the technology side of it, they just have to focus on the questions themselves.”


We’re also excited to share that in November, we’ll be hosting the 2018 Accessibility Leadership Summit in Phoenix, Arizona: an all-day, interactive event with educators and industry leaders to discuss our collective future with accessibility, data literacy, and research. We encourage you to register today—and there’s still time to influence the agenda!


And don’t forget to share your success stories, lessons learned, and best ideas from your Literacy Awareness Month activities—we’re excited to see all of the ways you’re working to support your students’ literacy needs!

Person A: “I’m worried about a student of mine.”


Person B: “Yeah? What’s the issue?”


Person A: “He’s smart: like, really smart, but he just isn’t following through. He seems…distracted. He always seems to be alone. I’ve tried to get him into games with the other kids, but it never seems to work.”


Person B: “I’ve got a student with a similar issue. She always has to be right. And, of course, this makes it hard for her to work with the other kids…”


So many different students. So many different backgrounds. So many different learning styles. But the same goal: academic and personal success.


Social and emotional learning (SEL) is a way of helping children navigate their lives by giving them the tools they need to manage emotions, establish positive relationships, and make responsible decisions.


SEL consists of five principal competencies, as defined by the Collaborative for Academic, Social, and Emotional Learning (CASEL): self-awareness, self-management, social awareness, relationship skills, and responsible decision-making. Together, these competencies have been shown to improve academic achievement and make a positive impact on students that can last a lifetime.


Not surprisingly, SEL skills are vital to the formative assessment process as well. But formative assessment—through tools such as quizzes, exit slips, and KWL charts—doesn’t merely imply SEL skills. The formative assessment process actually helps to develop them.


Process Makes Perfect


In terms of self-awareness, formative assessment requires students to evaluate their work, their strengths, and their weaknesses. In short: themselves. For example, a teacher can ask a student to highlight a section of their work that they are most pleased with and explain why. The teacher could also provide some prompts to get students thinking about their learning and identify opportunities for improvement.


Similarly, formative assessment requires a great deal of self-management to push students past merely doing what a teacher tells them to. In this way, they can not only lay out their own goals, but also meet them.

Then there’s social awareness—respecting their peers—which is crucial to setting the stage for student-to-student feedback, as well as relationship skills that help build a culture of collaborative learning in the classroom.

Lastly, there’s the competency of decision-making. Obviously, it’s important for students to make the best choices for their academic progress, and these choices are key to the formative assessment process as they contribute to the positive momentum of learning. For example, let students propose different ways of demonstrating what they’ve learned through presentations and multimedia, as opposed to the traditional essay. Or work with students to come up with relevant activities that better connect lessons with their unique goals and interests.


Be the Change


Modeling inclusivity is a great place to start when merging SEL skills with formative assessment. When you show that you honor everyone’s background and viewpoint enough to make these perspectives part of the learning experience, students observe the SEL principal competencies at work. This could be as simple as framing questions in a way where there are no right or wrong answers. For example, try not asking “what” and “when” so much as “why” and “what if?” Have students think about their answers before sharing them to instill more confidence and ownership in their responses. Or ask students to draw parallels to other events, literary selections, or—better yet—their own lives.


Recording student responses without judgment is another useful strategy, as a teacher’s initial reactions to these responses can have a powerful impact on a student. When receiving an incorrect response, it can be hard to be patient, but try asking a follow-up question or encouraging a student to try again. When you meet students where they are, it shows a dedication to equity that can draw out students who typically are either too shy or frustrated to participate. It’s the simple things that can really change the tone of a classroom, things like:


  • Building on the background of your students;


  • Asking questions that relate to learning targets, and taking the time after asking to allow for more student interaction;


  • Having students elaborate on their responses to deepen the discussion;


  • Systematically sampling these responses to further increase participation; and


  • Recording student responses and sharing how students with different needs approach learning.


A Cycle of Success


Social-emotional learning can be a powerful tool for creating a culture of success in the classroom, and formative assessment only increases its strength. Together, they create a self-perpetuating cycle of critical thinking, conflict-resolution, and collaboration: skills that often can’t be conventionally measured but which are vital to a student’s education and emotional well-being.


And sure: it can’t be achieved in a day. But simply having the intention of supporting your students to develop social and emotional resilience will help them to forge better relationships with other students as well as with themselves.


Have you tried merging social and emotional learning with formative assessment in your class or school? If so, how? What techniques did you use (if any)?


Join the conversation and comment below. 

Each spring, many district administrators out there are tasked with reporting metrics to others in their communities – from parents to school boards – that answer the question, “How are the schools in our community doing?”


Analyzing and interpreting student data can be a time-consuming process, and it's hard to know what conclusions to draw. Researchers at NWEA recognized the need to make this annual data gathering and reporting easier. About two years ago, they created the Insights Report, a 15-page analysis of district data to offer insights into students’ academic achievement and growth. Dr. Andy Hegedus was one of the innovative NWEA researchers who created the report; we talked to him about what kind of value it provides to MAP Growth partners. Below, you can learn more about the Insights Report in our video overview, or read more of our interview with Dr. Andy Hegedus on Teach Learn Grow.


What 2-3 areas of the report do you typically emphasize when introducing it to a superintendent or an administrator?


1) The simple, straightforward language and graphics that anyone can understand.

2) The one-page executive summary, which hits three key points – How are our kids doing? How are we doing on proficiency and college-readiness? How are we doing over time?

3) The way the report progresses from a high-level overview all the way to an in-depth analysis of growth and achievement by gender and race. Superintendents told us they wanted help with the analysis of their data, since we have the data expertise to do it and they typically don’t. Now we have a way to do that at scale.


How are current MAP Growth partners using the Insights Report?


They are providing the Insights Report to the school board – and to their teams – using the PowerPoint we provide as a starting point. (The report comes with a PowerPoint presentation, a PDF, and an hour of consulting from NWEA.) By providing a written report in plain language, and a PowerPoint presentation, we intentionally made reporting district performance easier on administrators: We do the analysis for them. It saves them tons of time, and they can have faith in the numbers we provide. The Insights Report also provides third-party independence around the metrics. That’s important to some administrators and boards.


What was the most interesting question you got from an administrator about the data?

The most interesting question I had was from a high-performing, international school that questioned our analysis on college readiness benchmarks. Our data said that around 70% of their students were meeting college readiness benchmarks – but 100% of their graduating students attend college.


In that case, I showed the administrators their achievement percentiles by grade, and they had a bunch of early learners in the 20th to 30th percentiles in achievement. Every year, their achievement got higher, and by 10th grade, they were at the 80th percentile in achievement. So their students grew better than average, their achievement percentile increased every year – and they left the school ready for college. But if you identified a student that would best represent the entire school, he or she would be somewhere in between – a student on the journey to be college ready; they weren’t all at the top, and they weren’t all at the bottom. We discussed that growth is what you keep your eye on. Growth is your leading indicator. If you keep growing students a lot, your achievement will improve over time. It’s through growth that all of the kids will become college and career ready. If growth drops for some, they might not make it.
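The trajectory in that conversation can be sketched with deliberately made-up numbers: a norm group whose mean score rises each year, and a student who starts below that mean but grows faster than it. All figures here are illustrative, not actual MAP Growth norms or RIT values.

```python
import math

def percentile(score, mean, sd):
    """Percent of a normal distribution scoring below `score`."""
    z = (score - mean) / sd
    return 50 * (1 + math.erf(z / math.sqrt(2)))

norm_mean, sd = 200.0, 15.0   # hypothetical norm-group mean and spread
student = 190.0               # hypothetical student starting below the mean

for year in range(6):
    pct = percentile(student, norm_mean, sd)
    print(f"year {year}: about the {pct:.0f}th percentile")
    norm_mean += 10           # norm group grows 10 points a year (made up)
    student += 13             # student grows faster than the norm (made up)
```

Even though the student never "catches up" in one jump, out-growing the norm group every year steadily raises the achievement percentile, which is exactly why growth works as the leading indicator.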




NWEA recently introduced some new metrics into our MAP Growth Reports that identify to what extent students rapidly guessed when they took their test, and the effect that rapid guessing had on their RIT scores. I described these metrics in some detail in this blog post, which also includes a link to some broader guidance we wrote about how to make use of these metrics when interpreting your students’ test scores.


While we tried to cover most of the questions around these metrics, there is one question we are getting from our partners that is causing a lot of confusion – “What does it mean when the estimated impact of disengagement on a student’s RIT score is….POSITIVE?”


Let me explain why you may occasionally see a positive impact by talking about a response to a single item. On that item, there are generally four response options – one correct answer, and three incorrect answers (or “distractors”). If a student provides a rapid guess to the item, what is the probability that the student is going to answer the item incorrectly? Given that three of the four response options are incorrect, if a student guesses on the item, there is about a 75% chance (3/4) that the student will get the item wrong, and conversely, about a 25% chance (1/4) that the student will answer the item correctly. So, when students rapidly guess, they have a higher likelihood of getting the item wrong, and when that occurs, there can be a subsequent negative impact on their score.


But what if the student randomly gets the item…correct? If a student guesses and gets the answer right, the test score can still be biased – however, in this case, the student’s score can be positively biased. That is, the estimated impact of disengagement on the student’s RIT score can result in the student’s score being higher than if the student had actually tried on the item.


Let’s expand this even further. Let’s say that a student rapidly guessed on 10% of the items on a reading test – so 4 out of 40 reading items were rapidly guessed. If we apply the same probabilities from before, we might expect the student to get 3 of those 4 rapidly guessed items wrong and 1 of the 4 correct. In this situation, there would likely be a negative impact on the student’s RIT score (the score would be negatively biased, and the impact would be a negative value, such as -1).


But what if this student was incredibly lucky, and managed to guess correctly on 3 of those 4 items? In that case, the student’s score would be improved because of guessing, not negatively affected by it (the score would be positively biased, and the impact would be a positive value, such as +1). And while this isn’t common (given the low probability of guessing correctly), that doesn’t mean it doesn’t happen. And across millions of test events, there are some students who get luckier than others – they guess correctly at a rate higher than we would expect.
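The arithmetic behind these scenarios is a simple binomial calculation. The sketch below is only an illustration of the probabilities discussed above (assuming four answer options per item), not part of any NWEA tooling:

```python
from math import comb

def guess_outcome_prob(n_guessed: int, n_correct: int, p_correct: float = 0.25) -> float:
    """Binomial probability of getting exactly n_correct of n_guessed
    rapid guesses right, assuming four answer options per item."""
    return comb(n_guessed, n_correct) * p_correct**n_correct * (1 - p_correct)**(n_guessed - n_correct)

# The "expected" case from above: 1 of 4 guesses correct
print(round(guess_outcome_prob(4, 1), 3))  # 0.422

# The "incredibly lucky" case: 3 of 4 guesses correct
print(round(guess_outcome_prob(4, 3), 3))  # 0.047
```

As the numbers show, guessing 3 of 4 items correctly happens less than 5% of the time for any one student, but across millions of test events it happens often enough to produce positively biased scores.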


So, what should you do when you see positive values? In this case, our guidance still applies – you should consider what percentage of items were rapidly guessed, and in turn, what the subsequent impact was on the student’s RIT score. If less than 10% of items were rapidly guessed, there likely won’t be a huge impact – positive or negative – on a student’s score. And, if the percentage exceeds 30%, that is a clear indicator that a student’s test score isn’t valid and the student should be considered for retesting – even if the impact on the student’s RIT score is positive!
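As a rough sketch, that guidance can be expressed as a simple decision rule. The function name and wording below are illustrative only, not an official NWEA API:

```python
def rapid_guess_guidance(pct_rapid_guessed: float) -> str:
    """Classify a test event using the 10% / 30% thresholds described above.
    Illustrative decision rule; tiers and wording are assumptions."""
    if pct_rapid_guessed < 10:
        return "likely valid: little impact on the RIT score expected"
    elif pct_rapid_guessed <= 30:
        return "use caution: weigh the estimated impact on the RIT score"
    else:
        return "likely invalid: consider retesting, even if the impact is positive"

print(rapid_guess_guidance(35))
```

Note that the rule depends only on the percentage of rapidly guessed items, not on the sign of the impact, which matches the point that a +3 and a -3 call for the same response.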


The goal of this metric is to inform you when you should or should not have confidence that student RIT scores are true reflections of their achievement level. So, whether you see a -3 or a +3 on the impact of disengagement metric, those are both telling you the same thing: the student’s rapid guessing had an impact on his or her score, and you should consider whether that score should be used (or whether the student should be retested), and how to keep the student engaged during future testing sessions.


About the Author

Dr. Nate Jensen

Nate Jensen is a Research Scientist at NWEA, where he specializes in the use of student testing data for accountability purposes. He has provided consultation and support to teachers, administrators, and policymakers across the country to help establish best practices around using student achievement and growth data in accountability systems. Nate holds a Ph.D. in Counselor Education from the University of Arkansas, an M.A. in Counseling Psychology from Framingham State University, and a B.S. in Psychology from South Dakota State University.

Thanks to all of you who attended our webinar, Get to know MAP Reading Fluency on February 7. We could not get to all your questions during the live webinar and wanted to share answers to the most frequently asked questions here.


Does MAP Reading Fluency work on iPads?


There is a dedicated app to deliver the MAP Reading Fluency student test on an iPad; however, it is currently in use for research purposes only. This app will be available for general use in the 2018-19 school year.



What type of headsets are required? Can you provide a recommended brand and approximate cost?


Over-ear headsets with a boom-style microphone and passive noise canceling are required to use MAP Reading Fluency. NWEA successfully used Avid brand models AE-36 and AE-39 in pilot testing. The two models vary in the input connection. AE-36 is an analog (3.5mm or aux) connection, and AE-39 is USB.


Preferred pricing is available to NWEA partners through Supply Master. The AE-36 model is $10 per unit and the AE-39 is $21. The USB connection provides a higher quality audio recording, which may improve ease of use of MAP Reading Fluency tests. USB headsets are recommended for desktop and laptop test delivery. Analog is required for iPad and recommended for Chromebooks.


How often can the test be given? Can MAP Reading Fluency be used as a universal screener and/or as a progress monitoring tool?


MAP Reading Fluency can be used in fall, winter, and spring as a universal screener or benchmark assessment. NWEA plans to support more frequent usage for progress monitoring in a future version of the test, pending the calibration of sufficient content.


Is there an extra cost to add MAP Reading Fluency?


Yes. MAP Reading Fluency is part of the MAP Suite and is available at an additional cost. However, special pricing is available for existing MAP Growth partners. Contact your account manager for more details.


Can the test be used with ELL students?


Absolutely! MAP Reading Fluency is well-suited to assess English learners because it isolates specific skills and dimensions of oral reading for which English learners may experience uneven progress. For example, some ELL students may struggle more with comprehension than their fluency level would suggest, and this is clearly identified in MAP Reading Fluency reports.


How does the test account for speech delays, accents, or other speech issues that might be found in K-3 students?


The settings or “strictness” of the speech-scoring engine have been tuned to a general population of K-3 students across the U.S., including those with regional accents, second language acquisition accents, and speech impairments. Developmental speech patterns and moderate accents are well-tolerated because the tuning has been set leniently. For students with pronounced articulation difficulties or strong accents, a higher rate of un-scorable audio may be observed. This can be addressed by using the audio review functionality to provide a score manually.


What norms are used for determining the grade-level expectations for Words Correct Per Minute (WCPM)?


Hasbrouck and Tindal norms for oral reading fluency are used to set the expectation levels for passage reading. Performance on other measures is classified as above, meeting, approaching, or below expectation based on judgment from curriculum experts and empirical data from field testing.


Can MAP Reading Fluency be used beyond grades K-3 for struggling readers?


At this time, MAP Reading Fluency is only appropriate for K-3 students. In a future version of the assessment, NWEA plans to introduce content, test logic, and an interface that are suited to older, struggling readers.


Is the instructional reading level aligned with leveled readers, such as Fountas and Pinnell, DRA, or others?


The instructional reading level is reported as a range, using a grade-equivalency scale. Using an instructional reading level chart, this grade-based value can be correlated with leveled readers, such as Fountas and Pinnell, DRA, and more.


Can schools get started yet this school year? How can we learn more, get a demo, or get started?


Yes! The Early Adopter program is accepting enrollment from schools and districts in the US through the remainder of the 2017-18 school year for those who want to start now. The winter test window is currently open, and the spring window will begin March 26th.


Contact your account manager to enroll, and to learn more or get a demo. If you are not sure who your account manager is, please visit, or call 1-866-654-3246.

The subject line on the email from my daughter’s third grade teacher read “Totally Baffled.”


It was mid-January, and my daughter’s class had taken the winter MAP Growth Math assessment that day. Her teacher was aware that I worked for NWEA and that I had used MAP Growth data in my own classroom when I was teaching. Her question was about an individual student whose score had dropped between fall and winter testing: my daughter.


My daughter’s score had gone down four RIT points, the email said. Four points is not much greater than the standard error of measurement, so I wasn’t all that alarmed. But I still wanted to know more about what happened.


I asked the same four questions I ask whenever I see that a student’s score has gone down:


  • Did she take much less time on the winter test than she did on the fall test?
  • Did she seem distracted? Was there anything going on that day or during testing that may have made it hard for her to do a good job?
  • How did she feel about how she had done?
  • Is she learning? Do you see evidence in class that she is making progress on the skills she needs?


The teacher assured me that my daughter was learning and that she had shown a lot of progress in class since the fall. She mentioned some specific concepts they’d been working on and some class assignments recently completed.

We decided that we would keep an eye on her but otherwise not worry too much about the drop, and that we’d see in the spring whether her score came back up.


Why did my daughter’s score go down? It’s hard to know for sure. However, now MAP Growth is providing even more information about student engagement during the test that could have helped answer this question.


The researchers at NWEA have done a lot of work around student engagement in assessment. As a result, this past fall the MAP Growth assessments added new tools that monitor how long a student takes to respond to each test item and compare that time to the average time other students have taken on the same item. You can read more about these features on our Teach. Learn. Grow. blog here.


What kinds of questions do you think about when you see a winter score drop? Tell me about it in the comments below!

As educators, we constantly hear how important data collection is, but we are rarely given tools for what to do with the data once we have it. We need to change that! In this post, I’m tackling how data can be used to design small reading groups (guided reading) in K-2 classrooms. The steps below outline a repeatable framework that can be applied each time you collect data and regroup students according to their reading level.


Assess all students’ reading over the course of 1-5 days. Ideally, assessment occurs 3-5 times per year to provide actionable data. The rationale for testing your entire class over the course of 1-5 days is simply to ensure ALL data is collected within a manageable time frame. Time is a scarce resource for educators, so setting a concrete timeline helps to ensure all students’ reading is assessed. When I taught, I tested in September, December, February, April, and June, and created “Inquiry Week” mini-units (students voted on the unit topic). This provided new, exciting content for students to learn and allowed me to pause my guided reading instruction, so I could test everyone.


Assess multiple reading skills to build a full reader profile of each student. The testing process will look different depending on the grade level, but your overall assessment should include a consistent set of leveled texts that all students read (some read one, some read multiple, but the key is that the texts stay consistent regardless of the student). When reading a text, assess students on the following: concepts of print (Kindergarten only), accuracy, comprehension, rate, and fluency. Most assessments already contain these subtests, but if they don’t, create a quick template for your class so you have data in all the categories listed above.


Analyze the reading data on a class, group, and individual student level. This is the most crucial step in creating your small groups because this is where data becomes action.


  • Class-wide lens: Using your class list, enter each student’s score on all sub-tests to view the data from a class wide lens.
  • Group-wide lens: Using the above, at, and below benchmarks, create small reading groups of about six students each (educators with large class sizes can increase but not exceed eight per group). Students grouped together should be within 1-2 levels of each other to be most effective. As you create these small groups, make note of the most common concepts of print (Kindergarten only), accuracy, comprehension, rate, and fluency instructional needs for the group.
  • Individual-student lens: Once you have each student in a small group, scan the data for the instructional area that is the highest leverage for the student’s reading growth. A helpful question to ask yourself is, “What held this student back from reaching the next level?”
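The group-wide lens step can be sketched in code. The function below is a hypothetical helper (the student names and numeric reading levels are made up), showing the idea of sorting students by level so that each group of six stays within a narrow band of levels:

```python
from typing import Dict, List

def make_reading_groups(levels: Dict[str, int], max_size: int = 6) -> List[List[str]]:
    """Group students into small reading groups of at most max_size.
    Sorting by reading level first keeps each group's members close
    in level. Illustrative sketch, not part of any assessment product."""
    ranked = sorted(levels, key=levels.get)  # low to high reading level
    return [ranked[i:i + max_size] for i in range(0, len(ranked), max_size)]

groups = make_reading_groups({"Ana": 4, "Ben": 2, "Cy": 3, "Dee": 6,
                              "Eli": 2, "Fay": 5, "Gus": 3, "Hal": 4})
# Eight students split into two groups: six lower-level readers together,
# and the two highest-level readers in a second group.
```

In practice you would also check that no group spans more than 1-2 levels and split a group where the gap is too wide.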


Create mini-instructional units for each small group. Mini-instructional units will guide your small group instruction over the next assessment period. Typically, mini-instructional units cover 4-6 weeks of learning. The timeframe gives students time to learn new skills, apply them in real time with your feedback, and make solid progress. This is where the common instructional need you noted during the “group-wide lens” analysis is a big help! Take that need, turn it into a goal, and backwards plan 4-6 weekly objectives to guide students in meeting it. Now that your mini-unit has an instructional focus, drop in the relevant content standards and your daily objectives. To be even MORE precise, add in weekly phonics goals for each group – sometimes referred to as “word work.”


Share individual goals with students! By sharing individual learning goals with students, they begin to take ownership over their learning. You can type out student goals on small strips of paper, print them on labels to create “stickers” for students, or share them verbally. This process begins to shift the continuum of voice from teacher centered to learner centered.


You can use this framework each time you assess your class reading growth to create focused instruction for all your students.


How do you create reading groups? Share your thoughts in the discussion below!


View the original blog post on Teach. Learn. Grow. here.

About the Author

Amy Schmidt is a content designer on the Professional Learning Design team at NWEA. As a former K-2 classroom teacher, instructional coach, and curriculum designer, she passionately believes all children can and will learn. She loves creating professional learning that creates meaningful growth experiences for teachers and students.

Join the assessment and research experts at NWEA® and educational leaders across the state for the 2017 Kansas-Missouri Leadership Summit.

What is it?
This complimentary one-day event will help district-level leadership teams fully leverage data from MAP® Growth™ to inform critical decisions at every level.


Who is this event for?
K-12 leaders at the building and district level who use MAP Growth and who work to improve all aspects of district assessment (e.g., superintendents, assistant superintendents, directors, assessment directors, principals, and other leaders).


Why should you attend?
  • Understand how a purpose-driven approach can help you regain control of your assessment program, reduce testing, and drive student learning
  • Learn how district leaders use MAP Growth data to focus instruction and close achievement gaps and understand how Kansas and Missouri districts incorporate growth data for continuous improvement
  • Discover powerful new MAP Growth and MAP Skills enhancements and professional development offerings to help you maximize student growth

Date: Wednesday, November 8, 2017

Time: 9:00 a.m. - 2:45 p.m.

Venue Details: 

K-State Olathe
22201 W. Innovation Drive 
Olathe, KS 66061


Reply here if you plan to attend. Let us know what you hope to hear and see.

Join the assessment and research experts at NWEA and educational leaders across the state for the 2017 Michigan Leadership Summit.

What is it?

This complimentary one-day event will help district-level leadership teams fully leverage data from MAP Growth to inform critical decisions at every level.


Who is this event for?
K-12 leaders at the building and district level who use MAP Growth and who work to improve all aspects of district assessment (e.g., superintendents, assistant superintendents, assessment directors, principals, and other leaders).


Why should you attend?
  • Understand how a purpose-driven approach can help you regain control of your assessment program, reduce testing, and drive student learning
  • Learn how district leaders use MAP Growth data to focus instruction and close achievement gaps and understand how Michigan districts incorporate growth data for continuous improvement
  • Discover powerful new MAP Growth and MAP Skills enhancements and professional development offerings to help you maximize student growth

Date: Tuesday, October 24, 2017

Time: 8:00 a.m. - 3:00 p.m.

Venue Details:

VisTaTech Center

Schoolcraft College 
18600 Haggerty Rd 
Livonia, MI 48152


Reply here if you plan to attend. Let us know what you hope to hear and see.