Monday, May 11, 2026

No clear evidence that the school smartphone ban policy reduced screentime

This paper is the first to examine the causal effects of school smartphone bans on the mental health of youth in the US. Time series data show that the mental health of youth has been declining for the past decade. Several researchers argue that easy access to social media and other internet sites provided by smartphones is to blame.

To provide causal evidence of the effects of these bans, the author relies on synthetic difference-in-differences models and the National Survey of Children’s Health (NSCH) from 2016 to 2024. Currently, there are data for only one state with two post-ban periods and two states with one post-ban period, so the results should be read as preliminary evidence only.
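The paper's synthetic difference-in-differences estimator reweights control states before differencing; as a minimal illustration of the underlying logic only, here is the plain two-group difference-in-differences contrast on a simulated state-by-year panel (every number here, including the -0.4 "ban effect", is hypothetical and chosen purely for this sketch, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical panel: 10 states x 6 years; state 0 bans phones in year 3.
n_states, n_years, ban_year = 10, 6, 3
state_fe = rng.normal(0.0, 1.0, n_states)[:, None]   # state fixed effects
year_fe = np.linspace(0.0, -0.5, n_years)[None, :]   # shared secular trend
true_effect = -0.4                                   # assumed ban effect
treated = np.zeros((n_states, n_years), dtype=bool)
treated[0, ban_year:] = True

screentime = 4.0 + state_fe + year_fe + true_effect * treated \
             + rng.normal(0.0, 0.05, (n_states, n_years))

# Plain DiD: (treated post - treated pre) - (control post - control pre).
# State fixed effects and the shared trend cancel in this double difference.
pre, post = slice(0, ban_year), slice(ban_year, n_years)
did = (screentime[0, post].mean() - screentime[0, pre].mean()) \
    - (screentime[1:, post].mean() - screentime[1:, pre].mean())
print(round(did, 2))
```

Synthetic DiD additionally chooses unit and time weights so the control average tracks the treated state's pre-ban trend; with only one or two post-ban periods per state, as in the NSCH sample, such estimates are inevitably noisy, which is why the results are preliminary.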

The outcome variables are screentime and measures of psychological wellbeing. 

Overall, these early results provide no clear evidence that the school ban policy reduced screentime or improved psychological wellbeing. 

Future studies with additional years of data, when they are available, are needed to increase power and to estimate the longer-term effects of school bans on youth mental health.

Strategic Manipulation of University Grading Systems

When do university grades permit informative comparisons across courses, and how does transcript adjustment affect student and instructor incentives? A raw grade mixes student performance with course-specific conditions, so grade-only comparisons fail whenever course effects are large enough to reverse ability rankings at grade cutoffs.

This study shows that full transcripts can recover comparable student signals through what we call eigengrades: course-adjusted reports that use common or externally anchored grading standards and enrollment overlap to identify centered student effects. In the scalar additive benchmark, row-mean, affinity-spectral, and graph-Laplacian methods recover the same object. Eigengrades are, therefore, not a separate source of identification; they are a representation of fixed-effect adjustment. 
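In the scalar additive benchmark the equivalence is easy to see directly. The toy sketch below (hypothetical numbers, full enrollment overlap assumed) applies the row-mean method: subtract each course's mean grade, then average each student's residuals, which recovers the centered student effects exactly:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy additive transcript: grade[i, c] = ability[i] + course_effect[c],
# with every student taking every course (full enrollment overlap).
n_students, n_courses = 6, 4
ability = rng.normal(0.0, 1.0, n_students)
course = rng.normal(0.0, 1.0, n_courses)
grades = ability[:, None] + course[None, :]

# Row-mean adjustment: subtract each course's mean, average per student.
adjusted = (grades - grades.mean(axis=0)).mean(axis=1)

# This recovers the centered student effects exactly in the additive case.
centered_ability = ability - ability.mean()
print(np.allclose(adjusted, centered_ability))  # → True
```

With partial enrollment overlap the same centered effects are still identified so long as the student-course graph is connected, which is where the spectral and graph-Laplacian representations do real work; in the fully observed case they all collapse to this fixed-effect adjustment, consistent with the paper's claim that eigengrades are a representation rather than a separate source of identification.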

The framework also clarifies limits: ordinary letter grades with unanchored course-specific cutoffs do not separate course difficulty from grading standards, and multidimensional transcripts identify a skill-match subspace rather than a unique universal ranking unless the institution specifies a benchmark. 

Finally, exact difficulty adjustment removes the direct report-mediated incentive to choose easier courses and eliminates a competitive enrollment channel behind grade inflation, while leaving other strategic and governance margins intact.

Lessons from the First Statewide Mandate on School Start Times

This study examines the impact of California’s Senate Bill 328 (SB 328), the first statewide mandate requiring later school start times for middle and high schools, on adolescent sleep, mental health, and academic outcomes. 

The authors find that SB 328 increased the share of students sleeping at least 8 hours per night (the CDC-recommended minimum for this age group) by 13%.

Average mental health effects are imprecisely estimated, but boys show significant reductions in sadness, hopelessness, and suicidal ideation, and Hispanic students, who experienced the largest sleep-timing shifts, show parallel reductions in difficulty concentrating; together these patterns are consistent with a dose-response relationship between sleep improvement and mental well-being. 

Math and English scores in grade 8 improved by approximately 0.08–0.10 standard deviations, with the largest gains among Hispanic and economically disadvantaged students. 

A within-state analysis using teachers’ commute arrival times as a proxy for pre-policy school start times corroborates these findings, and shows academic gains accumulating over 2023–2025 alongside a suggestive decline in high school dropout rates. 

The absence of effects on chronic absenteeism rules out an attendance-driven mechanism, pointing instead to the direct cognitive benefits of aligning school schedules with adolescents’ biological rhythms.

Gender Gaps in Education and Declining Marriage Rates

Over the past half-century, U.S. four-year colleges have shifted from enrolling mostly men to enrolling mostly women, while the economic position of non-college men has weakened markedly. This study examines how these changes correspond with the evolving structure of marriage markets across cohorts and places.

As college men have become increasingly scarce, college women have maintained stable marriage rates by marrying high-earning non-college men. This shift—combined with the broader economic decline of non-college men—has sharply reduced the pool of economically stable partners available to non-college women: the share of non-college men who earn above the national median and are not married to college women has fallen by more than 50%. 

Cross-area evidence shows that education gaps in marriage are smaller where non-college men face lower rates of joblessness and incarceration. 

Taken together, the evidence suggests that deteriorating outcomes for men have primarily undermined the marriage prospects of non-college women.

Saturday, May 9, 2026

Smartphone app + personal coaching improves college student mental health

A study of more than 6,200 university students, including some at WashU, found that a smartphone app combined with personal coaching via text messages can be an effective intervention against depression, anxiety, and eating disorders.

For the students in the study — all of whom were identified through college-wide screening as being at high risk for or as having a mental health condition — the digital approach proved more effective than a referral to campus counseling services, the typical next step for students who show signs of mental health struggles. 

Compared with students who received a referral, those who were offered the app reported fewer symptoms of mental health problems in follow-up testing six weeks, six months, and two years later. They were also more likely to be free of any mental health disorders.

“Universities like WashU already have excellent mental health services, but not all students will take the steps to make an appointment,” said Denise Wilfley, the Scott Rudolph University Professor and a professor of psychological and brain sciences. “We were able to offer students an effective resource that they could download on their phones right then and there.”

Wilfley was the senior author of the study published in Nature Human Behaviour. Ellen Fitzsimmons-Craft, an associate professor of psychological and brain sciences and an associate professor of psychiatry, was a co-first author. Michelle Newman of Penn State and Daniel Eisenberg of UCLA were also co-authors.

The app is designed to deliver a digital version of cognitive behavioral therapy (CBT), a well-established therapeutic approach that aims to identify and change the negative thought and behavior patterns that can drive anxiety, depression, and eating disorders.

Responding to prompts, users completed interactive modules where they received psychoeducational content and engaged in exercises to help them learn and practice the content. The coaches could then review their progress and provide personalized responses and feedback. “The coaches help students implement the things they're learning through the mobile app,” Fitzsimmons-Craft said. “They provide feedback on progress and get students thinking about what they’re doing to achieve positive change.”

The app’s accessibility turned out to be a major advantage. Nearly 75% of students randomly chosen to receive the app used it at least once. In contrast, only 30% of students who received a referral to campus mental health services reported receiving any mental health treatment in the following six months. The accessibility advantage of the app was evident for all student groups, including those from disadvantaged backgrounds and those who generally face greater barriers to care. “Having something right on their phone made a big difference for students,” Wilfley said.

Campus-based counseling services, including those offered at WashU, are still an invaluable resource for students, Wilfley said. “We're not using digital tools to replace counseling services,” she said. “We’re developing a way to make evidence-based intervention available to as many students as possible. We’re removing barriers to care.”

Unlike some other digital mental health platforms, the app used in the study doesn’t run on artificial intelligence. That’s an important distinction, because generative AI-based therapy remains largely untested and carries certain risks, including the possibility of misinformation and harmful advice. In November 2025, the American Psychological Association recommended against the use of generative AI chatbots and wellness apps as a replacement for standard mental health care.

Artificial intelligence could still be an important tool for addressing mental health concerns on campus. Leading a team that includes Wilfley, Fitzsimmons-Craft is the principal investigator on a five-year, $3.7 million grant from the National Institutes of Health (NIH) to develop a self-guided, chatbot-based digital intervention designed to help students with eating disorders. The chatbot uses carefully controlled rules-based AI, not generative AI.

Student mental health should be a top concern for campuses around the country, Fitzsimmons-Craft said. In the current study, nearly half of the 39,194 students who completed initial screening were identified as either having or being at high risk for depression, anxiety, or an eating disorder. In addition to the physical and emotional toll, such conditions can make it difficult or impossible for students to succeed academically, she said.

“Many students wait until they reach a crisis point to reach out to the counseling center,” Fitzsimmons-Craft added. “By pairing screening with immediate access to the app, students have an opportunity to take a more proactive approach to their mental health.”

Wilfley, Fitzsimmons-Craft, and colleagues are now working to make the app available to all students who are struggling with mental health. “Sometimes evidence-based research can be locked away for many years before it reaches the public,” Wilfley said. “Digital interventions should be available to everybody who needs it. The fact that this study started with large-scale screening on college campuses shows the potential for reaching large populations.”

Given the prevalence of mental health disorders on campuses across the country, it would make sense for colleges and universities to screen all incoming freshmen for anxiety, depression, and eating disorders, Wilfley said.  

This work demonstrates that the combination of population-based mental health screening and digital interventions can not only reduce psychiatric symptoms and improve quality of life but also prevent psychiatric disorders. “This approach can simultaneously reduce the prevalence of mental disorders, expand equitable access to care, and improve affected individuals’ symptoms,” Wilfley said.

“WashU already has a program that promotes awareness about alcohol use disorders, which, of course, is an extremely important issue,” Wilfley said. “But universities could also take a more proactive approach to mental health.”

Wednesday, May 6, 2026

How are teachers reckoning with AI in schools?


Artificial intelligence has swept into American schools, and more is sure to come. This year, both Google and Microsoft — the two biggest companies at the forefront of the AI boom — announced major investments in AI training for teachers. 

But what do teachers think of this transformation of their work?

Katie Davis, a University of Washington professor in the Information School and co-director of the Center for Digital Youth, studies how technology affects young people’s learning and development. Davis has also been teaching for over two decades — first as an elementary school teacher and now as a professor — so she’s acutely aware of how earlier technological revolutions in teaching have not always played out as hoped.

Davis and a UW-led team of researchers interviewed 22 teachers in Aurora Public Schools in Colorado — a district that’s investing heavily in AI through systems like Google’s Gemini and MagicSchool, an AI tool that helps teachers plan. Overall, teachers were ambivalent about the technology. They liked that it could reduce workload, especially for rote tasks, but worried that it could erode the social aspects of teaching.

The team presented its research April 15 at the Association for Computing Machinery Conference on Human Factors in Computing Systems in Barcelona.

UW News talked with Davis about the study and how ostensibly democratizing technologies can widen disparities in schools. 

Why did you want to study AI adoption by schools?

Katie Davis: At least since the introduction of the radio, every new technological invention has been hyped for how it will change teaching and learning. Computers are the prototypical example. They were pushed into schools only to start collecting dust, because they didn’t really change anything. We saw it with massive open online courses, too. Ten or 15 years ago, these courses were supposed to transform education and put colleges and universities out of business. But that hasn’t happened.

Often the hype centers on closing educational inequities. But these new technologies actually tend to aggravate existing inequities. The schools serving the most affluent students have the resources to think carefully about how to incorporate technologies into their curriculum so that they’re supporting student learning goals and outcomes, whereas more under-resourced schools don’t have the resources or the time to do that kind of work. So they end up incorporating technologies in ways that don’t necessarily help students learn; instead, they make things more efficient or keep track of students.

When AI started being intensely pushed into schools, I thought here we go again. AI is here and it’s not going anywhere, so I would love for us to understand how it’s being taken up in schools and, ideally, to prevent this recurring pattern.

What did you hear from teachers about AI?

KD: Teachers expressed a deep ambivalence toward AI. It wasn’t as if any one teacher said it’s all great or it’s all terrible. I think the single strongest driver for teachers to use AI was to prevent burnout. Teachers are being asked to do more and more — not just teach, but care for students’ entire emotional, cognitive and academic lives. It really weighs on them. So a lot of them talked about turning to AI to be a thought partner, to help them brainstorm lesson ideas, create assessments and differentiate lessons for different learners.

Another really big benefit for this particular school district was multilingual support. The district serves students who speak more than 160 languages. One teacher we spoke with said she had four main languages represented in her classroom but she only spoke English, so she was turning to AI to help her translate materials for her students and for their families so that she could communicate with them. 

I think it’s really important to note that this district is going all in on AI. They’re encouraging teachers to use it and providing professional development, and teachers are talking among themselves and sharing ideas. This kind of institutional support and more informal teacher conversations are also encouraging teachers to use AI and explore how they might incorporate it into their teaching practice.

AI is often presented as a democratizing technology, but a Financial Times story recently showed that higher-wage earners are using AI more than lower-wage earners in the same industry — possibly increasing disparities. Are you seeing anything like that playing out in education?

KD: The way that manifests in education is in the kinds of support that students have access to. It’s more likely that better-resourced schools are also going to provide some form of AI literacy instruction — to really engage students in thoughtful reflection about what AI is, how it may or may not be useful for their learning, and to actually get them to think about these issues in a deep way. Whereas in under-resourced schools, the easiest thing to do is to just block AI. That’s not going to prevent students from using it, but they will end up using it in a communication vacuum, without any adult guidance. You can see how that would create disparities in how well students can use it.

I was really interested in the finding that teachers are concerned that students will know they’re using AI.

KD: That is one of the most interesting findings for me. Teachers are definitely aware that if their students think they’ve used AI, students and their parents will feel that their teachers are cheating them out of a proper education. Teachers are very worried about both students and their more AI-resistant colleagues seeing them that way. I don’t think this is unique to teachers — I feel it in university jobs, too. Many people have this perception that using AI is cheating or taking the easy way out. 

But there’s another layer: Teachers are personally worried about their own authentic voice and professional identity. They’re asking, “If I am using AI, at what point am I no longer a teacher? Where’s that line between using AI as a thought partner to augment my professional practice versus it now replacing my professional practice?” 

What are ways schools might amplify the positive parts of using AI while mitigating some of these negative effects?

KD: One of the first things is to bring AI out of the shadows and talk about it. Since we published this piece, I’ve been engaging with groups of teachers around the country in professional development experiences around AI, and they really enjoy having a community of practice. They feel that those spaces don’t necessarily exist in their schools. It’s like there’s this vacuum of communication — students don’t talk about it because they’re implicitly getting the message that it’s not OK to use it, and it’s the same with teachers.

Professional development is also very important. But a lot of professional development for teachers is just one-off PowerPoint presentations. It doesn’t really connect to whatever is going on in the classroom. Professional development needs to be done in a sustained way that meaningfully connects AI to teachers’ immediate classroom experiences.

School leaders need to be able to communicate AI policies, so that teachers are aware of them and understand how they apply in their specific schools. If you take Washington state as an example, the Office of Superintendent of Public Instruction has a really great blueprint and guidance for using AI. But my sense is that not many teachers are aware of it, or even if they are, there hasn’t been any concerted effort to say, “OK, this is what that means in our school.” We need to be working at many levels to make sure that AI is integrated into education well. 

Is there anything you want to add?

KD: Something I hold very dear as a teacher is that teaching is relational. Kids don’t learn in isolation. The CEO of Khan Academy gave a TED Talk saying the ideal vision is for every kid on the planet to have their own personal AI tutor and for every teacher to have their own personal AI teaching assistant. Maybe that would be great, but I worry that this push toward AI will erode the relationships between teachers and students. Teaching and learning are social processes. It’s not just about putting information into a student’s brain. Students learn through dialog, through participation in cultural practices. To remove that element of learning really concerns me.


Schools must do more to help girls master AI

Schools must do more to help girls master AI. That’s the conclusion of a new study, which found that boys, who are more confident working with AI, performed better in some classes than their female counterparts.

The researchers, who carried out the study in Qatar, recommend that AI be taught in primary schools – and that teachers show students how to use AI tools to help with their schoolwork.

“AI is rapidly making its mark in almost every sector of the economy and will continue to increase its influence, making it vital that young people are equipped with the skills to thrive in an AI-driven future,” says researcher Dr. Zubair Ahmad, of Qatar University Young Scientists Center (QUYSC).

“However, previous research has shown that students often struggle to master the fundamentals of AI concepts. We wanted to know why.

“We found that students who believe in their ability to learn and use AI tend to do better in the subject. This link between confidence and results is much stronger in boys than in girls.

“Good teaching and access to resources are also important, although their impact was stronger for boys than for girls.

“By building girls’ belief in their ability to master AI, schools can help ensure that both genders do well in the subject and help prepare the next generation for life in a world that is being rapidly transformed by AI.”

For the study, published today in peer-reviewed journal Cogent Education, Dr Ahmad and colleagues developed a 35-question survey that explored the relationship between students’ confidence and belief in their ability to learn and use AI (AI efficacy), how much they had learnt (AI learning outcomes) and how much support they got from their teacher and school (institutional support).

The questionnaire was completed by 743 15- to 18-year-olds who were studying computing and IT at high school in Qatar. Participants comprised both Qatari nationals and students from diverse Asian and African backgrounds.

Analysis of their answers showed that students who were more confident at AI did better in the subject, particularly if they were boys.

Students who received higher levels of institutional support – including teacher guidance, hands-on learning experiences, and access to educational resources – demonstrated better AI learning outcomes, with this effect being significant for male students but not for female students.
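This gender-moderated link between self-efficacy and outcomes is the kind of pattern an interaction regression captures. A minimal sketch on simulated data (all coefficients are hypothetical and for illustration only; this is not the study's actual model or data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated survey scores: AI learning outcome depends on self-efficacy,
# with a stronger slope for boys (the pattern the study reports).
n = 743
male = rng.integers(0, 2, n)            # 1 = male, 0 = female
efficacy = rng.normal(0.0, 1.0, n)      # standardized AI self-efficacy
outcome = 0.2 * efficacy + 0.3 * efficacy * male + rng.normal(0.0, 0.5, n)

# OLS with an efficacy-by-gender interaction term.
X = np.column_stack([np.ones(n), efficacy, male, efficacy * male])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
slope_girls, slope_boys = beta[1], beta[1] + beta[3]
print(slope_boys > slope_girls)
```

Here `beta[3]` is the interaction coefficient — the extra efficacy slope for boys — and a significantly positive estimate is what "the link is much stronger in boys than in girls" corresponds to in a regression framework.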

It is thought that students who are more confident with AI are more likely to persevere when the topic gets difficult, while those who doubt their abilities may shy away from tricky areas or give up.

Good teaching, meanwhile, can inspire and motivate and so help turn confidence into results.

Dr Ahmad says there are several possible reasons why the links between confidence, institutional support, and learning outcomes were stronger in boys than in girls.

“Technology and AI are often perceived as a male-dominated domain which can influence students’ belief in their abilities and their engagement in the subject,” he explains.

“And, as a result, female students may have lower confidence in their abilities or be less likely to experiment with AI tools.

“The teaching style may matter, too. We know, for example, that some students prefer very structured lessons, others thrive on the combination of gentle guidance and the freedom to explore.”

The study’s findings can be used to amend curricula to boost students’ AI self-efficacy and their AI skills, say the authors.

They suggest that:

  • Students should be taught the basics of AI from early stages of school education. More complex concepts can be introduced progressively across higher grade levels.
  • Lessons should be interactive, with students asking questions, doing hands-on work, and solving real-world problems, rather than passively listening.
  • Teachers should give students feedback soon after they complete a task, while it is still fresh in the students’ minds.
  • Educators should also teach students about how to use AI ethically. This could include showing them how to use AI tools to help with their schoolwork without cheating or being dishonest.

“Schools should also do more to support girls specifically,” says Dr Ahmad. “This could be done by providing them with more female role models in AI, creating a classroom environment in which all students feel comfortable, so that girls perceive that they are equally supported, which will boost their belief in their abilities to engage with AI.

“One way of doing this is through what we call guided practice. An example would be a teacher demonstrating how to use an AI tool, then allowing students to practise, while giving them guidance when needed. The teacher then gradually reduces the level of support as the students become more proficient. Such approaches will ensure that students, particularly girls, feel supported throughout the learning process.

“This builds skills, while also boosting confidence.”

One limitation, the authors note, is that the study “conceptualized institutional support with a broader coverage of the AI learning aspect”.

The authors recommend that future research investigate the influence of the various aspects of AI learning individually.

DOI: 10.1080/2331186X.2026.2625448