
  • Nearly all school parcel taxes pass, but mixed results for school bonds in March election



    The March 5 primary proved to be a good day for passing school parcel taxes, but not so good for school construction bonds.

    With fewer than 1% of votes statewide remaining to be counted, it appears likely voters in 10 of 11 districts approved parcel taxes. Although the sample is small, the 91% passage rate beats the historic 65% pass rate for primary elections, according to Michael Coleman, who publishes election results at CaliforniaCityFinance.com (see note below). The sole defeat was the Petaluma Joint Union High School District’s proposed eight-year tax of $89 per parcel.

    Voters in 24 of 40 school districts passed school facilities bonds: 60% compared with the historic 73% primary election approval rate. And the winners include two tiny school districts in Sonoma County that looked like they would be defeated on election night but picked up enough mail-in or provisional votes to eke out a win.

    It takes a 55% majority vote to pass a bond, and in Fort Ross School District, two votes made the difference for the $2.1 million bond; the 158-to-126 vote was 55.6% to 44.4%. Supporters of the $13 million bond in the Harmony Union School District picked up 6 percentage points after election night to end with 56.3% of the vote.

    School districts can choose the March primary or the November general election for a parcel tax or school bond. Most traditionally choose November, when more voters cast ballots. But others gamble on the primary election, when there is less competition, with fewer state bond issues and initiatives competing for dollars on the ballot.

    The most recent proposal for a state school construction bond, which would have provided matching funds for local school bonds, was also on a statewide primary ballot, in March 2020, and it became the first in decades to lose. But it coincided with the emergence of the Covid pandemic, which added an edge of anxiety for voters. It also had the misfortune of being designated Proposition 13, likely leading voters to confuse it with the 1978 anti-tax initiative of the same number, which substantially restricted property tax increases and required a two-thirds voter majority to pass new taxes, including parcel taxes. (Voters lowered that threshold for school facilities bonds to 55% with Proposition 39 in 2000.)

    The Legislature and Gov. Gavin Newsom’s aides are negotiating whether to place a school facilities bond proposal on the November ballot. With student enrollment declining statewide, most of the money would be designated for renovations and repairs, not new construction.

    Brianna Garcia, vice president of School Services of California, a school consulting company, doubted that the lower-than-average passage rate for bonds would predict the outcome in November for local and state bond proposals. Many more districts will place bonds before voters, and the passage rate will revert to the norm for November elections, which is over 80%, she said.

    While agreeing with Garcia, Eric Bonniksen, superintendent of Placerville Elementary School District in El Dorado County, cautioned that people struggling financially “are looking at every avenue to fit within their budgets, including school bonds.” A drop in interest rates, which economists are forecasting, even if not large, “may make people feel better about the economic outlook,” he said.

    Voters, Bonniksen said, want to see something visible, like remodeling a building, reconstructing a field or painting a school. “If a bond only fixes sewer and electrical lines, they will question, ‘What did you do for this money?’” he said.

    Voters passed about $3 billion worth of projects, not including interest, generally paid over 30 years at rates ranging from $15 per $100,000 of assessed property value in Sunnyvale to $60 per $100,000 in the Benicia, Hayward, Culver City and Desert Sands unified districts. The largest bonds approved were $675 million in Desert Sands, $550 million in Hayward and $358 million in Culver City.

    The largest bond that failed was for $517 million in Tamalpais Union High School District in Marin County; as of March 22, it was 1.25 percentage points shy of 55%. Opponents, led by the Coalition of Sensible Taxpayers, questioned the scale of the work and said the money would disproportionately go to Tamalpais High, with not enough to two other high schools. The district last approved a construction bond two decades ago.

    Parcel taxes

    Parcel taxes are one of the few sources of funding districts can use to supplement state and local funding, but only about 1 in 8 school districts, primarily in the Bay Area and in districts with wealthier families in the Los Angeles area, have passed one. Because Proposition 13 bans tax increases based on a property’s value, parcel taxes must be a uniform amount per property, regardless of whether it’s a cottage, a 10-bedroom house or an apartment building.

    Courts have ruled, however, that parcel taxes can be assessed by the square footage, and three of the 11 on the ballot (54 cents per square foot per year in Berkeley Unified, 55 cents in Albany Unified, and 58.5 cents in Alameda) passed. School boards in high-cost Bay Area districts argue that parcel taxes are critical because state funding under the Local Control Funding Formula doesn’t take regional costs into consideration.

    The approved parcel taxes range from $75 per year for eight years in Martinez Unified to a $768 per year extension of an existing parcel tax, with an annual cost of living adjustment, in Davis Joint Unified.

    Note: Updated data indicated that parcel taxes in Manhattan Beach Unified and Petaluma City Elementary School District, along with bond proposals in the Fort Ross and Harmony Union school districts, picked up enough support to pass.



  • California college professors have mixed views on AI in the classroom



    Cal State Long Beach lecturer Casey Goeller wants his students to know how to use AI before they enter the workforce.

    Tasmin McGill/EdSource

    Since OpenAI’s release of ChatGPT in 2022, artificial intelligence (AI) chatbots and models have found their way into California’s college systems. These AI tools include language models and image generators that produce responses and images based on user prompts.

    Many college professors have spoken out against AI’s use in college coursework, citing concerns about cheating, inaccurate responses, student overreliance on the tools and, as a consequence, diminished critical thinking. Universities across the U.S. have implemented AI-detection software like Turnitin to deter cheating with AI tools.

    However, some professors have embraced the use of generative AI and envision its integration into curricula and research in various disciplines. To these professors, students learning how to use AI is critical to their future careers.

    An October 2024 report from the University of Southern California’s Marshall School of Business found that 38% of the school’s faculty use AI in their classrooms.

    Ramandeep Randhawa, professor of business administration and data science at USC, was one of the report’s 26 co-authors and organized the effort. 

    “As companies increasingly integrate AI into their workflows, it is critical to prepare students for this AI-first environment by enabling them to use this technology meaningfully and ethically,” Randhawa said. “Universities, as bastions of knowledge, must lead the way by incorporating AI into their curricula.”

    All in on AI

    At California State University, Long Beach, gerontology lecturer Casey Goeller has incorporated AI into his course assignments since fall 2023.

    Students enter Goeller’s Perspectives on Gerontology course with various levels of experience with AI. By asking students for a show of hands, Goeller estimates the class is usually evenly split, with some students having no experience, others having dabbled with it and some who have used it extensively.

    Goeller aims to help students understand how AI can be beneficial to them academically, whether it be assisting with brainstorming, organizing, or acting as a 24/7 on-call tutor.

    To achieve this, Goeller’s assignments include having students use an AI tool of their choice to address his feedback on their essays, based on criteria such as content, flow and plagiarism concerns. Another assignment, worth 15% of their grade, emphasizes the importance of prompt engineering by having students use AI-generated questions to interview an older person in their life.

    While Goeller gets a lot of questions from fellow faculty members about how AI works and how to implement it, he also hears plenty of hesitation.

    “There’s a lot of faculty who’s still riding a horse to work, I call it,” Goeller said. “One of them said, ‘I am never going to use AI. It’s just not going to happen.’ I said, ‘What you should do if you think you can get away with that is tomorrow morning, get up really early and stop the sun from coming up, because that’s how inevitable AI is.’”

    Goeller acknowledges the difficulty of establishing a definitive way to incorporate AI into curricula, given differing academic disciplines and styles of learning, but he also recognizes the growing presence of AI in the workforce. Today, AI fills various roles across industries, from analyzing trends in newsrooms and grocery stores to generating entertainment, a point of contention for SAG-AFTRA members during 2023’s Hollywood strikes.

    “If we don’t help our students understand AI before they escape this place, they’re going to get into the workforce where it’s there,” Goeller said. “If they don’t know anything about it or are uncomfortable with it, they’re at a disadvantage compared to a student with the same degree and knowledge of AI.”

    California State University, Northridge, journalism lecturer Marta Valier has students use ChatGPT to write headlines, interview questions and video captions in her Multimedia Storytelling and Multi-platform Storytelling classes due to the inevitability of AI in the workforce.

    The goal of the implementation is to teach students how AI algorithms operate and how journalists can use AI to assist their work. Not using it, she said, “would be like not using ink.”

    “I absolutely want students to experiment with AI because, in newsrooms, it is used. In offices, it is used,” Valier said. “It’s just a matter of understanding which tools are useful, for what and where human creativity is still the best and where AI can help.”

    AI tools such as ChatGPT and Copilot are frequently updated, so Valier emphasizes flexibility when teaching about these technological topics.

    “I basically change my curriculum every day,” Valier said. “I think it reminds me as a professional that you need to constantly adapt to new technology because it’s going to change very fast. It’s very important to be open, to be curious about what technology can bring us and how it can help us.”

    However, Valier acknowledges AI’s problems with data privacy and factual accuracy. She reminds students that it is their responsibility to make sure the information ChatGPT provides is accurate, by doing their own research or rechecking results, and to avoid becoming reliant on the platform.

    “Be very careful with personal information,” Valier said. “Especially if you have sources, or people that you want to protect, be very careful putting names and information that is sensitive.”

    Valier sees a clear difference in the quality of work produced by students who combine AI with their own skills, versus those who rely entirely on artificial intelligence.

    “You can tell when the person uses ChatGPT and stays on top of it, and when GPT takes over,” Valier said. “What I am really interested in is the point of view of the student, so when GPT takes over, there is no point of view. Even if [a student] doesn’t have the best writing, the ideas are still there.”

    Balancing AI use in the classroom

    Many AI-friendly instructors seek to strike a balance between AI-enriched assignments and AI-free assignments. 

    At USC, professors are encouraged to develop AI policies for each of their classes. Professors can choose between two approaches, as laid out in the school’s instructor guidelines for AI use: “Embrace and Enhance” or “Discourage and Detect.”

    Bobby Carnes, an associate professor of clinical accounting at USC, has adopted a balance between both approaches while teaching Introduction to Financial Accounting. 

    “I use it all the time, so it doesn’t make sense to tell (students) they can’t use it,” Carnes said.

    An avid user of AI tools like ChatGPT, USC associate professor of clinical accounting Bobby Carnes encourages AI experimentation for some assignments, but prohibits students from using it on exams. (Christina Chkarboul/EdSource)

    Carnes uses AI to refine his grammar in personal and professional work and to develop questions for tests. 

    “I give ChatGPT the information that I taught in the class, and then I can ask, ‘What topics haven’t I covered with these exam questions?’ It can help provide a more rich or robust exam,” Carnes said.

    He doesn’t allow students to use AI in exams that test for practical accounting skills, though. 

    “You need that baseline, but we’re trying to get students to be at that next level, to see the big picture,” he said.

    Carnes said he wants his students to take advantage of AI tools that are already changing the field, while mastering the foundational skills they’ll need to become financial managers and leaders. 

    “The nice thing about accounting is that the jobs just become more interesting (with AI), where there’s not as much remedial tasks,” Carnes said. 

    Preserving foundational learning

    Olivia Obeso, professor of education and literacy at California State Polytechnic University, San Luis Obispo, believes establishing foundational knowledge and critical thinking skills through AI-free teaching is non-negotiable.

    Obeso enforces her own no-ChatGPT/AI policy in her Foundations of K-8 Literacy Teaching class to prepare her students for challenges in their post-collegiate life.

    “AI takes out the opportunity to engage in that productive struggle,” Obeso said. “That means my students won’t necessarily understand the topics as deeply or develop the skills they need.”

    Obeso is also concerned about ChatGPT’s environmental impact: For an in-class activity at the start of the fall 2024 semester, she asked students to research the software’s energy and water use. 

    The energy required to power ChatGPT emits 8.4 tons of carbon dioxide per year, according to Earth.Org; the average passenger vehicle produces 5 tons per year. Asking ChatGPT 20 to 50 questions uses 500 milliliters (16.9 ounces) of water, about the volume of a standard plastic water bottle.

    By the end of the exercise, Obeso said her students became “experts” on ethical considerations concerning AI, sharing their findings with the class through a discussion on what they read, how they felt and whether they had new concerns about using AI. 

    “You are a student and you are learning how to operate in this world, hold yourselves accountable,” Obeso said. 

    Jessica Odden, a senior majoring in child development, said Obeso’s class helped her understand AI use in the classroom as an aspiring teacher.

    “For people that are using (AI) in the wrong ways, it makes people reassess how people might be using it, especially in classes like this where we are training to become teachers,” Odden said. “What are you going to do when you actually have to lesson-plan yourself?” 

    Odden makes sure she sticks to learning the fundamentals of teaching herself so that she will be prepared for her first job.

    AI in curricula

    At the University of California, San Diego, some faculty members have echoed concerns about AI’s encroachment on independent learning. 

    Academic coordinator Eberly Barnes is interested in finding a middle ground that incorporates AI into curricula where it complements students’ critical thinking, rather than replaces it.

    Barnes oversees the analytical writing program, Making of the Modern World (MMW), where her responsibilities include revising the course’s policy on AI use in student work.

    The current policy lets students use AI to stimulate their thinking, reading and writing on their assignments. However, it explicitly prohibits using AI to replace any of those skills or to produce the written piece itself.

    Although the policy permits AI use, Barnes expressed her own hesitancy about AI’s role in the social sciences and in building the research and writing skills needed to work in the field. 

    “One of the goals in MMW is to teach critical thinking and also to teach academic writing. And the writing is embedded in the curriculum. You’re not going to learn to write if you’re just going to machine,” Barnes said. “The policy is inspired by the fact that we don’t think there’s any way to stop generative AI use.”

    When Barnes designs the writing prompts for the second and third series in the MMW program, she collaborates with teaching assistants to make assignment prompts incompatible with AI analysis and reduce the likelihood that students will seek out AI’s help for passing grades.

    “Students feel absolutely obsessed with grades and are very pressured to compete,” Barnes said. “That’s been around. I mean it is definitely worse here at UCSD than it was at other colleges and universities that I’ve been at.”

    A tool, not a cheat code

    Dr. Celeste Pilegard

    Celeste Pilegard is a professor of cognitive science and educational psychology at UCSD. She has taught introductory research methods since 2019, focusing on foundational material that prepares students for higher-level topics in the field.

    Educators like Pilegard have been struggling to adapt after the widespread adoption of AI tools. 

    “For me and a lot of professors, there’s fear,” Pilegard said. “We’re holding onto the last vestiges, hoping this isn’t going to become the thing everyone is using.”

    Pilegard is concerned that students rely on AI tools to easily pass their intro-level courses, leaving them without a firm understanding of the content and unable to properly assess AI’s accuracy.

    “It’s hard to notice what is real and what is fake, what is helpful and what is misguided,” Pilegard said. “When you have enough expertise in an area, it’s possible to use ChatGPT as a thinking tool because you can detect its shortcomings.”

    However, Pilegard does believe AI can assist in learning. She likens the current situation with AI to the advent of statistical analysis software back in the 1970s, which eliminated the need to do calculations by hand. 

    At that time, many professors argued for the importance of students doing work manually to comprehend the foundations. However, these tools are now regularly used in the classroom with the acceptance and guidance of educators. 

    “I don’t want to be the stick in the mud in terms of artificial intelligence,” Pilegard said. “Maybe there are some things that aren’t important for students to be doing themselves. But when the thing you’re offloading onto the computer is building the connections that help you build expertise, you’re really missing an opportunity to be learning deeply.”

