An 18-year-old cyclist suffered life-threatening injuries at a 25 mph intersection where infrastructure design and automated vehicle systems combined to lethal effect, although you won’t find that critical safety analysis in standard reporting.
Consider, however, how a lethal design choice at this Nevada intersection deliberately omitted crucial crosswalk markings, creating legal cover for artificial intelligence to violently terminate innocent pedestrians. Google Maps shows three sides clearly marked, with a strategic gap on one side of the intersection:
This intersection exemplifies how calculated infrastructure choices create predictable collision patterns, with engineering decisions forging designated killing fields for vulnerable humans. While Tesla’s sensor limitations are well documented, this intersection demonstrates how road design actively teaches AI systems to devalue pedestrian life.
This hostile design stems directly from 1930s American road doctrine that criminalized poverty through infrastructure – a cynically named “safety” movement that transformed public spaces into killing zones. The deliberately obscured stop sign in the following image (far left side, behind the bushes) exemplifies the road to violence: positioned so far back that northbound traffic must pull into the intersection’s conflict zone just to see oncoming traffic.
Combined with the missing crosswalk, this creates a deadly catch-22: stop where directed and remain blind to cross traffic, or edge forward into danger. These aren’t design oversights – they’re calculated decisions that shift liability onto road users, using fear either to force them off the streets and into cars, or to unfairly judge any crash as their fault.
Heavily faded crosswalk markings on the remaining three sides compound the problem, demonstrating systematic neglect of infrastructure and heightened risk for any road user encountering automated vehicles. At best, this configuration saddles pedestrians and cyclists with a punitive triple-distance crossing requirement that serves no legitimate safety purpose. At worst, they assume a crosswalk has merely faded where, by cruel design, there never was one.

This intersection’s design reveals a systemic prioritization of vehicle throughput over human life. Even from the cyclist’s northbound approach – the very angle where visibility is most critical – the stop sign sits far back from the intersection, buried among obstructions and nearly invisible. Rather than enabling autonomous vehicles to protect human life, such hostile infrastructure actually trains algorithms to amplify anti-pedestrian bias.
This collision exemplifies a dangerous feedback loop: infrastructure designed to marginalize pedestrians becomes training data for AI systems, which then learn to automate and amplify this violence with algorithmic precision. The Tesla didn’t merely fail to protect human life – it weaponized decades of anti-pedestrian infrastructure design, while police reporting perpetuates the car industry’s long history of victim-blaming.
Police said a 2016 Tesla Model X was traveling westbound on Arby Avenue when it was struck by an electric bicycle traveling northbound on El Capitan Way. Authorities reported that the 18-year-old e-bike rider failed to stop at a posted stop sign, leading to the collision with the Tesla. The e-bike rider sustained substantial injuries and was transported to UMC Trauma by ambulance, where his injuries were deemed potentially life-threatening.
The police narrative strains credibility when examined against the physics and geometry of this 25 mph intersection. How could the Tesla driver definitively assess that a cyclist failed to stop at a sign the driver themselves could barely see? If Tesla’s sensors can’t reliably determine that a stop sign was run in this configuration, or even detect an ordinary cyclist, any claims about compliance become suspect.
The physics are damning: on this wide, clear street, a Tesla’s automated systems should have easily detected and responded to cross traffic. At the posted 25 mph limit, with regenerative and mechanical braking plus near-instantaneous sensor reaction, the stopping distance would be just 25 feet. Yet the Tesla apparently didn’t brake at all, delivering its full kinetic energy (≈ 156,800 Joules at 25 mph, given its 2,500 kg mass) directly into a human body. This suggests either significant speeding, complete failure of safety systems, or both – making the police’s reflexive blame of the vulnerable road user even more egregious.
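The arithmetic above is easy to check. Here is a minimal sketch, assuming the Model X mass of roughly 2,500 kg stated above and a hard-braking deceleration of about 0.9 g on dry pavement (the deceleration figure is my assumption, not from any report):

```python
# Back-of-napkin check of the collision physics described above.
# Assumptions (mine, not from the police report): Model X mass ~2,500 kg,
# emergency braking deceleration ~0.9 g on dry pavement.

MPH_TO_MS = 0.44704   # meters per second per mph
FT_PER_M = 3.28084    # feet per meter
G = 9.81              # gravitational acceleration, m/s^2

def kinetic_energy_joules(mass_kg: float, speed_mph: float) -> float:
    """KE = 1/2 m v^2, with speed converted from mph to m/s."""
    v = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * v ** 2

def braking_distance_feet(speed_mph: float, decel_g: float = 0.9) -> float:
    """Distance to stop from v at constant deceleration: d = v^2 / (2a)."""
    v = speed_mph * MPH_TO_MS
    meters = v ** 2 / (2 * decel_g * G)
    return meters * FT_PER_M

ke = kinetic_energy_joules(2500, 25)   # ~156,000 J, matching the figure above
d = braking_distance_feet(25)          # ~23 ft, consistent with "just 25 feet"
```

The braking distance excludes reaction time, which is the point: an automated system with near-instantaneous sensing should come close to this ideal figure.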
This case transcends a simple analysis of automated vehicle failures, revealing a dangerous convergence of flawed infrastructure design, institutional bias, and AI systems. The intersection’s design – with its missing crosswalk, poorly placed stop sign, and emphasis on vehicle throughput – serves as training data for autonomous vehicles, teaching them to replicate and amplify decades of anti-pedestrian bias.
Each Tesla deployed in such environments becomes an AI system learning from fundamentally biased infrastructure, creating a feedback loop that normalizes and automates violence against vulnerable road users. This bias runs deep in American infrastructure – even our language betrays it. While British English uses “pavement” because people have historical rights to the road, American “sidewalk” literally pushes pedestrians aside, reflecting a cultural shift that prioritized vehicles over human life.
The solution requires immediate action: implementing clear crosswalk markings, optimizing stop sign placement, and fundamentally rebalancing our infrastructure priorities away from vehicle throughput and toward human safety. Without these changes, we risk creating an increasingly automated system that turns historical infrastructure biases into the foundations for mass algorithmic violence – with each software update and vehicle deployment turning our streets into unlivable combat zones.

Wow! I have never heard of flyingpenguin before, but this post blew my socks off. As an urban transit planner, I can tell you that you’ve nailed the exact problem.
Simple math proves corruption in the investigation, as it would be practically impossible for the cyclist to be moving “too fast to see,” even if they ran the stop sign. The cyclist would have had to be moving at an impossibly high speed (over 100 mph) to not be detected by the Tesla.
Even accounting for a sensor’s typical processing latency and brake activation time, this Tesla should have detected any object moving at normal human-powered speeds (top professional cyclists rarely exceed 45 mph in ideal conditions, and electric bikes are speed-restricted, so we can easily run the numbers).
Given the width of this street and clear conditions, the Tesla would have had approximately 150 feet of sensor range to detect cross traffic, so this crash is squarely its fault when we do even basic back-of-napkin math and hold the right party responsible.
The Tesla apparently didn’t brake at all, and the car must have been speeding to boot, with potential sensor failure or system malfunction. I won’t even speculate about the state of the driver. No realistic bicycle speed could have caused this crash if the vehicle had been functioning properly at a legal speed. The physics simply don’t support a scenario where the cyclist is at fault.
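Here are those back-of-napkin numbers as a rough sketch, under stated assumptions: a combined detection-plus-brake latency of about 1 second (my guess, not a measured figure) and the roughly 150 feet of sensor range mentioned above:

```python
# Rough check of the "over 100 mph" claim. Assumptions (mine, not
# measured): ~150 ft of usable sensor range, ~1.0 s combined
# detection + brake-activation latency, car at the 25 mph limit.

FPS_PER_MPH = 1.46667  # feet per second per mph

SENSOR_RANGE_FT = 150.0
LATENCY_S = 1.0

# How long the approaching car spends inside its own 150 ft
# detection envelope at the posted 25 mph limit:
car_window_s = SENSOR_RANGE_FT / (25 * FPS_PER_MPH)  # ~4.1 seconds

# Speed a cyclist would need in order to cover that same 150 ft
# before the car could even begin to react:
undetectable_mph = (SENSOR_RANGE_FT / LATENCY_S) / FPS_PER_MPH  # ~102 mph
```

Four seconds is an eternity for an automated braking system, and no bicycle (electric or otherwise) reaches triple-digit speeds.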
Of course, if police hide behind a “no crosswalk” fallacy in their investigation, then you know they’re justifying a killing field designed to exonerate predictable vehicular manslaughter, as you have shown so clearly. You’re really poking at an underlying question: whether Nevada police are capable of caring about human lives that exist outside an overpriced, defective Heil-Hitler Swasticar.
Big fan of your work. Thank you for bringing this tragedy to light.