I think that this is the technique Hollywood uses for action movies, only without the reconstruction software bit.
Sigh. Physics fail. The ball would see a constant 9.8 m/s^2 acceleration downward during its entire trajectory, neglecting air resistance.
Sigh, engineering fail. A simple accelerometer measures the force between a proof mass and a point fixed to the accelerometer's case. When "falling" (defined in this case as free movement in a gravity field, including the upward leg of a ballistic trajectory), the force between the proof mass and the case will be zero (no net acceleration between the two, and no net gravity-force difference), so the accelerometer will read zero, again neglecting aerodynamic drag.
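For concreteness, here's a minimal sketch (Python; the hold/throw/flight timings and the 30 m/s^2 launch acceleration are made-up illustration values, not from any real device) of what an ideal single-axis vertical accelerometer would report in each phase:

    G = -9.8  # gravitational acceleration, m/s^2 (down is negative)

    def true_accel(t):
        """Made-up coordinate-acceleration profile: hold, throw, free flight."""
        if t < 1.0:
            return 0.0   # held stationary
        if t < 1.2:
            return 30.0  # hand launches the ball upward
        return G         # ballistic free flight

    def accel_reading(t):
        """What the accelerometer actually senses: specific force = a - g."""
        return true_accel(t) - G

    for t in (0.5, 1.1, 1.5, 2.0):
        print(f"t={t}s: true a = {true_accel(t):+5.1f}, sensor reads {accel_reading(t):+5.1f} m/s^2")

Held still it reads +9.8 m/s^2 (the 1 g reaction of the case on the proof mass), and throughout free flight (up, apogee, and down) it reads zero, exactly as above.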
HOWEVER, if one paired the accelerometer with some processing to create a 3-axis IMU, then applied a bias equal to the negative of the gravity vector (9.8 m/s^2), it would then read what you describe, and the integral of the biased vector (net velocity) would be zero when held, increase during the throw/launch, progressively decrease to near zero at apogee, increase again during the fall, then decrease to near zero on the catch/landing.
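A 1-D sketch of that bias-and-integrate scheme, reusing the same made-up motion profile and waving away the orientation estimation a real IMU has to do to know which way gravity points:

    G = -9.8   # gravitational acceleration, m/s^2
    DT = 0.001 # integration step, s

    def true_accel(t):
        """Made-up profile: hold, throw, ballistic flight, catch."""
        if t < 1.0:   return 0.0   # held still
        if t < 1.2:   return 30.0  # throw, reaching ~6 m/s upward
        if t < 2.424: return G     # free flight, apogee near t = 1.81 s
        if t < 2.574: return 40.0  # catch: decelerate back to rest
        return 0.0

    def sensor(t):
        """Accelerometer output: specific force = true accel minus gravity."""
        return true_accel(t) - G

    v = 0.0
    for n in range(3000):
        t = n * DT
        v += (sensor(t) + G) * DT  # apply the gravity bias, integrate to velocity
        if n % 400 == 0:
            print(f"t={t:4.1f}s  v={v:+6.2f} m/s")

The printed velocity traces exactly the profile described: zero while held, rising through the throw, falling through zero at apogee, growing (downward) during the fall, and returning to near zero at the catch.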
So a simple one- or three-axis accelerometer is insufficient; one needs to add some processing (creating a 3-axis IMU rather than just a 3-axis accelerometer).
The above is not valid if the acceleration or velocity is measured relative to some external datum (e.g. relativistically, or via the Doppler shift of other RF transmitters nearby), as that would give a measurement of acceleration (the relativistic case) or velocity (Doppler) that doesn't depend on relative local forces.
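For the Doppler case, the low-speed approximation is just v = c * delta_f / f; a toy example (the 2.4 GHz carrier is an arbitrary choice, not a claim about the actual device):

    C = 2.998e8  # speed of light, m/s

    def radial_velocity(f_tx, f_rx):
        """Radial velocity toward a transmitter from the observed Doppler
        shift, non-relativistic approximation (v << c). Positive = closing."""
        return C * (f_rx - f_tx) / f_tx

    # A ball closing on a 2.4 GHz transmitter at ~6 m/s shifts the
    # carrier by about 48 Hz:
    f_tx = 2.4e9
    print(radial_velocity(f_tx, f_tx + 48.0))  # ~6.0 m/s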
Given the above, the device likely uses a 3-axis accelerometer and a simple microcontroller to do the processing.