Yes mate, although I've not done any work/research in that area specifically. It's similar(ish) to some light probe baking stuff I did way back, where we used spherical harmonics and probe/point captures to sample lighting within a given 3D environment. 3D Gaussian splatting uses a somewhat novel technique to present the image, though: each captured point is represented with a blob (or Gaussian particle), and each blob has a position in 3D space, covariance parameters (how it is stretched and scaled), an alpha (how see-through it is), and a colour. The blobs are rendered depth-sorted and can give cool results. As long as the scene is static! Of course, I have massively simplified things here, as there's quite a bit more to it than rendering blobs to the screen - such as the point cloud data capture and the "training" step that takes the data and synthesises it into something visually meaningful.
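To make the blob idea concrete, here's a toy sketch in Python (the names and structure are mine, not taken from any particular implementation). Real renderers project each Gaussian's covariance into screen space and blend per pixel, tile by tile, but the per-blob data and the depth-sorted blend boil down to roughly this:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Splat:
    position: np.ndarray  # xyz in world space
    scale: np.ndarray     # per-axis stretch; covariance is built from scale + rotation
    rotation: np.ndarray  # orientation as a quaternion (wxyz)
    alpha: float          # opacity: how see-through the blob is
    colour: np.ndarray    # rgb

def composite(splats, camera_pos):
    """Blend splats back-to-front for a single ray/pixel (toy version)."""
    # Depth sort: farthest from the camera first.
    ordered = sorted(
        splats,
        key=lambda s: -np.linalg.norm(s.position - camera_pos),
    )
    out = np.zeros(3)
    for s in ordered:
        # Standard "over" alpha blending of each blob onto what's behind it.
        out = s.alpha * s.colour + (1.0 - s.alpha) * out
    return out
```

Note the sort is per camera position, which is part of why static scenes are the sweet spot: move the camera and you re-sort, but the blobs themselves stay put.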
It's a very different algorithm and technique from existing rasterisation render pipelines, though, so it's not particularly trivial to shoehorn into existing setups and generally needs a bespoke renderer creating. There are, of course, several examples available online. It will be interesting to see where it goes in the future.