'Posterior Concentrations of Fully-Connected Bayesian Neural Networks with General Priors on the Weights', by Insung Kong, Yongdai Kim.
http://jmlr.org/papers/v26/24-0425.html
#priors #sparse #bnn
'High-Dimensional L2-Boosting: Rate of Convergence', by Ye Luo, Martin Spindler, Jannis Kueck.
http://jmlr.org/papers/v26/21-0725.html
#boosting #lasso #sparse
Pennsylvania’s No. 1 pasta dish is this https://www.diningandcooking.com/2126109/pennsylvanias-no-1-pasta-dish-is-this/ #backgrounds #BowTiePasta #contemporary #continuity #Cooking #CookingTopics #design #elegance #FabricSwatch #food #FoodAndDrink #FoodBackgrounds #Green #IllustrationsAndVectorArt #MultiColored #Orange #ornate #Pasta #pattern #Penne #Ravioli #repetition #RetroRevival #Rigatoni #seamless #simplicity #sparse #Tortellini #Vector #VectorBackgrounds #WallpaperPattern #WrappingPaper #yellow
Balsamico does not have to come from Italy, ECJ rules https://www.diningandcooking.com/2113159/balsamico-does-not-have-to-come-from-italy-ecj-rules/ #BuffaloMo #CapreseSalad #dieting #HealthyEating #Italia #Italian #ItalianVinegar #italiano #italy #seasoning #sparse #Vinegar
'The Effect of SGD Batch Size on Autoencoder Learning: Sparsity, Sharpness, and Feature Learning', by Nikhil Ghosh, Spencer Frei, Wooseok Ha, Bin Yu.
http://jmlr.org/papers/v26/23-1022.html
#sgd #autoencoder #sparse
'Extremal graphical modeling with latent variables via convex optimization', by Sebastian Engelke, Armeen Taeb.
http://jmlr.org/papers/v26/24-0472.html
#multivariate #graphical #sparse
'Rank-one Convexification for Sparse Regression', by Alper Atamturk, Andres Gomez.
http://jmlr.org/papers/v26/19-159.html
#sparse #lasso #convexification
pxp – tats=3D94chliche-chaos
#blips #datadriven #experimentalelectronic #extremecomputermusic #farmersmanual #generateandtest #internetdreaming #networksound #sparse #synthvoice #Berlin
CC BY (#CreativeCommons Attribution) #ccmusic
https://farmersmanual.bandcamp.com/album/tats-3d94chliche-chaos
Interprocedural Sparse Conditional Type Propagation
https://railsatscale.com/2025-02-24-interprocedural-sparse-conditional-type-propagation/
[New Python code: PyNoiselet] About 15 years ago, I wrote a simple set of Matlab functions to compute the #Noiselet transform of Coifman et al. (R. Coifman, F. Geshwind, and Y. Meyer, "Noiselets", *Applied and Computational Harmonic Analysis*, 10(1):27–44, 2001). The noiselet transform is used in #CompressiveSensing applications as well as in #Sparse signal coding, since noiselets have minimal coherence with wavelet bases (Haar and Daubechies), which is useful for sparse signal recovery.
Today, prompted by a code request received yesterday by email, I decided to quickly rewrite this old code in Python (with the helpful assistance of an LLM, I admit).
Here is the result, if you need an O(N log N) (butterfly-like) algorithm to compute this transform:
https://gitlab.com/laurentjacques/PyNoiselet
More information is also available in this old blog post: https://laurentjacques.gitlab.io/post/some-comments-on-noiselets/
Feel free to fork it and improve this non-optimized code; a rough sketch of the butterfly recursion is shown below.
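For context, here is a minimal, hypothetical sketch of the kind of O(N log N) butterfly recursion involved, written from the 2x2 noiselet kernel [[1-i, 1+i], [1+i, 1-i]]/2 as I recall it from Coifman et al.; the actual PyNoiselet code may use a different ordering, sign convention, or normalization.

import numpy as np

def fast_noiselet(x):
    # Hypothetical sketch of an O(N log N) noiselet-style butterfly transform.
    # The 2x2 kernel [[1-1j, 1+1j], [1+1j, 1-1j]] / 2 is unitary; applying it
    # stage by stage (as in a fast Walsh-Hadamard transform) and dividing the
    # final result by N keeps the overall transform unitary.
    y = np.asarray(x, dtype=complex).copy()
    n = y.size
    if n == 0 or n & (n - 1):
        raise ValueError("input length must be a power of two")
    h = 1
    while h < n:
        for start in range(0, n, 2 * h):
            a = y[start:start + h].copy()
            b = y[start + h:start + 2 * h].copy()
            y[start:start + h] = (1 - 1j) * a + (1 + 1j) * b
            y[start + h:start + 2 * h] = (1 + 1j) * a + (1 - 1j) * b
        h *= 2
    return y / n  # scale so the overall transform is unitary

# Quick sanity check: the transform matrix built column by column should be unitary.
if __name__ == "__main__":
    n = 8
    N = np.column_stack([fast_noiselet(np.eye(n)[:, k]) for k in range(n)])
    print(np.allclose(N.conj().T @ N, np.eye(n)))  # expect True

With this particular sketch the transform matrix is symmetric and unitary, so the inverse is simply np.conj(fast_noiselet(np.conj(y))); again, the normalization in the actual package may differ.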
The Nekoma Void – 1 Less Throne
#Electronic #Experimental #concrete #darkambient #darkemo #downtempo #gothpop #sparse #wave #Berlin
CC BY-NC-ND (#CreativeCommons Attribution Non Commercial No Derivatives) #ccmusic
https://nekomavoid.bandcamp.com/album/1-less-throne
'Bayesian Sparse Gaussian Mixture Model for Clustering in High Dimensions', by Dapeng Yao, Fangzheng Xie, Yanxun Xu.
http://jmlr.org/papers/v26/23-0142.html
#sparse #clustering #clusters
'From Sparse to Dense Functional Data in High Dimensions: Revisiting Phase Transitions from a Non-Asymptotic Perspective', by Shaojun Guo, Dong Li, Xinghao Qiao, Yizhu Wang.
http://jmlr.org/papers/v26/23-1578.html
#sparse #nonparametric #smoothing
'Selective Inference with Distributed Data', by Sifan Liu, Snigdha Panigrahi.
'A minimax optimal approach to high-dimensional double sparse linear regression', by Yanhang Zhang, Zhifan Li, Shixiang Liu, Jianxin Yin.
http://jmlr.org/papers/v25/23-0653.html
#sparse #thresholding #sparsity
'Triple Component Matrix Factorization: Untangling Global, Local, and Noisy Components', by Naichen Shi, Salar Fattahi, Raed Al Kontar.
http://jmlr.org/papers/v25/24-0400.html
#minimization #factorization #sparse
'Generalization on the Unseen, Logic Reasoning and Degree Curriculum', by Emmanuel Abbe, Samy Bengio, Aryo Lotfi, Kevin Rizk.
http://jmlr.org/papers/v25/24-0220.html
#sparse #learns #generalization
'Neural Networks with Sparse Activation Induced by Large Bias: Tighter Analysis with Bias-Generalized NTK', by Hongru Yang, Ziyu Jiang, Ruizhe Zhang, Yingbin Liang, Zhangyang Wang.
http://jmlr.org/papers/v25/23-0831.html
#sparse #gradient #generalization
'Sparse Recovery With Multiple Data Streams: An Adaptive Sequential Testing Approach', by Weinan Wang, Bowen Gang, Wenguang Sun.
http://jmlr.org/papers/v25/22-1310.html
#sparse #screening #thresholding