
Statistics (19)
Monotone Mapping
A mapping $\phi$ is monotone if for all $x, y \in \operatorname{dom}\,\phi$,
$$(x - y)^T \left( \phi(x) - \phi(y) \right) \ge 0$$
References: Boyd, S. P., & Vandenberghe, L. (2004). Convex Optimization. Cambridge University Press; Khanh, P., Luong, H. C., Mordukhovich, B., & Tran, D. (2024). Fundamental convergence analysis of sharpness-aware minimization. Advances in Neural Information Processing Systems, 37, 13149-13182.
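A quick numeric sanity check of this definition (my own sketch, not from the post): the gradient of a convex function is monotone, so $\phi(x) = 2x$, the gradient of $f(x) = \|x\|^2$, should satisfy the inequality for every pair of points.

```python
import numpy as np

# Hypothetical check: phi(x) = 2x is the gradient of the convex f(x) = ||x||^2,
# so (x - y)^T (phi(x) - phi(y)) = 2 ||x - y||^2 >= 0 must hold for all x, y.
rng = np.random.default_rng(0)
phi = lambda x: 2 * x

for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    assert (x - y) @ (phi(x) - phi(y)) >= 0
print("monotonicity held on 1000 random pairs")
```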
Measure theoretic description of change-of-variables
1. Settings. Two measurable spaces: $(X, \mathcal{A}, \mu)$ and $(Y, \mathcal{B}, \mu_T)$. Consider a transform $T: X \to Y$ where $T$ is a measurable map, i.e., $T^{-1}(B) \in \mathcal{A}$ for all $B \in \mathcal{B}$. $T$ induces a pushforward measure $\mu_T$:
$$\mu_T(B) = \mu(T^{-1}(B))$$
2. Change-of-variables. Let $\nu$ and $\mu_T$ be measures defined on $Y$..
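To make the pushforward concrete, here is a small Monte Carlo sketch (my own illustration, not part of the post): the change-of-variables identity $\int_Y f \, d\mu_T = \int_X (f \circ T)\, d\mu$ is checked with $\mu = N(0,1)$, $T(x) = x^2$ (so $\mu_T$ is chi-square with 1 degree of freedom), and $f(y) = e^{-y}$.

```python
import numpy as np

# Monte Carlo check of  ∫_Y f dμ_T = ∫_X (f ∘ T) dμ  with μ = N(0,1), T(x) = x².
rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)

f = lambda y: np.exp(-y)
lhs = f(x**2).mean()      # ∫_X (f ∘ T) dμ, estimated from μ-samples
# μ_T is chi-square(1), whose MGF is E[exp(tY)] = (1 - 2t)^{-1/2}; take t = -1.
rhs = (1 + 2) ** (-0.5)   # exact ∫_Y f dμ_T = 3^{-1/2}
print(lhs, rhs)           # both ≈ 0.577
```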
Measure theoretic description of KL divergence
In this post, I consider the KL divergence from a measure-theoretic viewpoint. Suppose $P$ and $Q$ are probability measures on a measurable space $(\Omega, \mathcal{F})$, and $P \ll Q$; that is, $P$ is absolutely continuous with respect to $Q$ (or, equivalently, $Q$ dominates $P$). The KL divergence is defined as
$$\mathcal{D}_{KL}(P\|Q) := E_P\left[\log \frac{dP}{dQ}\right]$$
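When both measures are dominated by Lebesgue measure, $dP/dQ$ reduces to the ratio of densities, so the definition can be checked numerically. A sketch with my own choice of $P = N(0,1)$ and $Q = N(1,4)$ (not from the post), comparing a Monte Carlo estimate of $E_P[\log dP/dQ]$ against the known closed form for Gaussian KL:

```python
import numpy as np
from scipy.stats import norm

# D_KL(P||Q) = E_P[log dP/dQ]; here dP/dQ is just the density ratio p(x)/q(x).
rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=1_000_000)   # samples from P = N(0,1)

mc = np.mean(norm.logpdf(x, 0, 1) - norm.logpdf(x, 1, 2))
# Closed form for univariate Gaussians:
# log(s2/s1) + (s1² + (m1 - m2)²) / (2 s2²) - 1/2
closed = np.log(2 / 1) + (1**2 + (0 - 1) ** 2) / (2 * 2**2) - 0.5
print(mc, closed)   # both ≈ 0.443
```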
limsup <=> inf{sup}
The definition I originally knew:
$$\limsup_{n\to\infty} x_n := \lim_{n\to\infty} \sup_{k \ge n} x_k$$
A definition I newly found:
$$\limsup_{n\to\infty} x_n := \inf_m \left\{ \sup_{n \ge m} x_n \right\}$$
Proof that the two are equivalent: let $X_m = \sup_{n \ge m} x_n$. Since $X_m \ge X_{m+1}$, $(X_m)$ is a non-increasing sequence: it either decreases or stays flat. For such a sequence, $\lim_{m\to\infty} X_m = \inf_m X_m$, hence
$$\lim_{m\to\infty} \sup_{n \ge m} x_n = \inf_m \left\{ \sup_{n \ge m} x_n \right\}$$
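A numeric illustration of the proof (my own addition): for $x_n = (-1)^n(1 + 1/n)$, the tail suprema $X_m$ are non-increasing and shrink toward $\limsup_n x_n = 1$.

```python
import numpy as np

# x_n = (-1)^n (1 + 1/n) oscillates; its limsup is 1.
n = np.arange(1, 10_001)
x = (-1.0) ** n * (1 + 1 / n)

# X_m = sup_{n >= m} x_n, computed as a suffix running maximum.
X = np.maximum.accumulate(x[::-1])[::-1]
assert np.all(np.diff(X) <= 0)   # X_m is non-increasing in m
print(X[0], X.min())             # tail sups shrink from 1.5 toward inf_m X_m ≈ 1
```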
Series Convergence Tests
1. Partial Sums.
$$\left\{ S_n = \sum_{i=1}^{n} a_i \right\}_{n=1}^{\infty} \text{ converges} \iff \sum_{i=1}^{\infty} a_i \text{ converges}, \qquad \left\{ S_n \right\}_{n=1}^{\infty} \text{ diverges} \iff \sum_{i=1}^{\infty} a_i \text{ diverges}$$
2. Cauchy Criterion. $\forall \epsilon > 0, \; \exists N$ s.t.
$$n > m > N \;\rightarrow\; \left| \sum_{i=m+1}^{n} a_i \right| < \epsilon$$
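Both tests can be seen numerically; a sketch with my own example series (not from the post): $\sum 1/n^2$ has partial sums approaching $\pi^2/6$, while the harmonic series fails the Cauchy criterion because the block sum from $N+1$ to $2N$ never drops below $1/2$.

```python
import numpy as np

# Partial sums S_n = sum_{i<=n} a_i for a convergent and a divergent series.
n = np.arange(1, 100_001)
S_sq = np.cumsum(1 / n**2)
S_harm = np.cumsum(1 / n)
print(S_sq[-1], np.pi**2 / 6)   # partial sums of 1/n² approach π²/6

# Cauchy criterion: blocks |sum_{i=m+1}^{n} a_i| must shrink for convergence.
# For the harmonic series, the block from N+1 to 2N stays >= 1/2.
N = 50_000
print(S_harm[2 * N - 1] - S_harm[N - 1])   # ≈ log 2 > 1/2, so Cauchy fails
```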
Sequence Convergence Tests
1. Direct Method (epsilon-N). $\forall \epsilon > 0, \; \exists N \in \mathbb{Z}^+$ s.t. $n > N \rightarrow |a_n - c| < \epsilon$.
2. Box Method. (i) Bounded below (above) and (ii) monotone decreasing (increasing) $\Rightarrow$ $(a_n)$ converges.
3. Cauchy Criterion. $\forall \epsilon > 0, \; \exists N \in \mathbb{Z}^+$ s.t. $m, n > N \rightarrow |a_m - a_n| < \epsilon$.
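A sketch of the Box Method from item 2 (the recurrence is my own example): $a_1 = \sqrt{2}$, $a_{n+1} = \sqrt{2 + a_n}$ is bounded above by 2 and monotone increasing, hence convergent; the limit solves $L = \sqrt{2 + L}$, giving $L = 2$.

```python
import numpy as np

# a_1 = sqrt(2), a_{n+1} = sqrt(2 + a_n): bounded above and increasing.
a = [np.sqrt(2)]
for _ in range(50):
    a.append(np.sqrt(2 + a[-1]))
a = np.array(a)

assert np.all(a <= 2)            # (i) bounded above
assert np.all(np.diff(a) >= 0)   # (ii) monotone increasing
print(a[-1])                     # ≈ 2.0, the fixed point of L = sqrt(2 + L)
```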
LASSO, Ridge regression
LASSO:
$$\min_\beta \; (y - X\beta)^T (y - X\beta) \quad \text{subject to} \quad \|\beta\|_1 \le t$$
This allows coefficients $\beta_i$ to become exactly 0; by hitting 0, LASSO seems to perform variable selection as well.
Ridge regression:
$$\min_\beta \; (y - X\beta)^T (y - X\beta) \quad \text{subject to} \quad \|\beta\|_2 \le c$$
Lagrangian: naturally, both constraints can be absorbed into the objective via a Lagrangian.
References:
https://en.wikipedia.org/wiki/Lasso_(statistics)
https://en.wikipedia.org/..
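A quick sketch of the Lagrangian (penalized) forms using scikit-learn, where `alpha` plays the role of the Lagrange multiplier; the synthetic data and all parameter values below are my own choices, not from the post:

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

# Synthetic data: only 2 of 10 features are truly active.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
beta_true = np.zeros(10)
beta_true[:2] = [3.0, -2.0]
y = X @ beta_true + rng.normal(scale=0.5, size=200)

lasso = Lasso(alpha=0.1).fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)
print("lasso zero coefficients:", np.sum(lasso.coef_ == 0))   # several exact zeros
print("ridge zero coefficients:", np.sum(ridge.coef_ == 0))   # typically none
```

This shows the point in the prose: the $\ell_1$ constraint drives coefficients exactly to 0 (variable selection), while the $\ell_2$ constraint only shrinks them.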
2d symmetric KL-divergence Implementation

import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D

def normal2d(mu: np.ndarray, sigma):
    # Density of an isotropic 2-D Gaussian N(mu, sigma^2 * I), returned as a closure over (mu, sigma).
    def normal2d_(x, y):
        x = np.array([x, y]).reshape((2, 1))
        mu_col = np.asarray(mu).reshape((2, 1))  # force mu into a column vector to avoid broadcasting bugs
        V = sigma**2 * np.eye(2)                 # covariance matrix
        V_inv = np.linalg.inv(V)
        mul = np.linalg.det(2 * np.pi * V) ** (-0.5)       # normalizing constant
        px = -0.5 * (x - mu_col).T @ V_inv @ (x - mu_col)  # quadratic form, a 1x1 array
        return (mul * np.exp(px)).item()
    return normal2d_

def calcul..
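The post's code is cut off at `def calcul..`. As a hedged guess at what follows (the function name, grid bounds, and step size below are my own assumptions), a grid-based numerical integration of the symmetric KL divergence $D_{sym} = D_{KL}(p\|q) + D_{KL}(q\|p)$ between two densities built with `normal2d` might look like:

```python
import numpy as np

def symmetric_kl(p, q, lim=6.0, step=0.05):
    # (p - q) log(p/q) = p log(p/q) + q log(q/p): both KL directions in one pass.
    xs = np.arange(-lim, lim, step)
    d = 0.0
    for x in xs:
        for y in xs:
            px, qx = p(x, y), q(x, y)
            d += (px - qx) * np.log(px / qx) * step**2
    return d

p = normal2d(np.zeros((2, 1)), 1.0)
q = normal2d(np.array([[1.0], [0.0]]), 1.0)
print(symmetric_kl(p, q))
```

For equal isotropic covariances, the closed form is $\|\mu_1 - \mu_2\|^2 / \sigma^2$, which equals 1 for these parameters and matches the grid estimate.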