
Exposing ollama port
Open /etc/systemd/system/ollama.service and add a new environment variable as below.
...
[Service]
...
Environment="OLLAMA_HOST=0.0.0.0:11434"
After editing the unit file, run sudo systemctl daemon-reload and sudo systemctl restart ollama.service, in that order.
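A minimal sketch of the same change done as a systemd drop-in override instead of editing the unit file in place (assuming a systemd host and Ollama's default port 11434; the <server-ip> placeholder and the /api/tags check are illustrative additions, not from the post):

  # create a drop-in override; systemd opens an editor for the snippet
  sudo systemctl edit ollama.service
  # in the editor, add:
  #   [Service]
  #   Environment="OLLAMA_HOST=0.0.0.0:11434"
  sudo systemctl daemon-reload
  sudo systemctl restart ollama.service
  # from another machine, confirm the server answers on the exposed port
  curl http://<server-ip>:11434/api/tags

A drop-in keeps the change separate from the packaged unit file, so it is not overwritten on reinstall or upgrade.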
MSE for evaluating performance of an estimator
Definition [1]: It is quite useful because it can be decomposed into the famous bias and variance terms. The expectation with respect to \(\theta\) means the expectation w.r.t. a law with that parameter \(\theta\), i.e., \(\int (W-\theta)^2\,dP_{\theta}\). Below is a derivation toward the bias and variance expression.
\[\begin{aligned}E_{\theta}(W-\theta)^2&=E_{\theta}(W-E_{\theta}W+E_{\theta}W-\theta)^2\\&=\cdots\end{aligned}\]
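The excerpt cuts off, but the derivation being started is the standard bias-variance decomposition; a reconstruction (expanding the square; the cross term vanishes because \(E_{\theta}(W-E_{\theta}W)=0\)):
\[\begin{aligned}E_{\theta}(W-\theta)^2&=E_{\theta}(W-E_{\theta}W)^2+(E_{\theta}W-\theta)^2+2(E_{\theta}W-\theta)E_{\theta}(W-E_{\theta}W)\\&=\operatorname{Var}_{\theta}(W)+\bigl(\operatorname{Bias}_{\theta}(W)\bigr)^2\end{aligned}\]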
Confusing words: inexpensive, invaluable, priceless
Negative meaning:
Inexpensive: cheap
Positive meaning:
Invaluable: too valuable to put a price on
Priceless: too valuable to put a price on
These are middle-school-level English words that I clearly learned as a kid, yet they still confuse me. Today I saw "inexpensive" in a sentence, used to mean cheap. As I remembered it, the word meant so precious that it cannot be priced, but I had it wrong.
Map of Bernoulli Variants

Trials →        1                                          n
Categories ↓
1               Bernoulli(p)                               Binomial(n, p)
k               Categorical(p_2, ..., p_k) (Multinoulli)   Multinomial(n, p_2, ..., p_k)

where \(p_1 = 1-\sum_{i=2}^{k} p_i\).
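For concreteness, the standard pmfs behind each cell of the map (my addition, using the usual parameterizations; \(x\) is the outcome or the vector of category counts):
\[\begin{aligned}\text{Bernoulli:}\;&p^{x}(1-p)^{1-x},\quad x\in\{0,1\}\\\text{Binomial:}\;&\binom{n}{x}p^{x}(1-p)^{n-x},\quad x\in\{0,\dots,n\}\\\text{Categorical:}\;&\prod_{i=1}^{k}p_i^{x_i},\quad x_i\in\{0,1\},\ \sum_i x_i=1\\\text{Multinomial:}\;&\frac{n!}{x_1!\cdots x_k!}\prod_{i=1}^{k}p_i^{x_i},\quad \sum_i x_i=n\end{aligned}\]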
Minor notes about variational methods
Choice of quantities in KL divergence: There are two possible choices when using the KL divergence, given an optimal distribution \(Q^*\) and an approximating distribution \(Q\). Since the KL divergence is asymmetric, the two choices cannot be interchanged. For example, if we let \(g=dQ^{*}/dQ\), then
\[D_{KL}(Q^{*}||Q):=\int \log \frac{dQ^{*}}{dQ}\,dQ^{*}=-\int \left(\log \frac{dQ}{dQ^{*}}\right)\frac{dQ^{*}}{dQ}\,dQ=\cdots\]
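The excerpt truncates mid-equation; continuing with \(g=dQ^{*}/dQ\), the line presumably resolves to an expectation under \(Q\), which makes the contrast with the other ordering explicit (my reconstruction):
\[D_{KL}(Q^{*}||Q)=\int g\log g\,dQ=E_{Q}[g\log g],\qquad D_{KL}(Q||Q^{*})=\int \log\frac{dQ}{dQ^{*}}\,dQ=-E_{Q}[\log g]\]
The two are expectations of different functions of the same density ratio \(g\), so minimizing one over \(Q\) generally gives a different answer than minimizing the other.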
Gateaux derivative
\[\begin{aligned}D_hf(x)&=\lim_{\epsilon\rightarrow 0}\frac{f(x+ \epsilon h)-f(x)}{ \epsilon }\\&=\lim_{ \epsilon \rightarrow 0}\frac{\frac{\partial f(x+ \epsilon h)}{\partial \epsilon }}{1}\\&=\frac{d }{d \epsilon }f(x+ \epsilon h)\Big|_{ \epsilon =0}\end{aligned}\]
(The middle step is L'Hopital's rule in \(\epsilon\): both numerator and denominator vanish as \(\epsilon\rightarrow 0\), and the denominator differentiates to 1.)
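A quick check of the formula on a concrete function (my example, not from the post): take \(f(x)=\|x\|^2\) on \(\mathbb{R}^n\). Then
\[D_hf(x)=\frac{d}{d\epsilon}\|x+\epsilon h\|^2\Big|_{\epsilon=0}=\frac{d}{d\epsilon}\left(\|x\|^2+2\epsilon\langle x,h\rangle+\epsilon^2\|h\|^2\right)\Big|_{\epsilon=0}=2\langle x,h\rangle,\]
which matches the usual pairing \(\langle\nabla f(x),h\rangle\) with \(\nabla f(x)=2x\).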
Reynolds transport theorem
The Reynolds transport theorem is the three-dimensional generalization of the Leibniz integral rule.
Statement:
\[\frac{d}{dt}\int_{\Omega(t)}f(t,x)\,dx=\int_{\Omega(t)}[\partial_t f+\nabla \cdot (fv)](t,x)\,dx\]
By the divergence theorem, this becomes
\[\frac{d}{dt}\int_{\Omega(t)}f(t,x)\,dx=\int_{\Omega(t)}\partial_t f\,dx +\int_{\partial \Omega(t)}fv\cdot n\,dS\]
Here \(\partial_t f(t,x)=\lim_{h\rightarrow 0}\frac{f(t+h,x)-f(t,x)}{h}\), and \(v\) is defined as below...
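A standard sanity check (my addition): taking \(f\equiv 1\) makes the \(\partial_t f\) term vanish, and the theorem reduces to the rate of change of the moving region's volume:
\[\frac{d}{dt}|\Omega(t)|=\int_{\partial \Omega(t)}v\cdot n\,dS\]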
An equality for the gradient of a convolution
\[\begin{aligned}(p_0 * \phi)(U)&=\int p_0(W)\phi(U-W)\,dW\\\nabla (p_0 * \phi)(U)&=\nabla \int p_0(W)\phi(U-W)\,dW\\&=\frac{\partial}{\partial U}\int p_0(W)\phi(U-W)\,dW\\&=\int p_0(W) \left(\frac{\partial}{\partial U} \phi(U-W)\right)dW\\&=(p_0 * \nabla \phi)(U)\end{aligned}\]
The gradient can be pushed onto either factor of the convolution. The conditions for moving the partial derivative inside the integral are omitted.
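The "either factor" claim follows from the symmetry of convolution; a sketch via the change of variables \(V=U-W\) (my addition, assuming the same interchange condition):
\[\nabla (p_0 * \phi)(U)=\nabla \int p_0(U-V)\phi(V)\,dV=\int (\nabla p_0)(U-V)\phi(V)\,dV=(\nabla p_0 * \phi)(U)\]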