<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Daesoo Lee's Blog</title>
    <link>https://daesoolee.tistory.com/</link>
    <description>Machine Learning Engineer / Data Scientist
daesoolee2601@gmail.com</description>
    <language>ko</language>
    <pubDate>Thu, 16 Apr 2026 16:02:27 +0900</pubDate>
    <generator>TISTORY</generator>
    <ttl>100</ttl>
    <managingEditor>DS-Lee</managingEditor>
    <image>
      <title>Daesoo Lee's Blog</title>
      <url>https://tistory1.daumcdn.net/tistory/4194653/attach/6545c134e1264246a9057c53e047fff9</url>
      <link>https://daesoolee.tistory.com</link>
    </image>
    <item>
      <title>french wash / lime wash painting</title>
      <link>https://daesoolee.tistory.com/217</link>
      <description>&lt;ol style=&quot;list-style-type: decimal;&quot; data-ke-list-type=&quot;decimal&quot;&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=NWmJBvL5z4A&amp;amp;t=226s&amp;amp;ab_channel=HomeByMon&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://www.youtube.com/watch?v=NWmJBvL5z4A&amp;amp;t=226s&amp;amp;ab_channel=HomeByMon&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;figure data-ke-type=&quot;video&quot; data-ke-style=&quot;alignCenter&quot; data-video-host=&quot;youtube&quot; data-video-url=&quot;https://www.youtube.com/watch?v=NWmJBvL5z4A&quot; data-video-thumbnail=&quot;https://scrap.kakaocdn.net/dn/b3br8h/hyYfOGAIGv/5Mb06iTzxD1t05wgYDuJSk/img.jpg?width=1280&amp;amp;height=720&amp;amp;face=0_0_1280_720,https://scrap.kakaocdn.net/dn/cltk7v/hyYfHUYCX9/XfCeKjukg65eEzbuX5iKHk/img.jpg?width=1280&amp;amp;height=720&amp;amp;face=0_0_1280_720&quot; data-video-width=&quot;860&quot; data-video-height=&quot;484&quot; data-video-origin-width=&quot;860&quot; data-video-origin-height=&quot;484&quot; data-ke-mobilestyle=&quot;widthContent&quot; data-video-title=&quot;*EASY* DIY Limewash Wall Using Regular Wall Paint (anyone can do this)&quot; data-original-url=&quot;&quot;&gt;&lt;iframe src=&quot;https://www.youtube.com/embed/NWmJBvL5z4A&quot; width=&quot;860&quot; height=&quot;484&quot; frameborder=&quot;&quot; allowfullscreen=&quot;true&quot;&gt;&lt;/iframe&gt;
&lt;figcaption style=&quot;display: none;&quot;&gt;&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;&amp;nbsp;&lt;/p&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;2. &lt;a href=&quot;https://www.youtube.com/watch?v=4EC1GHQS4Tk&amp;amp;ab_channel=TheMerrythought&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://www.youtube.com/watch?v=4EC1GHQS4Tk&amp;amp;ab_channel=TheMerrythought&lt;/a&gt;&lt;/p&gt;
&lt;figure data-ke-type=&quot;video&quot; data-ke-style=&quot;alignCenter&quot; data-video-host=&quot;youtube&quot; data-video-url=&quot;https://www.youtube.com/watch?v=4EC1GHQS4Tk&quot; data-video-thumbnail=&quot;https://scrap.kakaocdn.net/dn/UQjpN/hyYf4QcLGL/YlHyv3MRnVi2eNh32i8gQK/img.jpg?width=1280&amp;amp;height=720&amp;amp;face=0_0_1280_720,https://scrap.kakaocdn.net/dn/rYFNm/hyYfKc6xBS/OGoUm89Z36YU1ivuFudKPk/img.jpg?width=1280&amp;amp;height=720&amp;amp;face=0_0_1280_720&quot; data-video-width=&quot;860&quot; data-video-height=&quot;484&quot; data-video-origin-width=&quot;860&quot; data-video-origin-height=&quot;484&quot; data-ke-mobilestyle=&quot;widthContent&quot; data-video-title=&quot;DIY Faux Limewash Paint - Using Regular Paint&quot; data-original-url=&quot;&quot;&gt;&lt;iframe src=&quot;https://www.youtube.com/embed/4EC1GHQS4Tk&quot; width=&quot;860&quot; height=&quot;484&quot; frameborder=&quot;&quot; allowfullscreen=&quot;true&quot;&gt;&lt;/iframe&gt;
&lt;figcaption style=&quot;display: none;&quot;&gt;&lt;/figcaption&gt;
&lt;/figure&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;Remarks&lt;br /&gt;- While painting, the effect may not be obvious; it becomes apparent once the paint dries.&lt;br /&gt;- Dampening the brush with water, as in the first video, smooths the painting process.&lt;br /&gt;- Never dip the brush deep; dip only the tip, or you end up depositing a clump of paint.&lt;/p&gt;</description>
      <author>DS-Lee</author>
      <guid isPermaLink="true">https://daesoolee.tistory.com/217</guid>
      <comments>https://daesoolee.tistory.com/217#entry217comment</comments>
      <pubDate>Sun, 16 Feb 2025 05:46:47 +0900</pubDate>
    </item>
    <item>
      <title>[SFTP] transfer files from a server to a local device</title>
      <link>https://daesoolee.tistory.com/213</link>
      <description>&lt;p&gt;To connect to a server using &lt;strong&gt;SFTP&lt;/strong&gt; and transfer a file from the server to your local PC, follow these steps:&lt;/p&gt;
&lt;hr&gt;
&lt;h3&gt;&lt;strong&gt;Step 1: Prepare the Required Information&lt;/strong&gt;&lt;/h3&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Server details&lt;/strong&gt;:&lt;ul&gt;
&lt;li&gt;Username (e.g., &lt;code&gt;root&lt;/code&gt; or &lt;code&gt;daesoo&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Server IP address or hostname (e.g., &lt;code&gt;213.173.108.222&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Port (default is &lt;code&gt;22&lt;/code&gt;, but it may differ if configured differently on your server).&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Authentication&lt;/strong&gt;:&lt;ul&gt;
&lt;li&gt;Password or an SSH private key file (e.g., &lt;code&gt;~/.ssh/id_ed25519&lt;/code&gt;).&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;hr&gt;
&lt;h3&gt;&lt;strong&gt;Step 2: Using the Command Line&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Most Linux and macOS systems ship with an SFTP client pre-installed. On Windows, you can run &lt;code&gt;sftp&lt;/code&gt; from PowerShell or any SSH-compatible terminal.&lt;/p&gt;
&lt;h4&gt;&lt;strong&gt;Steps:&lt;/strong&gt;&lt;/h4&gt;
&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Open a terminal on your local machine.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use the following command to connect to the server:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;sftp -i /path/to/private/key username@server_ip&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Example using your server details:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;sftp -i ~/.ssh/id_ed25519 daesoo@213.173.108.222&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Once connected, navigate to the directory on the server where your file is located:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;cd /path/to/remote/directory&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use the &lt;code&gt;get&lt;/code&gt; command to download the file to your local machine:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;get remote_filename local_filename&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Example:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;get data.txt ~/Downloads/data.txt&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Exit the SFTP session:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;exit&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;/ol&gt;
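&lt;p&gt;The interactive steps above can also be run non-interactively with &lt;code&gt;sftp -b&lt;/code&gt;, which reads SFTP commands from a batch file. A minimal sketch (the batch file name is a placeholder; paths and host are the examples used above):&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;# batch.txt: one SFTP command per line
cd /path/to/remote/directory
get data.txt
exit&lt;/code&gt;&lt;/pre&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;# Run the batch file against the server in one shot
sftp -b batch.txt -i ~/.ssh/id_ed25519 daesoo@213.173.108.222&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;This is convenient for scripting repeated transfers, since no interactive session is needed.&lt;/p&gt;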
&lt;hr&gt;</description>
      <author>DS-Lee</author>
      <guid isPermaLink="true">https://daesoolee.tistory.com/213</guid>
      <comments>https://daesoolee.tistory.com/213#entry213comment</comments>
      <pubDate>Wed, 20 Nov 2024 15:14:12 +0900</pubDate>
    </item>
    <item>
      <title>slurm job script example</title>
      <link>https://daesoolee.tistory.com/212</link>
      <description>&lt;p data-ke-size=&quot;size16&quot;&gt;job_script.sh&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;#!/bin/bash

# Job name:
#SBATCH --job-name=hilcodec_training
#
# Project:
#SBATCH --account=nn11068k
#
# Wall time limit:
#SBATCH --time=00-00:05:00
#
# Other parameters:
#SBATCH --partition=accel
#SBATCH --gres=gpu:1
#SBATCH --mem=50G
#SBATCH --cpus-per-task=16

# Set up job environment:
set -o errexit  # Exit the script on any error

# Initialize Conda without sourcing .bashrc
MINIFORGE_INSTALL_PATH=&amp;quot;/cluster/projects/nn11068k/miniforge3&amp;quot;
eval &amp;quot;$($MINIFORGE_INSTALL_PATH/bin/conda shell.bash hook)&amp;quot;

# Load conda environment
ENV_NAME=&amp;quot;hilcodec_inductive_bias&amp;quot;
conda activate &amp;quot;$ENV_NAME&amp;quot;
echo &amp;quot;Current Conda environment: $(conda info --envs | grep '*' | awk '{print $1}')&amp;quot;

# Check if PyTorch and CUDA can be loaded
echo &amp;quot;Checking if PyTorch and CUDA are available...&amp;quot;
python -c &amp;quot;
import torch
print(f'PyTorch loaded successfully. Version: {torch.__version__}')
cuda_available = torch.cuda.is_available()
print(f'CUDA available: {cuda_available}')
if cuda_available:
    print(f'CUDA device count: {torch.cuda.device_count()}')
    print(f'CUDA device name: {torch.cuda.get_device_name(0)}')
&amp;quot;

# log in wandb
wandb login MY_API_KEY

# Run Python script
echo &amp;quot;Run the script...&amp;quot;
SCRIPT_FNAME=&amp;quot;/cluster/projects/nn11068k/daesoo/hilcodec_inductive_bias/stage1.py&amp;quot;
CKPT_PATH=&amp;quot;/cluster/projects/nn11068k/daesoo/hilcodec_inductive_bias/ckpts/epoch=2-step=23320.ckpt&amp;quot;
python &amp;quot;$SCRIPT_FNAME&amp;quot; --ckpt_path &amp;quot;$CKPT_PATH&amp;quot;&lt;/code&gt;&lt;/pre&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;The job can be submitted by running &quot;$ sbatch job_script.sh&quot;.&lt;/p&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;NB! Replace MY_API_KEY with your actual wandb API key.&lt;/p&gt;</description>
      <author>DS-Lee</author>
      <guid isPermaLink="true">https://daesoolee.tistory.com/212</guid>
      <comments>https://daesoolee.tistory.com/212#entry212comment</comments>
      <pubDate>Fri, 8 Nov 2024 21:26:22 +0900</pubDate>
    </item>
    <item>
      <title>Lipschitz continuity in GAN</title>
      <link>https://daesoolee.tistory.com/211</link>
      <description>&lt;p&gt;Lipschitz continuity plays a crucial role in Generative Adversarial Networks (GANs), especially in the context of Wasserstein GANs (WGANs). Understanding its importance requires delving into how GANs function and why enforcing Lipschitz continuity leads to more stable and effective training.&lt;/p&gt;

&lt;h3&gt;&lt;b&gt;Background: Generative Adversarial Networks (GANs)&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;A GAN consists of two neural networks:&lt;/p&gt;

&lt;ol&gt;
  &lt;li&gt;&lt;b&gt;Generator (G):&lt;/b&gt; Attempts to produce data that resembles real data.&lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Discriminator (D):&lt;/b&gt; Tries to distinguish between real data and data generated by G.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The training involves a minimax game where the generator aims to fool the discriminator, and the discriminator aims to accurately classify real and generated data.&lt;/p&gt;
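&lt;p&gt;In the original GAN formulation, this minimax game is written explicitly as:&lt;/p&gt;
&lt;p&gt;
  \[
  \min_G \max_D \; \mathbb{E}_{x \sim P_r}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]
  \]
&lt;/p&gt;
&lt;p&gt;where \( P_r \) is the real data distribution and \( p_z \) is the prior over the generator's input noise \( z \).&lt;/p&gt;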

&lt;h3&gt;&lt;b&gt;Challenges with Traditional GANs&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;Traditional GANs often suffer from issues like:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;b&gt;Mode collapse:&lt;/b&gt; The generator produces limited varieties of outputs.&lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Vanishing gradients:&lt;/b&gt; The generator's updates become negligible, hindering learning.&lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Training instability:&lt;/b&gt; The minimax game can be difficult to balance, leading to oscillations.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These problems arise partly because of the loss functions used, such as the Jensen-Shannon (JS) divergence, which can provide poor gradients when the generator and real data distributions do not overlap significantly.&lt;/p&gt;

&lt;h3&gt;&lt;b&gt;Introduction of Wasserstein GAN (WGAN)&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;The Wasserstein GAN addresses these issues by:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;b&gt;Using the Wasserstein-1 distance (Earth Mover's Distance):&lt;/b&gt; Measures the minimum cost of transporting mass to transform one distribution into another.&lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Providing better gradients:&lt;/b&gt; The Wasserstein distance offers meaningful gradients even when distributions have non-overlapping supports.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;&lt;b&gt;Role of Lipschitz Continuity in WGAN&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;The critical aspect of WGAN is that it requires the discriminator (often called the &quot;critic&quot; in this context) to be &lt;b&gt;1-Lipschitz continuous&lt;/b&gt;. This requirement arises from the &lt;b&gt;Kantorovich-Rubinstein duality&lt;/b&gt;, which expresses the Wasserstein-1 distance as:&lt;/p&gt;

&lt;p&gt;
  \[
  W(P_r, P_g) = \sup_{\|f\|_L \leq 1} \left( \mathbb{E}_{x \sim P_r}[f(x)] - \mathbb{E}_{x \sim P_g}[f(x)] \right)
  \]
&lt;/p&gt;

&lt;p&gt;Here, \( P_r \) and \( P_g \) are the real and generated data distributions, respectively, and the supremum is over all functions \( f \) that are 1-Lipschitz continuous.&lt;/p&gt;
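&lt;p&gt;For intuition, in one dimension with equal-sized samples the Wasserstein-1 distance has a simple closed form: the optimal transport plan matches sorted samples, so \( W_1 \) is the mean absolute difference between the sorted arrays. A minimal numpy sketch (illustrative, not part of any WGAN implementation):&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-python&quot;&gt;import numpy as np

# In 1-D, optimal transport matches sorted samples, so the empirical
# Wasserstein-1 distance is the mean absolute difference between the
# sorted arrays (equal sample sizes assumed).
def wasserstein1_1d(x, y):
    return np.mean(np.abs(np.sort(x) - np.sort(y)))

# Shifting a distribution by a constant c changes W1 by exactly |c|:
x = np.array([0.0, 1.0, 2.0])
print(wasserstein1_1d(x, x + 1.0))  # 1.0&lt;/code&gt;&lt;/pre&gt;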

&lt;h4&gt;&lt;b&gt;Why Enforce Lipschitz Continuity?&lt;/b&gt;&lt;/h4&gt;

&lt;ol&gt;
  &lt;li&gt;&lt;b&gt;Valid Wasserstein Distance Computation:&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;The dual form of the Wasserstein distance relies on the function being 1-Lipschitz.&lt;/li&gt;
      &lt;li&gt;Without Lipschitz continuity, the discriminator might not correctly approximate the Wasserstein distance, leading to incorrect gradients.&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Stable Training:&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;Enforcing Lipschitz continuity prevents the discriminator from becoming too &quot;sharp&quot; or &quot;steep,&quot; which can cause instability.&lt;/li&gt;
      &lt;li&gt;It ensures that small changes in input lead to small changes in output, providing smoother gradients for the generator to learn from.&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Avoiding Exploding Gradients:&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;Without Lipschitz constraints, the discriminator can produce large gradients that destabilize training.&lt;/li&gt;
      &lt;li&gt;Lipschitz continuity bounds the gradients, preventing them from becoming excessively large.&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;&lt;b&gt;Methods to Enforce Lipschitz Continuity&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;Several techniques are used to ensure the discriminator is 1-Lipschitz continuous:&lt;/p&gt;

&lt;ol&gt;
  &lt;li&gt;&lt;b&gt;Weight Clipping (Original WGAN):&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;Constrains the weights of the discriminator within a specific range.&lt;/li&gt;
      &lt;li&gt;Simple but can lead to optimization issues and capacity limitations.&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Gradient Penalty (WGAN-GP):&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;Adds a penalty term to the loss function that penalizes deviations from a gradient norm of 1.&lt;/li&gt;
      &lt;li&gt;Encourages the discriminator's gradients with respect to its inputs to have a norm of 1, promoting Lipschitz continuity.&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Spectral Normalization:&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;Normalizes the spectral norm of each layer's weight matrix.&lt;/li&gt;
      &lt;li&gt;Provides a more direct and effective way to enforce Lipschitz constraints without the drawbacks of weight clipping.&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
&lt;/ol&gt;
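&lt;p&gt;The core idea behind spectral normalization can be sketched in a few lines of numpy: estimate the largest singular value of a weight matrix by power iteration and divide the matrix by it, so its spectral norm becomes 1. This is only an illustration of the principle; real implementations (e.g., in deep learning frameworks) keep a running estimate of the singular vector across training steps.&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-python&quot;&gt;import numpy as np

# Estimate the spectral norm (largest singular value) of W by power
# iteration, then rescale W so its spectral norm is 1.
def spectral_normalize(W, n_iter=50):
    rng = np.random.default_rng(0)
    u = rng.normal(size=W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v  # approximates the largest singular value
    return W / sigma

W = np.array([[3.0, 0.0], [0.0, 1.0]])
W_sn = spectral_normalize(W)
print(np.linalg.norm(W_sn, 2))  # ~1.0&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Since a layer with spectral norm 1 is 1-Lipschitz, composing such layers (with 1-Lipschitz activations) keeps the whole critic Lipschitz-bounded.&lt;/p&gt;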

&lt;h3&gt;&lt;b&gt;Conclusion&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;Lipschitz continuity is essential in GANs, particularly WGANs, because it ensures the discriminator (critic) can effectively approximate the Wasserstein distance between the real and generated data distributions. By enforcing Lipschitz continuity, GANs benefit from:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;b&gt;Improved training stability:&lt;/b&gt; Prevents erratic updates and promotes convergence.&lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Meaningful gradients:&lt;/b&gt; Provides useful feedback for the generator even when distributions are disjoint.&lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Enhanced performance:&lt;/b&gt; Leads to better quality generated data by accurately guiding the generator's learning process.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;b&gt;In summary, Lipschitz continuity is crucial in GANs to ensure that the discriminator provides reliable and stable gradients to the generator, facilitating effective learning and overcoming common challenges associated with GAN training.&lt;/b&gt;&lt;/p&gt;

&lt;hr /&gt;

&lt;p&gt;The notation \( \sup_{\|f\|_L \leq 1} \) in the Wasserstein distance deserves a closer look.&lt;/p&gt;

&lt;h3&gt;&lt;b&gt;Breaking Down the Notation&lt;/b&gt;&lt;/h3&gt;

&lt;ol&gt;
  &lt;li&gt;&lt;b&gt;\( \sup \):&lt;/b&gt; This stands for &lt;b&gt;supremum&lt;/b&gt;, which is the least upper bound of a set. In simpler terms, it's the highest value that a function can approach (but not necessarily reach) within a given set.&lt;/li&gt;
  &lt;li&gt;&lt;b&gt;\( \|f\|_L \leq 1 \):&lt;/b&gt; This denotes all functions \( f \) whose &lt;b&gt;Lipschitz constant&lt;/b&gt; is less than or equal to 1. The Lipschitz constant \( \|f\|_L \) measures how steep a function can be—it’s the maximum rate at which the function's output can change with respect to changes in its input.
    &lt;ul&gt;
      &lt;li&gt;&lt;b&gt;Lipschitz Constant Definition:&lt;/b&gt;
        \[
        \|f\|_L = \sup_{x \neq y} \frac{|f(x) - f(y)|}{|x - y|}
        \]
        This means that for any two points \( x \) and \( y \), the change in \( f \) between these points is bounded by \( \|f\|_L \times |x - y| \).
      &lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
&lt;/ol&gt;
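&lt;p&gt;The Lipschitz constant can be estimated numerically by taking the largest difference quotient over a fine grid. A small sketch for \( f(x) = \sin(x) \), whose true Lipschitz constant is 1 (since the magnitude of its derivative, \( |\cos(x)| \), is at most 1):&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-python&quot;&gt;import numpy as np

# Estimate the Lipschitz constant of f(x) = sin(x) as the largest
# difference quotient |f(x+h) - f(x)| / h over a fine grid.
x = np.linspace(-5.0, 5.0, 100_001)
f = np.sin(x)
L = np.max(np.abs(np.diff(f)) / np.diff(x))
print(L)  # approaches 1.0 as the grid is refined&lt;/code&gt;&lt;/pre&gt;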

&lt;h3&gt;&lt;b&gt;Putting It All Together&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;In the expression:&lt;/p&gt;

&lt;p&gt;
  \[
  W(P_r, P_g) = \sup_{\|f\|_L \leq 1} \left( \mathbb{E}_{x \sim P_r}[f(x)] - \mathbb{E}_{x \sim P_g}[f(x)] \right)
  \]
&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;b&gt;\( W(P_r, P_g) \):&lt;/b&gt; Represents the Wasserstein-1 distance between the real data distribution \( P_r \) and the generated data distribution \( P_g \).&lt;/li&gt;
  &lt;li&gt;&lt;b&gt;\( \mathbb{E}_{x \sim P_r}[f(x)] \):&lt;/b&gt; The expected value of \( f(x) \) when \( x \) is sampled from the real data distribution.&lt;/li&gt;
  &lt;li&gt;&lt;b&gt;\( \mathbb{E}_{x \sim P_g}[f(x)] \):&lt;/b&gt; The expected value of \( f(x) \) when \( x \) is sampled from the generated data distribution.&lt;/li&gt;
  &lt;li&gt;&lt;b&gt;\( \sup_{\|f\|_L \leq 1} \):&lt;/b&gt; We are considering the &lt;b&gt;maximum possible value&lt;/b&gt; of the expression \( \mathbb{E}_{x \sim P_r}[f(x)] - \mathbb{E}_{x \sim P_g}[f(x)] \) over &lt;b&gt;all functions \( f \) that are 1-Lipschitz continuous&lt;/b&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;&lt;b&gt;Why Only Functions with \( \|f\|_L \leq 1 \)?&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;The restriction to functions with \( \|f\|_L \leq 1 \) (1-Lipschitz functions) is crucial because:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;b&gt;Lipschitz Continuity Ensures Smoothness:&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;A function being 1-Lipschitz means it doesn't change too abruptly; its rate of change is bounded by 1.&lt;/li&gt;
      &lt;li&gt;This property is essential for the mathematical foundations of the Wasserstein distance, ensuring that the measure is finite and well-defined.&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Kantorovich-Rubinstein Duality:&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;This duality theorem relates the Wasserstein distance to the supremum over Lipschitz functions.&lt;/li&gt;
      &lt;li&gt;It states that the Wasserstein-1 distance can be expressed as the maximum difference in expectations over all 1-Lipschitz functions.&lt;/li&gt;
      &lt;li&gt;&lt;b&gt;Mathematically:&lt;/b&gt;
        \[
        W(P_r, P_g) = \sup_{\|f\|_L \leq 1} \left( \mathbb{E}_{x \sim P_r}[f(x)] - \mathbb{E}_{x \sim P_g}[f(x)] \right)
        \]
      &lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;&lt;b&gt;Intuitive Explanation&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;Imagine you're trying to find the biggest difference between the expectations of two distributions using a &quot;smooth&quot; function \( f \) that doesn't spike or drop too sharply (since it's 1-Lipschitz). The &quot;sup&quot; (supremum) tells us to find the function \( f \) within this smooth class that maximizes this difference.&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;b&gt;Why Maximize This Difference?&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;By finding the function that maximizes the difference in expectations, we're effectively measuring how distinguishable the two distributions are using smooth functions.&lt;/li&gt;
      &lt;li&gt;The greater the Wasserstein distance, the more different the distributions are.&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;&lt;b&gt;Example to Illustrate&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;Suppose we have two distributions \( P_r \) and \( P_g \) on the real line, and we want to compute their Wasserstein distance.&lt;/p&gt;

&lt;ol&gt;
  &lt;li&gt;&lt;b&gt;Choose All Possible 1-Lipschitz Functions:&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;Consider every function \( f \) that doesn't change faster than at a rate of 1 (its slope anywhere is at most 1).&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Compute the Difference in Expectations:&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;For each such function \( f \), calculate \( \mathbb{E}_{x \sim P_r}[f(x)] - \mathbb{E}_{x \sim P_g}[f(x)] \).&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Find the Maximum Difference:&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;The supremum \( \sup_{\|f\|_L \leq 1} \) tells us to pick the function \( f \) that gives the largest possible difference in expectations.&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;&lt;b&gt;Visual Interpretation&lt;/b&gt;&lt;/h3&gt;

&lt;ul&gt;
  &lt;li&gt;Imagine plotting all possible 1-Lipschitz functions on a graph.&lt;/li&gt;
  &lt;li&gt;For each function, you measure how differently it &quot;sees&quot; \( P_r \) and \( P_g \).&lt;/li&gt;
  &lt;li&gt;The supremum picks the function that accentuates the difference between \( P_r \) and \( P_g \) the most, but still within the constraint of being 1-Lipschitz.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;&lt;b&gt;Summary&lt;/b&gt;&lt;/h3&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;b&gt;\( \sup_{\|f\|_L \leq 1} \):&lt;/b&gt; Represents taking the maximum over all functions \( f \) that are 1-Lipschitz continuous.&lt;/li&gt;
  &lt;li&gt;&lt;b&gt;Purpose in Wasserstein Distance:&lt;/b&gt;
    &lt;ul&gt;
      &lt;li&gt;Ensures that the distance measure captures the &quot;best&quot; way to distinguish between \( P_r \) and \( P_g \) using smooth functions.&lt;/li&gt;
      &lt;li&gt;Maintains mathematical properties that make the Wasserstein distance a meaningful and robust metric.&lt;/li&gt;
    &lt;/ul&gt;
  &lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;&lt;b&gt;In the Context of GANs&lt;/b&gt;&lt;/h3&gt;

&lt;ul&gt;
  &lt;li&gt;In Wasserstein GANs, the discriminator (critic) aims to approximate this supremum by finding a function \( f \) (parameterized by a neural network) that maximizes the difference in expectations.&lt;/li&gt;
  &lt;li&gt;By enforcing that \( f \) is 1-Lipschitz (through techniques like gradient penalty), we ensure that the critic stays within the set of functions over which the supremum is defined.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;&lt;b&gt;Conclusion&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;The notation \( \sup_{\|f\|_L \leq 1} \) is a compact way of expressing that we're looking for the maximum value of the expression over all functions \( f \) that are 1-Lipschitz continuous. It's fundamental to defining the Wasserstein distance in a way that is both mathematically rigorous and practically useful in training GANs.&lt;/p&gt;</description>
      <author>DS-Lee</author>
      <guid isPermaLink="true">https://daesoolee.tistory.com/211</guid>
      <comments>https://daesoolee.tistory.com/211#entry211comment</comments>
      <pubDate>Fri, 8 Nov 2024 14:25:02 +0900</pubDate>
    </item>
    <item>
      <title>installing miniforge</title>
      <link>https://daesoolee.tistory.com/210</link>
<description>&lt;p&gt;Example of installing Miniforge (a minimal Conda-like installer) on the Sigma2 server machine.&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;#!/bin/bash

# Variables
MINIFORGE_URL=&amp;quot;https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-x86_64.sh&amp;quot;
MINIFORGE_INSTALL_PATH=&amp;quot;/cluster/projects/nn11068k/miniforge3&amp;quot;

# Silently download and install Miniforge
echo &amp;quot;Downloading and installing Miniforge...&amp;quot;
wget -q &amp;quot;$MINIFORGE_URL&amp;quot; -O /tmp/Miniforge3.sh
bash /tmp/Miniforge3.sh -b -p &amp;quot;$MINIFORGE_INSTALL_PATH&amp;quot;

# Initialize Conda
echo &amp;quot;Initializing Conda...&amp;quot;
eval &amp;quot;$($MINIFORGE_INSTALL_PATH/bin/conda shell.bash hook)&amp;quot;
conda init

# Ensure Conda is available in the current shell
source ~/.bashrc&lt;/code&gt;&lt;/pre&gt;</description>
      <author>DS-Lee</author>
      <guid isPermaLink="true">https://daesoolee.tistory.com/210</guid>
      <comments>https://daesoolee.tistory.com/210#entry210comment</comments>
      <pubDate>Wed, 6 Nov 2024 22:42:07 +0900</pubDate>
    </item>
    <item>
      <title>creating a custom module in slurm</title>
      <link>https://daesoolee.tistory.com/209</link>
      <description>&lt;p&gt;The following shows an example of creating a custom module for PyTorch on a Slurm-managed cluster.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;Creating a custom module for PyTorch version 2.5.1 in a Slurm-managed high-performance computing (HPC) environment involves several steps:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Create a virtual environment&lt;/strong&gt;&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;mkdir -p modules/pytorch/2.5.1
python -m venv modules/pytorch/2.5.1
source modules/pytorch/2.5.1/bin/activate&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Install PyTorch and Other Libraries&lt;/strong&gt;&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;modules/pytorch/2.5.1/bin/pip install torch ...&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt; &amp;quot;modules/pytorch/2.5.1/bin/pip&amp;quot; specifies the exact pip to use.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;ol start=&quot;3&quot;&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Create the Modulefile&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Modulefiles are scripts that configure the environment variables needed to use the software. Here&amp;#39;s how to create one:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Create the Directory Structure:&lt;/strong&gt;&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;mkdir -p modulefiles/pytorch&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Create the Modulefile:&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Create a file named &lt;code&gt;2.5.1&lt;/code&gt; (a &lt;em&gt;Tcl&lt;/em&gt; modulefile) inside the &lt;code&gt;modulefiles/pytorch/&lt;/code&gt; directory created above:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt;nano modulefiles/pytorch/2.5.1&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Copy and paste the following:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-tcl&quot;&gt; #%Module1.0#####################################################################
 ##
 ## PyTorch 2.5.1 modulefile
 ##
 proc ModulesHelp { } {
     puts stderr &amp;quot;This module loads your personal PyTorch 2.5.1 virtual environment.&amp;quot;
 }
 module-whatis &amp;quot;Loads your personal PyTorch 2.5.1 virtual environment&amp;quot;

 # Set the root of your virtual environment
 set root /cluster/projects/nn11068k/modules/pytorch/2.5.1

 # Set the VIRTUAL_ENV environment variable
 setenv VIRTUAL_ENV $root

 # Unset PYTHONHOME to avoid conflicts
 unsetenv PYTHONHOME

 # Prepend the virtual environment&amp;#39;s bin directory to PATH
 prepend-path PATH $root/bin

 # Prepend the virtual environment&amp;#39;s library directories to LD_LIBRARY_PATH
 prepend-path LD_LIBRARY_PATH $root/lib
 prepend-path LD_LIBRARY_PATH $root/lib64

 # Prepend the site-packages to PYTHONPATH
 prepend-path PYTHONPATH $root/lib/python3.11/site-packages

 # If using CUDA, you might need to set CUDA paths (optional)
 # prepend-path LD_LIBRARY_PATH /usr/local/cuda/lib64&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Test the Module&lt;/strong&gt;&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-bash&quot;&gt; module load Python/3.11.5-GCCcore-13.2.0
 module use modulefiles  # register the directory for our custom modulefiles
 module load pytorch/2.5.1&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Test it by running &lt;code&gt;python&lt;/code&gt; and trying:&lt;/p&gt;
&lt;pre&gt;&lt;code class=&quot;language-python&quot;&gt;import torch
torch.rand(4)
torch.__version__&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;/ol&gt;</description>
      <author>DS-Lee</author>
      <guid isPermaLink="true">https://daesoolee.tistory.com/209</guid>
      <comments>https://daesoolee.tistory.com/209#entry209comment</comments>
      <pubDate>Wed, 6 Nov 2024 21:47:59 +0900</pubDate>
    </item>
    <item>
      <title>intuitive understanding of magnitude and phase from STFT</title>
      <link>https://daesoolee.tistory.com/208</link>
      <description>&lt;p&gt;In the context of audio signals, the &lt;strong&gt;magnitude&lt;/strong&gt; and &lt;strong&gt;phase&lt;/strong&gt; obtained from the Short-Time Fourier Transform (STFT) have intuitive meanings related to the perception and structure of sound:&lt;/p&gt;
&lt;h3&gt;&lt;strong&gt;Magnitude&lt;/strong&gt;:&lt;/h3&gt;
&lt;p&gt;The &lt;strong&gt;magnitude&lt;/strong&gt; represents the &lt;strong&gt;strength (or amplitude)&lt;/strong&gt; of each frequency component in the signal at a given time. Intuitively, it tells you:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;How loud each frequency is&lt;/strong&gt;: Higher magnitude values correspond to louder sounds at that particular frequency.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Timbre and spectral content&lt;/strong&gt;: The combination of different magnitudes across frequencies shapes the &lt;strong&gt;timbre&lt;/strong&gt; of the sound, giving it its characteristic tone. For instance, a violin and a piano playing the same note will have different magnitudes in their harmonic overtones, leading to distinct sounds.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;In essence:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Magnitude controls volume and timbre&lt;/strong&gt;. For example, a loud bass drum hit will have high magnitude values in low frequencies, whereas a violin playing a high-pitched note will have higher magnitudes at higher frequencies.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;&lt;strong&gt;Phase&lt;/strong&gt;:&lt;/h3&gt;
&lt;p&gt;The &lt;strong&gt;phase&lt;/strong&gt; represents the &lt;strong&gt;timing or alignment&lt;/strong&gt; of the frequency components in relation to one another. While it might be less directly perceptible to the human ear than magnitude, it is still important for:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Waveform reconstruction&lt;/strong&gt;: The phase ensures that the different frequency components combine at the correct points in time when reconstructing the signal. Incorrect phases can cause destructive interference, leading to a distorted or incorrect audio signal.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Localization of sound&lt;/strong&gt;: Phase differences between left and right ear signals help the brain localize where sound is coming from (e.g., stereo sound). This is especially true for low frequencies, where phase differences play a significant role in directional perception.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;In essence:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Phase controls how frequencies align to shape the final waveform&lt;/strong&gt;. If two audio signals have the same magnitude spectra but different phase information, they can sound quite different when reconstructed.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;Example:&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Magnitude&lt;/strong&gt; affects &lt;strong&gt;how &amp;quot;big&amp;quot; a note or sound feels&lt;/strong&gt; (loudness, timbre).&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Phase&lt;/strong&gt; affects the &lt;strong&gt;fineness of timing and how frequencies combine&lt;/strong&gt; to form the final sound. Without correct phase alignment, a note or sound could become muddled or hollow.&lt;/li&gt;
&lt;/ul&gt;
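To make magnitude and phase concrete, here is a minimal sketch using only the Python standard library: a naive DFT of a single windowed frame (in practice one would use an STFT routine from a signal-processing library; the toy signal and names here are illustrative).

```python
import cmath
import math

def dft(frame):
    """Naive DFT of one frame: returns one complex value per frequency bin."""
    n = len(frame)
    return [
        sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        for k in range(n)
    ]

n = 8
# One frame containing a cosine that completes exactly one cycle (bin 1).
frame = [math.cos(2 * math.pi * t / n) for t in range(n)]
bins = dft(frame)
magnitude = [abs(b) for b in bins]       # how strong each frequency is
phase = [cmath.phase(b) for b in bins]   # how each frequency is aligned in time

# The energy concentrates in bin 1 (and its mirror image, bin n-1);
# the DC bin is empty because the cosine averages to zero over the frame.
print(round(magnitude[1], 6))  # 4.0
print(round(magnitude[0], 6))  # 0.0
```

Reconstructing the frame from `magnitude` alone would lose the information that this component starts at its peak (a cosine) rather than at zero (a sine); that timing information is exactly what `phase` carries.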
&lt;p&gt;So, while magnitude gives you the &lt;strong&gt;energy&lt;/strong&gt; of the sound at different frequencies, phase ensures that this energy is combined correctly to reconstruct the original sound accurately.&lt;/p&gt;</description>
      <author>DS-Lee</author>
      <guid isPermaLink="true">https://daesoolee.tistory.com/208</guid>
      <comments>https://daesoolee.tistory.com/208#entry208comment</comments>
      <pubDate>Tue, 22 Oct 2024 18:51:40 +0900</pubDate>
    </item>
    <item>
      <title>intuitive understanding of imaginary number</title>
      <link>https://daesoolee.tistory.com/207</link>
      <description>&lt;p&gt;&lt;figure class=&quot;imageblock alignCenter&quot; data-ke-mobileStyle=&quot;widthOrigin&quot; data-filename=&quot;00f0fa25-a31e-4111-8101-2ac250794977.jpeg&quot; data-origin-width=&quot;1530&quot; data-origin-height=&quot;2040&quot;&gt;&lt;span data-url=&quot;https://blog.kakaocdn.net/dn/bcfEJa/btsKfSW45Il/lwjkAY5JBH5D8JDaXesu9K/img.jpg&quot; data-phocus=&quot;https://blog.kakaocdn.net/dn/bcfEJa/btsKfSW45Il/lwjkAY5JBH5D8JDaXesu9K/img.jpg&quot;&gt;&lt;img src=&quot;https://blog.kakaocdn.net/dn/bcfEJa/btsKfSW45Il/lwjkAY5JBH5D8JDaXesu9K/img.jpg&quot; srcset=&quot;https://img1.daumcdn.net/thumb/R1280x0/?scode=mtistory2&amp;fname=https%3A%2F%2Fblog.kakaocdn.net%2Fdn%2FbcfEJa%2FbtsKfSW45Il%2FlwjkAY5JBH5D8JDaXesu9K%2Fimg.jpg&quot; onerror=&quot;this.onerror=null; this.src='//t1.daumcdn.net/tistory_admin/static/images/no-image-v1.png'; this.srcset='//t1.daumcdn.net/tistory_admin/static/images/no-image-v1.png';&quot; loading=&quot;lazy&quot; width=&quot;1530&quot; height=&quot;2040&quot; data-filename=&quot;00f0fa25-a31e-4111-8101-2ac250794977.jpeg&quot; data-origin-width=&quot;1530&quot; data-origin-height=&quot;2040&quot;/&gt;&lt;/span&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;REF: &lt;a href=&quot;https://www.youtube.com/watch?v=sZrOxm5Gszk&quot; target=&quot;_blank&quot; rel=&quot;noopener noreferrer&quot;&gt;https://www.youtube.com/watch?v=sZrOxm5Gszk&lt;/a&gt;&lt;/p&gt;</description>
      <author>DS-Lee</author>
      <guid isPermaLink="true">https://daesoolee.tistory.com/207</guid>
      <comments>https://daesoolee.tistory.com/207#entry207comment</comments>
      <pubDate>Tue, 22 Oct 2024 17:57:44 +0900</pubDate>
    </item>
    <item>
      <title>rsync, parallel</title>
      <link>https://daesoolee.tistory.com/206</link>
      <description>&lt;p data-ke-size=&quot;size16&quot;&gt;&lt;b&gt;Basic Form of rsync&lt;/b&gt;&lt;/p&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;&lt;code&gt;rsync -rv --progress local_dir destination/&lt;/code&gt;&lt;/p&gt;
&lt;ul style=&quot;list-style-type: disc;&quot; data-ke-list-type=&quot;disc&quot;&gt;
&lt;li&gt;&lt;code&gt;-r&lt;/code&gt;: recursive&lt;/li&gt;
&lt;li&gt;&lt;code&gt;-v&lt;/code&gt;: verbose&lt;/li&gt;
&lt;/ul&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;&lt;b&gt;Transfer a directory from local PC to an EBS attached to an ec2 instance&lt;/b&gt;&lt;/p&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;&lt;code&gt;rsync -rv --progress -e &quot;ssh -i ~/.ssh/your-ssh-key&quot; local_dir destination/&lt;/code&gt;&lt;/p&gt;
&lt;ul style=&quot;list-style-type: disc;&quot; data-ke-list-type=&quot;disc&quot;&gt;
&lt;li&gt;&lt;code&gt;-e&lt;/code&gt;: specifies the remote shell command, here &lt;code&gt;ssh&lt;/code&gt; with an identity file for authentication&lt;/li&gt;
&lt;/ul&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;(example)&lt;br /&gt;&lt;code&gt;rsync -rv --progress -e &quot;ssh -i ~/.ssh/daesoo-linux-pc.pem&quot; SoundlySpeech_24000hz ec2-user@ec2-13-60-224-245.eu-north-1.compute.amazonaws.com:/workspace/&lt;/code&gt;&lt;/p&gt;
&lt;hr data-ke-style=&quot;style1&quot; /&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;&lt;b&gt;Basic Form of parallel&lt;/b&gt;&lt;/p&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;&lt;code&gt;parallel echo &quot;First: {1} Second: {2}&quot; ::: A ::: X Y Z&lt;/code&gt;&lt;br /&gt;output:&lt;/p&gt;
&lt;pre class=&quot;arcade&quot;&gt;&lt;code&gt;First: A Second: X
First: A Second: Y
First: A Second: Z&lt;/code&gt;&lt;/pre&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;&lt;b&gt;rsync with parallel&lt;/b&gt;&lt;/p&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;&lt;code&gt;parallel -j 12 --eta rsync -rv {1} {2} ::: local_dir ::: destination/&lt;/code&gt;&lt;/p&gt;
&lt;ul style=&quot;list-style-type: disc;&quot; data-ke-list-type=&quot;disc&quot;&gt;
&lt;li&gt;&lt;code&gt;-j&lt;/code&gt;: number of jobs&lt;/li&gt;
&lt;li&gt;&lt;code&gt;--eta&lt;/code&gt;: show the estimated time remaining&lt;/li&gt;
&lt;/ul&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;(example)&lt;br /&gt;&lt;code&gt;parallel -j 12 --eta rsync -rv -e {1} {2} {3} ::: &quot;ssh -i ~/.ssh/daesoo-linux-pc.pem&quot; ::: SoundlySpeech_24000hz ::: ec2-user@ec2-13-60-224-245.eu-north-1.compute.amazonaws.com:/workspace/&lt;/code&gt;&lt;/p&gt;</description>
      <author>DS-Lee</author>
      <guid isPermaLink="true">https://daesoolee.tistory.com/206</guid>
      <comments>https://daesoolee.tistory.com/206#entry206comment</comments>
      <pubDate>Tue, 22 Oct 2024 13:19:08 +0900</pubDate>
    </item>
    <item>
      <title>study materials for diffusion models</title>
      <link>https://daesoolee.tistory.com/201</link>
      <description>&lt;p data-ke-size=&quot;size16&quot;&gt;&lt;a href=&quot;https://towardsdatascience.com/understanding-the-denoising-diffusion-probabilistic-model-the-socratic-way-445c1bdc5756&quot; target=&quot;_blank&quot; rel=&quot;noopener&quot;&gt;https://towardsdatascience.com/understanding-the-denoising-diffusion-probabilistic-model-the-socratic-way-445c1bdc5756&lt;/a&gt;&lt;/p&gt;
&lt;figure id=&quot;og_1685925408159&quot; contenteditable=&quot;false&quot; data-ke-type=&quot;opengraph&quot; data-ke-align=&quot;alignCenter&quot; data-og-type=&quot;article&quot; data-og-title=&quot;Understanding the Denoising Diffusion Probabilistic Model, the Socratic Way&quot; data-og-description=&quot;A deep dive into the motivation behind the denoising diffusion model and detailed derivations for the loss function&quot; data-og-host=&quot;towardsdatascience.com&quot; data-og-source-url=&quot;https://towardsdatascience.com/understanding-the-denoising-diffusion-probabilistic-model-the-socratic-way-445c1bdc5756&quot; data-og-url=&quot;https://towardsdatascience.com/understanding-the-denoising-diffusion-probabilistic-model-the-socratic-way-445c1bdc5756&quot; data-og-image=&quot;https://scrap.kakaocdn.net/dn/b4ILmc/hySRT1sbSS/rPxrJ0K5KrxFmlfzvU5rRk/img.jpg?width=1200&amp;amp;height=779&amp;amp;face=0_0_1200_779&quot;&gt;&lt;a href=&quot;https://towardsdatascience.com/understanding-the-denoising-diffusion-probabilistic-model-the-socratic-way-445c1bdc5756&quot; target=&quot;_blank&quot; rel=&quot;noopener&quot; data-source-url=&quot;https://towardsdatascience.com/understanding-the-denoising-diffusion-probabilistic-model-the-socratic-way-445c1bdc5756&quot;&gt;
&lt;div class=&quot;og-image&quot; style=&quot;background-image: url('https://scrap.kakaocdn.net/dn/b4ILmc/hySRT1sbSS/rPxrJ0K5KrxFmlfzvU5rRk/img.jpg?width=1200&amp;amp;height=779&amp;amp;face=0_0_1200_779');&quot;&gt;&amp;nbsp;&lt;/div&gt;
&lt;div class=&quot;og-text&quot;&gt;
&lt;p class=&quot;og-title&quot; data-ke-size=&quot;size16&quot;&gt;Understanding the Denoising Diffusion Probabilistic Model, the Socratic Way&lt;/p&gt;
&lt;p class=&quot;og-desc&quot; data-ke-size=&quot;size16&quot;&gt;A deep dive into the motivation behind the denoising diffusion model and detailed derivations for the loss function&lt;/p&gt;
&lt;p class=&quot;og-host&quot; data-ke-size=&quot;size16&quot;&gt;towardsdatascience.com&lt;/p&gt;
&lt;/div&gt;
&lt;/a&gt;&lt;/figure&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;with the background in ELBO from &lt;a href=&quot;https://process-mining.tistory.com/161&quot; target=&quot;_blank&quot; rel=&quot;noopener&quot;&gt;https://process-mining.tistory.com/161&lt;/a&gt;&lt;/p&gt;
&lt;figure id=&quot;og_1685925490153&quot; contenteditable=&quot;false&quot; data-ke-type=&quot;opengraph&quot; data-ke-align=&quot;alignCenter&quot; data-og-type=&quot;article&quot; data-og-title=&quot;VAE 설명 (Variational autoencoder란? VAE ELBO 증명)&quot; data-og-description=&quot;Variational autoencoder, 줄여서 VAE는 GAN, diffusion model과 같이 generative model의 한 종류로, input과 output을 같게 만드는 것을 통해 의미 있는 latent space를 만드는 autoencoder와 비슷하게 encoder와 decoder를 활용&quot; data-og-host=&quot;process-mining.tistory.com&quot; data-og-source-url=&quot;https://process-mining.tistory.com/161&quot; data-og-url=&quot;https://process-mining.tistory.com/161&quot; data-og-image=&quot;https://scrap.kakaocdn.net/dn/HPu70/hySR1yoTtt/XS55B5WjDUknPBH8jK5wj1/img.png?width=800&amp;amp;height=360&amp;amp;face=0_0_800_360,https://scrap.kakaocdn.net/dn/fXNx1/hySTTrX8W9/zdUhHkTw93EI8cucPvvijK/img.png?width=800&amp;amp;height=360&amp;amp;face=0_0_800_360,https://scrap.kakaocdn.net/dn/bw4wpA/hySTW94c3m/HJkVF12ZCKd9gmDaITR7DK/img.png?width=1139&amp;amp;height=353&amp;amp;face=0_0_1139_353&quot;&gt;&lt;a href=&quot;https://process-mining.tistory.com/161&quot; target=&quot;_blank&quot; rel=&quot;noopener&quot; data-source-url=&quot;https://process-mining.tistory.com/161&quot;&gt;
&lt;div class=&quot;og-image&quot; style=&quot;background-image: url('https://scrap.kakaocdn.net/dn/HPu70/hySR1yoTtt/XS55B5WjDUknPBH8jK5wj1/img.png?width=800&amp;amp;height=360&amp;amp;face=0_0_800_360,https://scrap.kakaocdn.net/dn/fXNx1/hySTTrX8W9/zdUhHkTw93EI8cucPvvijK/img.png?width=800&amp;amp;height=360&amp;amp;face=0_0_800_360,https://scrap.kakaocdn.net/dn/bw4wpA/hySTW94c3m/HJkVF12ZCKd9gmDaITR7DK/img.png?width=1139&amp;amp;height=353&amp;amp;face=0_0_1139_353');&quot;&gt;&amp;nbsp;&lt;/div&gt;
&lt;div class=&quot;og-text&quot;&gt;
&lt;p class=&quot;og-title&quot; data-ke-size=&quot;size16&quot;&gt;VAE 설명 (Variational autoencoder란? VAE ELBO 증명)&lt;/p&gt;
&lt;p class=&quot;og-desc&quot; data-ke-size=&quot;size16&quot;&gt;Variational autoencoder, 줄여서 VAE는 GAN, diffusion model과 같이 generative model의 한 종류로, input과 output을 같게 만드는 것을 통해 의미 있는 latent space를 만드는 autoencoder와 비슷하게 encoder와 decoder를 활용&lt;/p&gt;
&lt;p class=&quot;og-host&quot; data-ke-size=&quot;size16&quot;&gt;process-mining.tistory.com&lt;/p&gt;
&lt;/div&gt;
&lt;/a&gt;&lt;/figure&gt;
&lt;p data-ke-size=&quot;size16&quot;&gt;&amp;nbsp;&lt;/p&gt;</description>
      <author>DS-Lee</author>
      <guid isPermaLink="true">https://daesoolee.tistory.com/201</guid>
      <comments>https://daesoolee.tistory.com/201#entry201comment</comments>
      <pubDate>Mon, 5 Jun 2023 09:38:24 +0900</pubDate>
    </item>
  </channel>
</rss>