<?xml version="1.0" encoding="UTF-8" ?>
<rss version="2.0">
    <channel>
      <title>박지훈 — LLM Study Wiki</title>
      <link>https://zzorong.github.io/wiki-public</link>
      <description>Last 10 notes on 박지훈 — LLM Study Wiki</description>
      <generator>Quartz -- quartz.jzhao.xyz</generator>
      <item>
    <title>competency-map</title>
    <link>https://zzorong.github.io/wiki-public/competency-map</link>
    <guid>https://zzorong.github.io/wiki-public/competency-map</guid>
    <description><![CDATA[ Competency Map (as of 2026-04-19)

Competency | Evidence | Last Updated
Transformer understanding | 1 note | 2026-04-19
RAG construction | 0 notes | —
Fine-tuning | 0 notes | —
Prompt Engineering | 0 notes | —
LLM evaluation | 0 notes | —
Claude Code & DX automation | 0 notes | —
Data collection pipeline design | 0 notes | —
Kafka messaging operations | 0 notes | —
AWS infrastructure operations | 0 notes | —
Docker build optimization | 0 notes | —
Memory & performance optimization | 0 notes | —
Incident response & root cause analysis | 0 notes | —
Monitoring... ]]></description>
    <pubDate>Sun, 19 Apr 2026 05:33:13 GMT</pubDate>
  </item>
  <item>
    <title>Welcome to Quartz</title>
    <link>https://zzorong.github.io/wiki-public/</link>
    <guid>https://zzorong.github.io/wiki-public/</guid>
    <description><![CDATA[ This is a blank Quartz installation. See the documentation for how to get started. ]]></description>
    <pubDate>Sun, 19 Apr 2026 02:24:23 GMT</pubDate>
  </item>
  <item>
    <title>Transformer Self-Attention</title>
    <link>https://zzorong.github.io/wiki-public/transformer-self-attention</link>
    <guid>https://zzorong.github.io/wiki-public/transformer-self-attention</guid>
    <description><![CDATA[ Transformer Self-Attention One-sentence summary: Self-attention is the mechanism by which each token computes its relevance to every other token in the sequence through the three projections Q, K, and V, and produces its output as the corresponding weighted average. ]]></description>
    <pubDate>Sun, 19 Apr 2026 00:00:00 GMT</pubDate>
  </item>
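  <!--
    Sketch for the "Transformer Self-Attention" note above: each token is
    projected to Q, K, and V; pairwise relevance scores are softmax-normalized
    and used to take a weighted average of V. A minimal single-head NumPy
    sketch; the weight matrices and toy dimensions are illustrative
    assumptions, not values from the note.

    import numpy as np

    def self_attention(X, W_q, W_k, W_v):
        # Project the input tokens to queries, keys, and values.
        Q = X @ W_q                  # (seq_len, d_k)
        K = X @ W_k                  # (seq_len, d_k)
        V = X @ W_v                  # (seq_len, d_v)
        # Relevance of every token to every other token, scaled by sqrt(d_k).
        scores = (Q @ K.T) / np.sqrt(K.shape[-1])
        # Softmax over key positions turns scores into attention weights.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights = weights / weights.sum(axis=-1, keepdims=True)
        # Output for each token is the weighted average of the value vectors.
        return weights @ V

    # Toy usage: 4 tokens, model dim 8, head dim 4 (illustrative sizes).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(4, 8))
    W_q, W_k, W_v = (rng.normal(size=(8, 4)) for _ in range(3))
    out = self_attention(X, W_q, W_k, W_v)   # shape: (4, 4)
  -->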
    </channel>
  </rss>