Commit 3a7a4d87 by Zidong Du


parent 4cdf9147
attracted extensive attention from a broad range of communities. Existing
studies achieve high compositionality through \emph{deliberately handcrafted}
inductions (e.g., additional rewards, constructed
loss functions, or ease-of-teaching) in multi-agent learning, which are unnatural.
Yet, few studies investigate the emergence of symbolic language with high
compositionality \emph{naturally}, i.e., without deliberately handcrafted
inductions.
the high compositionality has statistical significance related to agent
capacity.
%\subsection{Breakdown}
%\label{ssec:language}
vocabulary can express almost infinite concepts.}
\label{fig:induction}
\end{figure}
\begin{table*}[t]
\centering
\small
\caption{Handcrafted inductions in related works.}
\label{tab:rel}
\begin{tabular}{lll}
\toprule
Works & Handcrafted induction & Compositionality\\
\midrule
\cite{kirby2015compression}&Expressivity and compressibility&Qualitative, Speaker\\
\cite{kottur-etal-2017-natural}&Listener's memory&Qualitative, Speaker\\
\cite{choi2018compositional}&Maximum message length&Qualitative, Speaker+Listener\\
\cite{lazaridou2018emergence}&Structure of input data&Quantitative, Speaker\\
\cite{evtimova2018emergent}&Multi-modal scenarios&Quantitative, Speaker\\
\cite{li2019ease}&Population size, resetting all listeners&Quantitative, Speaker\\
\cite{chaabouni-etal-2019-word}&Word-order constraints&Qualitative, Speaker\\
\cite{chaabouni2020compositionality}&Easier to decode&Quantitative, Speaker\\
\textbf{Ours} & \textbf{None} & \textbf{Quantitative, Speaker+Listener} \\
\bottomrule
\end{tabular}
\end{table*}
Prior studies focus on achieving high compositional symbolic language
through \emph{deliberately handcrafted} inductions, e.g., memoryless~\cite{},
additional rewards~\cite{}, constructed loss functions~\cite{}, and
ease-of-teaching~\cite{}. \note{Such optimization methodologies are driven by the challenge of generating highly compositional symbolic language without induction in existing multi-agent environments.}
Figure~\ref{fig:induction} reports the compositionality when training two agents
in the widely-used listener-speaker referential game for emerging 100 symbolic
can generate a higher compositional symbolic language with a higher probability.
%%\endsection
In this paper, we make the following contributions:
\begin{itemize}[topsep=0pt,itemsep=0cm]
\item To the best of our knowledge, this is the first work to successfully achieve
\section{Related works}
\label{sec:relatedwork}
%external environmental factors
Previous works focus on the external environmental factors that impact the
\section{Symbolic Language Producing}
\label{sec:thory}
\begin{figure}[t]
\centering \includegraphics[width=\columnwidth]{fig/Figure2_The_referential_game_environment.pdf}
\caption{The referential game in this paper.}
\label{fig:game}
\end{figure}
\begin{figure*}[t]
\centering
\includegraphics[width=1.8\columnwidth]{fig/Figure3_The_architecture_of_agents.pdf}
\caption{The architecture of agents. \emph{Left:} speaker. \emph{Right:} listener.}
\label{fig:agents}
\end{figure*}
Before going into the details of the training algorithms, we first introduce the environment, game rules, and agent architecture for enabling the emergence of symbolic language.
Please note that since $t$ and $\hat{t}$ have different lengths, we say
$t=\hat{t}$ if $t$ expresses the same meaning as $\hat{t}$, e.g., ``red circle''.
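As an illustration of this round structure, the following is a minimal sketch of one round of the listener-speaker referential game, where the reward tests whether the listener's reconstruction $\hat{t}$ expresses the same meaning as the target $t$. The attribute sets, the hard-coded symbol table, and the helper names (\texttt{speak}, \texttt{listen}) are illustrative assumptions, not the paper's trained agents:

```python
# Illustrative sketch of one referential-game round.
# The attribute sets, fixed symbol table, and speak/listen helpers are
# assumptions for exposition; real agents learn these mappings.
import random

COLORS = ["red", "green", "blue"]
SHAPES = ["circle", "square", "triangle"]

def speak(meaning, vocab):
    """Toy speaker: emits one symbol per attribute (perfectly
    compositional by construction, for illustration only)."""
    color, shape = meaning
    return [vocab[color], vocab[shape]]

def listen(message, inv_vocab):
    """Toy listener: decodes each symbol back to an attribute."""
    color, shape = (inv_vocab[s] for s in message)
    return (color, shape)

# One-to-one symbol table mapping each attribute word to a symbol id.
vocab = {word: i for i, word in enumerate(COLORS + SHAPES)}
inv_vocab = {i: word for word, i in vocab.items()}

t = (random.choice(COLORS), random.choice(SHAPES))  # target concept t
message = speak(t, vocab)                           # speaker's symbol sequence
t_hat = listen(message, inv_vocab)                  # listener's output \hat{t}
reward = 1 if t_hat == t else 0                     # t = \hat{t}: same meaning
```

Because the toy symbol table is bijective, the listener always recovers the target here; in the actual game the agents must learn such a mapping from reward alone.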