
I know this question has been asked multiple times, but none of the solutions worked for me; I think most of the solutions out there do not work on macOS.

I'm trying to export a Jupyter notebook file to PDF using nbconvert, and I get the error

nbconvert failed: xelatex not found on PATH, if you have not installed xelatex you may need to do so. Find further instructions at https://nbconvert.readthedocs.io/en/latest/install.html#installing-tex.

I tried

!export PATH=/Library/TeX/texbin:$PATH 
!jupyter nbconvert your_notebook.ipynb --to pdf

which is also the solution proposed here, but the problem persists. I am using Jupyter Notebook in VS Code, installed through conda. Even if I open the notebook through conda and open it in a web browser, I get the same issue.
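One thing worth ruling out: the `export` and the `jupyter nbconvert` call only work together if they run in the same shell session, since `export` affects only the current shell. A minimal sketch of the terminal workflow, assuming a standard MacTeX/BasicTeX install location (adjust `/Library/TeX/texbin` if your TeX distribution lives elsewhere):

```shell
# Prepend the TeX binaries directory to PATH for THIS shell session only;
# the change is lost when the session ends.
export PATH=/Library/TeX/texbin:$PATH

# Confirm xelatex is now discoverable before converting:
command -v xelatex

# Run the conversion in the same session, while the PATH change is in effect.
jupyter nbconvert your_notebook.ipynb --to pdf
```

From inside a notebook, the equivalent is a single cell whose first line is the `%%bash` cell magic, so that every line of the cell runs in one bash shell.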

Just so you know, any command sent to the shell with an exclamation point from inside the notebook is executed in an isolated shell, and then that shell session is gone. Poof. So the `export` change wouldn't be propagated (remembered) to the next line, which is your `!jupyter nbconvert` command, and nothing would be fixed in the shell where nbconvert is running. Did you try removing the exclamation points and putting `%%bash` as the first line of that cell? That cell magic says to run all the cell's lines in one bash shell. There are other shell-like ways to run those two steps as one unit, but that is probably the easiest. – Wayne Jan 20, 2022 at 21:32

I actually run both of the commands in my terminal, so I don't include the '!'. When I run the first line and then the second, I get that the command `jupyter` does not exist (I obviously have it installed). I don't think I understand what the export does. I believe the problem with nbconvert has to do with which environment I use to run my Jupyter notebook? I am very confused, because when I run, I get the option to use the kernel numrec (Python 3.7.6), base (Python 3.7.3), or Python 3.8.2, which is not through Anaconda. Up to now I have used base. – Kyriacos Xanthos Jan 23, 2022 at 12:54

I apologise, I am very confused about how my environments work, where I have installed everything, and how I make sure everything has access to PATH! Thanks for any help! – Kyriacos Xanthos Jan 23, 2022 at 12:55

No apology needed. It's an aspect that is hard to understand, especially since there are different ways to install Jupyter Notebook and related software. I would definitely recommend using Anaconda for installation. Some resources cover environments and some don't; your efforts will be more reproducible if you incorporate them, eventually. For the short term, use `%pip install` and `%conda install` in a notebook to ensure packages are installed into the environment backing the current notebook. That is new in the last couple of years, and is why you'll come across a lot of outdated advice. – Wayne Jan 23, 2022 at 19:14

Those magic `%pip install` and `%conda install` commands are meant for installing packages you'd use with code in your notebook. This nbconvert issue is largely separate from all that, so working in your machine's terminal is indeed the better way here; that is why I was worried about seeing the exclamation points. The 'path' is just a list of places your machine looks for software you call, or that other software calls. If xelatex isn't in a directory on that list, nbconvert and other programs won't find it when they try to access it. – Wayne Jan 23, 2022 at 19:19
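The lookup behaviour described in the last comment can be demonstrated with a throwaway stub; the `/tmp/demo_bin` directory and the fake `xelatex` script below are purely illustrative, not part of any real TeX install:

```shell
# PATH is a colon-separated list of directories that the shell (and
# programs such as nbconvert) search, in order, to resolve a command name.

# Create a directory containing a stub "xelatex":
mkdir -p /tmp/demo_bin
printf '#!/bin/sh\necho stub xelatex\n' > /tmp/demo_bin/xelatex
chmod +x /tmp/demo_bin/xelatex

# Prepend that directory to PATH; lookup by bare name now resolves to
# the stub, because PATH is searched left to right:
export PATH=/tmp/demo_bin:$PATH
command -v xelatex
```

This is exactly why prepending `/Library/TeX/texbin` to PATH makes the real `xelatex` visible to nbconvert: the directory joins the search list.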
